Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2

Complex Software Models: Software Estimation – What Makes It So Hard?

“Any sufficiently advanced technology is indistinguishable from magic.” - Arthur C. Clarke

Acknowledgments

• ICEAA is indebted to TASC, Inc., for the development and maintenance of the Cost Estimating Body of Knowledge (CEBoK®)
  – ICEAA is also indebted to Technomics, Inc., for the independent review and maintenance of CEBoK®
• ICEAA is also indebted to the following individuals, who have made significant contributions to the development, review, and maintenance of CostPROF and CEBoK®
• Module 12 – Software Cost Estimating
  – Lead authors: Belinda J. Nethery, Allison L. Horrigan
  – Assistant authors: Tara L. Eng, Heather F. Chelson
  – Senior reviewers: Richard L. Coleman, Michael A. Gallo, Fred K. Blackburn
  – Reviewer: Kenneth S. Rhodes
  – Managing editor: Peter J. Braxton

Unit Index

Unit I – Cost Estimating
Unit II – Cost Analysis Techniques
Unit III – Analytical Methods
Unit IV – Specialized Costing
  11. Manufacturing Cost Estimating
  12. Software Cost Estimating
Unit V – Management Applications

Complex Software Models Overview

• Key Ideas
  – Software Lifecycle and Development Methodologies
  – Software Sizing
  – Software Model Calculations
  – Model Evaluation and Selection
• Practical Applications
  – Software Sizing Fundamentals
  – ESLOC Sizing
  – Software Effort Estimation
  – Schedule Determination

• Analytical Constructs
  – ESLOC Equation
  – COCOMO II CER Equation: PM = A × Size^E × ∏(EM_i), i = 1 … n
  – COCOMO II Schedule CER
• Related Topics
  – Costing Techniques (Module 2)
  – Parametric Estimating (Module 3)
  – Regression Analysis (Module 8)

Complex Software Models

• Core Knowledge
  – Software Overview and Background
  – Languages and Lifecycle
  – Size Estimation
  – Software Cost Estimating Models
  – Software Model Use
  – Challenges in Estimating Software
• Summary
• Resources
• Related and Advanced Topics

Software Estimating Issues

• Cost and Schedule
  – The Standish Group reported in 2016 that fewer than a third of the software projects of the past year were successfully completed on time and on budget
• Performance
  – Healthcare.gov
• Personnel
  – In-house or contractor
• Nature of Software
  – Size and complexity
  – Evolving nature

Cost Estimating Issues

• Budget ‘bogies’ get set very early in the lifecycle
  – Once a number is out there, it is hard to take it back
• Estimators are often encouraged to underestimate in the early stages of the lifecycle
• Software estimation is a fundamentally uncertain business under the best of conditions

Cost Uncertainty: Accuracy in Estimating

Estimates cannot be more accurate than requirements/design maturity

Critical Software Estimating Process Elements

Cost Uncertainty: Fundamental Reason for Underestimation

A downward bias is very likely if the estimator does not formally account for the underlying probability distribution

• Cost, effort, and SLOC distributions are typically highly skewed, with most of the probability mass at the low end and a long upper tail
• We can capture this with just three parameters – Low, Most Likely, and High
• Point estimates tend to fall between the Low and Most Likely distribution parameters, and the Most Likely value is typically below the 50th percentile
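To see why the bias arises, consider a right-skewed triangular distribution built from a three-point estimate. In this minimal sketch (Python, with hypothetical Low/Most Likely/High values), the mode falls below both the median and the mean, so quoting the ‘most likely’ value systematically underestimates expected cost.

```python
import random

# Hypothetical three-point estimate (cost in $K): long upper tail => right skew
low, likely, high = 80.0, 100.0, 180.0

samples = sorted(random.triangular(low, high, likely) for _ in range(100_000))
mean = sum(samples) / len(samples)
median = samples[len(samples) // 2]

print(f"mode (most likely): {likely:.1f}")
print(f"median (50th pct):  {median:.1f}")   # above the mode
print(f"mean (expected):    {mean:.1f}")     # above the median for right skew
```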

Boehm’s Seven Estimating Steps

• Establish objectives
  – Program phase
  – Knowledge of scope
• Plan for data and resources
  – Treat the estimate as a project unto itself
• Define software requirements
  – Work breakdown structure
  – Quantifiable, measurable, testable
• Work out as much detail as feasible
  – Quality inputs
• Use several independent techniques and sources
• Compare and iterate estimates
• Follow up on estimates
  – Compare with actuals


Languages and Lifecycle

Programming Languages

Software Development Cycle

SYSTEM / SOFTWARE HIERARCHY (Software = computer programs, data, and documentation)

[Hierarchy diagram, ISO / US 12207.0 example:]

SYSTEM
  [Intermediate levels (Subsystem, Prime Item, etc.)]
    SCI 1, SCI 2, HCI 1
      SC 11, SC 12
        SU 111, SU 112

SCI = Software Configuration Item – an end-use function under separate configuration management
HCI = Hardware Configuration Item
SC = Software Component
SU = Software Unit (100–200 SLOC)

IEEE/EIA 12207 Software Development Phases

SCI Activities

• Process Implementation
  – Select an appropriate life cycle model
• System Requirements Analysis
  – Determined by the intended use of the system
  – Consider feasibility of operation and maintenance
• System Architectural Design
  – Top-level architecture for the system
  – Allocate requirements to HCIs and SCIs
• Software Requirements Analysis
  – Requirements and quality specifications for each SCI
  – Maintain traceability to system requirements and design
  – Consider operations and maintenance

IEEE/EIA 12207 Software Development Phases

• Software Architectural Design
  – Transform requirements into a top-level architecture – identify SCs
  – Top-level interface and database designs
• Software Detailed Design
  – Transform SCs into lower-level SUs ready for coding
  – Detailed design for interfaces and databases
• Software Coding and Testing
  – Develop and document source code
  – Prepare test procedures and data
  – Test each SU and database
• Software Integration
  – Integrate SUs and test SCIs

IEEE/EIA 12207 Software Development Phases

• Software Qualification Testing
  – Independently test the entire SCI
• System Integration
  – Integrate HCIs and SCIs into the overall system
• System Qualification Testing
  – Test coverage of system requirements and conformance with expected results
  – Test feasibility of operation and maintenance
• Software Installation
  – Develop a software installation plan
  – Install software in the target environment
• Software Acceptance Support
  – Support the acquirer’s testing and review of the software product

Software Support

• Not ‘traditional maintenance’
  – Software does not “wear out” or “break”
  – Support activities are quite similar to development activities
• Support categories (typical shares)
  – Perfective – enhancements (60%)
  – Adaptive – accommodate environmental changes (18%)
  – Corrective – fix errors (17%)
  – Preventive and other (5%)

Software Development Methods

• Waterfall Method
  [Diagram: sequential phases – System Requirements & Design, CSCI Requirements Analysis, Preliminary Design, Detailed Design, Code and CSU Test, CSC Integration and Test, CSCI Test, System Test]
• Pros
  – Provides structure to the effort
• Limitations
  – Document driven
  – Time between requirements and delivery of a testable product can be long
  – Doesn’t align with modern methods or practices

Software Development Models

• Incremental

• Pros
  – Delivers capability in increments
  – Enables user involvement
• Limitations
  – Not always appropriate
  – Requirements defined early in the process

Software Development Methods

• Evolutionary (essence of Spiral), or evolutionary prototyping
• Pros
  – Quality product early
  – Enables user involvement
• Limitations
  – Can be more time consuming
  – Evolutions require planning

Software Development Methods

• Agile development
  – Iterative development that prioritizes requirements
  – Each iteration is a full development cycle
• Pros
  – Adaptable to change
  – Customer involvement and prioritization
  – Focus on business need and value
  – Sustainable development pace
• Limitations
  – Not structured enough for architecture design or re-design work
  – May need a hybrid with waterfall to satisfy organizational or acquisition needs

Estimation Techniques

• Algorithmic (Parametric, Top Down) – e.g., COCOMO II, PRICE True S, SEER-SEM, QSM SLIM
  – Capabilities: easy to use; relatively fast answers; little input data needed; predictable results
  – Limitations: may be inaccurate; easy to ‘game’; require calibration
• Bottom Up – e.g., Task-Unit Method
  – Capabilities: may be more accurate; detailed basis for estimate
  – Limitations: data needed; time consuming; not useful early
• Expert Judgement – e.g., Wideband Delphi
  – Capabilities: no data needed; useful early in project
  – Limitations: bias of experts; knowledge levels
• Analogy
  – Capabilities: based on real programs; useful early in project
  – Limitations: must have similar programs


Size Estimate

Software Size Estimation

Size is a significant cost driver for almost all software cost estimating models

• Source Lines of Code
• Function Points
• Object Points
• Story Points
• Other Methods

Source Lines of Code (SLOC) Sizing

• Concept
  – Estimate the number of source lines of code in a program
• Methods for obtaining
  – Analogy (e.g., Aerospace Software Size Estimator, ISBSG)
  – Expert judgement (e.g., Bozoki Software Sizing Model)
  – Parametric sizing models (e.g., True S Sizing Module)
  – Automated counting tools (e.g., USC Code Counter – for completed products)
• Advantages
  – Most commonly used measure of size
  – Used in most software cost model algorithms
  – Relatively easy to count and visualize
• Limitations
  – Language dependent
  – Data not available early in a program
  – Inconsistent definitions of SLOC across industry

Estimating Software Size Using SLOC

• Software size is simply a measure of code ‘bigness’
• The most common way to estimate size is through SLOC
• SLOC estimates should include any code delivered as a software release
• What specifically to count is not always obvious
  – Raw Physical – every line in the file counts as one line
  – Physical – every line in the file counts except blanks and comments
  – Logical – each individual statement to the compiler is a line (language dependent)

Types of SLOC

• Raw Physical SLOC
  – Total number of lines in the file
  – Easy and quick to do
  – Limited value to estimation
  – Not recommended

Types of SLOC

• Physical SLOC
  – Total number of lines in the file that are not blanks or comments
  – Well-understood standard that is easy to implement
  – Language dependent, since some languages are more ‘compact’ than others

Types of SLOC

• Logical SLOC
  – Logical source statements
  – Captures size using language-specific rules
  – E.g., in C/C++ the following indicate the end of a line:
    • Preprocessor directives
    • Terminal semicolons
    • Terminal close-brackets
  – Slight variations exist in standards for counting logical lines
  – Tools such as USC CodeCount enforce standardization
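As a rough illustration of the logical-line idea (nowhere near the full USC CodeCount rule set), the sketch below counts preprocessor directives, terminal semicolons, and terminal close-brackets in C-like source while skipping blanks and full-line comments; a real counter must also handle strings, block comments, and continuation lines.

```python
def logical_sloc(source: str) -> int:
    """Crude logical-SLOC count for C-like code (illustrative only)."""
    count = 0
    for line in source.splitlines():
        stripped = line.strip()
        if not stripped or stripped.startswith("//"):
            continue                      # skip blank lines and line comments
        if stripped.startswith("#"):
            count += 1                    # a preprocessor directive is one statement
            continue
        count += stripped.count(";")      # terminal semicolons (approximation)
        count += stripped.count("}")      # terminal close-brackets
    return count

print(logical_sloc("#include <stdio.h>\nint main() {\n  return 0;\n}\n"))  # -> 3
```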

SLOC Standards – Review

• Raw Physical – a count of all lines in a source code file
  – Pros: very easy to capture with standard operating system tools
  – Cons: code formatting can cause SLOC counts to vary significantly, even for functionally equivalent code
• Physical – a count of all non-comment, non-blank lines in a source code file
  – Pros: provides better accuracy than Raw Physical; unambiguous definition of ‘comment’ and ‘blank’ lines
  – Cons: generally requires a code-counting utility; differences in code formatting between languages and development teams can cause SLOC counts to vary, though to a lesser extent than Raw Physical
• Logical – a count of language-specific metrics (USC/SEI counting conventions)
  – Pros: most accurate measure of SLOC – normalizes out many of the counting errors inherent in other conventions; input for many modern software cost estimation models
  – Cons: requires a code counter with support for the language to be counted, or the use of conversion factors (less accurate); language-specific standards can be difficult to understand

Handling Special Cases

• Auto-generated code
  – Not hand coded, but generated from another program
  – An example:
    • A hand-written source file (the input to the code generator) is 438 lines
    • The original source file is translated into C source code by a translation utility using default options, generating an output.c file of 3,351 lines – 7 times larger than the original file
    • Running the translation utility with an optimization flag (improves performance, but bloats the code) generates an output.c file of 17,600 lines
  – What you really need to count is the file that is the input to the code generator, not the output

Handling Special Cases

• Dealing with auto-generated code
  – If the original source file is unavailable:
    • Use a code counter, if one is available, to count the logical lines
    • If not, use a code counter that counts physical lines and work with your development team or group to determine a reasonable conversion
    • Use a conversion table to convert from generated code size to a size suitable for estimating

Sizing Considerations

• What you choose to count is important
  – Implementation cost is driven mainly by written and adapted code
  – Maintenance cost is driven by delivered code
  – Analogous size data is typically delivered code
• Since relatively few software projects are written from scratch, your project is likely to have:
  – New functionality – developed from scratch
  – Reused functionality – incorporated as-is, but must be re-tested
    • Projects tend to overestimate the amount of reused functionality
  – Modified functionality – incorporated from elsewhere but modified
    • Projects tend to underestimate the amount of modification required
• Each type of code requires rework – nothing is free!

Dividing the SLOC

Cost of Inherited/Reused Code

• Do not assume reuse is free
• There have been cases where even a minor modification has cost more than half as much as developing the application from scratch

Equivalent Lines of Code

• To simplify cost estimation, one must account for the scope of incorporating modified and reused or inherited code
• One way to accomplish this is with equivalent (effective) source lines of code (ESLOC)
  – ESLOC weights the different types of code to account for the differences between new, modified, and reused code
  – Different organizations and models calculate ESLOC differently
  – One example might be:
    • ESLOC = New + 0.25 × Reused + 0.6 × Modified

Computing ESLOC

• One method for computing ESLOC
  – Treat inherited code with 50% or more modification as new
  – Compute ESLOC as follows:
    • Flight software: ESLOC = New + 0.33 × Reused + 0.84 × Modified
    • Ground software: ESLOC = New + 0.09 × Reused + 0.70 × Modified

Computing ESLOC – Example

• You are inheriting three modules of 5 KSLOC each for a ground software project
  – Module 1 is 5 KSLOC with no modifications
  – Module 2 is 5 KSLOC requiring 30–40% modification
  – Module 3 is 5 KSLOC with 50–60% modification
• Compute EKSLOC
  – Module 1 is pure reuse
  – Module 2 is treated as modified code
  – Module 3 is treated as new code
  – EKSLOC = 5 × 0.09 + 5 × 0.7 + 5 = 8.95 KSLOC
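The arithmetic above is easy to script. This minimal sketch applies the ground-software weights from the previous slide and reproduces the 8.95 EKSLOC result.

```python
# Ground-software ESLOC weights from the slide (other organizations use different values)
WEIGHTS = {"new": 1.0, "modified": 0.70, "reused": 0.09}

def eksloc(ksloc_by_type: dict) -> float:
    """Weighted equivalent KSLOC across new, modified, and reused code."""
    return sum(WEIGHTS[kind] * size for kind, size in ksloc_by_type.items())

# Module 1: pure reuse; Module 2: 30-40% modified; Module 3: >=50% -> treated as new
print(eksloc({"reused": 5, "modified": 5, "new": 5}))  # -> 8.95
```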

Function Point Sizing

• Concept: compute size from five attributes
  – External Inputs (EI) – input screens, tables, etc.
  – External Outputs (EO) – output screens, reports, etc.
  – External Inquiries (EQ) – prompts, interrupts, etc.
  – External Interface Files (EIF) – shared databases, math routines, etc.
  – Internal Logical Files (ILF) – databases, directories, etc.
• Methods for obtaining
  – Analogy
  – Automatic generation from requirements or design tools
  – Backfiring (from SLOC counts)
• Advantages
  – Based on extensive business research (Albrecht, Gaffney)
  – Can replace SLOC in effort estimation (True S, SEER-SEM, etc.)
  – IFPUG organization provides continuous research and updates
• Limitations
  – Limited applicability to real-time and scientific applications
  – Data not always available early
  – Function points are a concept, not a thing

Function Point Sizing

• Albrecht’s equation
  – FP = 4 × EI + 5 × EO + 4 × EQ + 10 × ILF + 7 × EIF
  – This is the basic (unadjusted) count, with a ±25% adjustment for complexity
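The unadjusted count is a straight weighted sum. A minimal sketch using the average weights above (the complexity adjustment is omitted, and the counts are hypothetical):

```python
# Albrecht average weights per function type (unadjusted function points)
FP_WEIGHTS = {"EI": 4, "EO": 5, "EQ": 4, "ILF": 10, "EIF": 7}

def unadjusted_fp(counts: dict) -> int:
    """Weighted sum of the five function-point attributes."""
    return sum(FP_WEIGHTS[t] * n for t, n in counts.items())

# Hypothetical system: 20 inputs, 15 outputs, 10 inquiries, 8 files, 4 interfaces
print(unadjusted_fp({"EI": 20, "EO": 15, "EQ": 10, "ILF": 8, "EIF": 4}))  # -> 303
```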

Backfiring SLOC to FP and Vice Versa

• Potentially useful if models require SLOC when you have FP
• Use published SLOC-per-FP ratios (but with caution!)
• Better to use a size estimate that matches your model
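Mechanically, backfiring is a simple multiply or divide by a language-specific ratio. The sketch below uses illustrative SLOC-per-function-point ratios in the spirit of published backfiring tables; the values are assumptions for demonstration, not the slide’s numbers.

```python
# Illustrative SLOC-per-function-point ratios (assumed values, not from the slide)
SLOC_PER_FP = {"C": 128, "C++": 55, "Java": 53, "Ada": 71}

def fp_to_sloc(fp: float, language: str) -> float:
    """Backfire a function-point count into an approximate SLOC count."""
    return fp * SLOC_PER_FP[language]

def sloc_to_fp(sloc: float, language: str) -> float:
    """Backfire a SLOC count into an approximate function-point count."""
    return sloc / SLOC_PER_FP[language]

print(fp_to_sloc(300, "C++"))   # 300 FP -> ~16,500 SLOC of C++
```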

Other Sizing Methods

• Requirements documents (Critical Mass by Galorath, 2004)
  – Size from characteristics of documents, such as word count, document length, key words, etc.
  – Comparisons with past documents and sizes
  – The Telelogic DOORS tool can facilitate this
• Use Case Points (and normalized Use Case Points)
  – Size based on use cases, actors, and complexity factors
  – Suitable for Unified Modeling Language (UML) programs
  – The Rational Rose tool can facilitate this

Other Sizing Methods

• Use Case Conversion Points (True S by PRICE Systems, Minkiewicz, 2005)
  – Converts use case information to function points
  – Uses use cases, actors, and complexity factors


Software Cost Estimating Models

History of Software Models

[Timeline chart of software cost model evolution: Nordan (IBM, 1970); Putnam SLIM (US Army/GE, 1976; QSM, 1979); Doty (RADC validation, 1977); Jensen JS1/2/3 (HAC, 1979; CEI, 1980); SEER-SEM (GAI, 1989); SAGE (SEI, 1995); COCOMO (Boehm, 1981) and COCOMO II (Boehm, 1995); PRICE-S (PRICE Systems, 1977) and TruePlanning Software (PRICE Systems, 2001)]

Chart was completed by Dr. Randy Jensen, 2006 ISPA Conference; “Have We Learned Anything In the Last 20 Years”

Software Cost Models

• COCOMO II
• True S (PRICE Systems)
• SEER-SEM (Galorath)
• SLIM (QSM)

COCOMO II

• Sequel to the 1981 COCOMO, updated to reflect the current environment
• Result of a project by Dr. Barry Boehm and others at the University of Southern California (USC) to update COCOMO for the ’90s and beyond
• Described in the 2000 book Software Cost Estimation with COCOMO II
• COCOMO II is available at http://sunset.usc.edu
• Three models or stages:
  1. Application Composition – uses application points
  2. Early Design – uses function points (converted to KSLOC), five scale factors, and seven “coarse-grained” multipliers
  3. Post Architecture – refinement of the original COCOMO; uses SLOC, five scale factors, and seventeen multipliers

COCOMO II Equations for Post Architecture and Early Design

• Development effort, in person-months:
  PM = A × Size^E × ∏(EM_i) + PM_Auto
• Schedule, in months:
  Months = [2.66 × PM^(D + 0.2 × (E − 0.91))] × Sched% / 100
• Where:
  – A = coefficient (nominally 2.94)
  – E = B + sum(XD) [ranges from 1.01 to 1.26]
  – B = scaling base exponent (nominally 0.91)
  – XD = the five exponent (scale factor) drivers
  – EM = the 7 or 17 effort multipliers, applied as a product
  – PM_Auto = person-months from automated translation activities
  – D = scaling base schedule exponent (nominally 0.28)
  – Sched% = schedule compression or expansion percentage (nominally 100)
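Read literally, the two CERs above reduce to a few lines of code. This is a minimal sketch, not an official implementation: it uses the nominal constants from the slide (A = 2.94, B = 0.91, D = 0.28, and the 2.66 schedule coefficient) and assumes the standard COCOMO II convention that the five scale-factor values are summed and divided by 100 to form the exponent increment.

```python
import math

def cocomo2_effort(ksloc, scale_factors, effort_multipliers,
                   A=2.94, B=0.91, pm_auto=0.0):
    """Person-months per the Post-Architecture effort CER above."""
    E = B + sum(scale_factors) / 100.0     # exponent from the five scale factors
    EM = math.prod(effort_multipliers)     # product of the 7 or 17 multipliers
    return A * ksloc ** E * EM + pm_auto, E

def cocomo2_schedule(pm, E, C=2.66, D=0.28, sched_pct=100.0):
    """Calendar months per the schedule CER above."""
    return C * pm ** (D + 0.2 * (E - 0.91)) * sched_pct / 100.0

# 50 KSLOC with all-nominal scale factors and all-nominal multipliers (1.0)
pm, E = cocomo2_effort(50, [3.72, 3.04, 4.24, 3.29, 4.68], [1.0] * 17)
print(round(pm, 1), "PM,", round(cocomo2_schedule(pm, E), 1), "months")
```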

COCOMO II Cost Model

COCOMO II Sizing

• Size is measured in thousands of source lines of code (KSLOC)
• Total KSLOC = (1 + REVL/100) × (New KSLOC + EKSLOC)
  – REVL is the percentage of requirements evolution and volatility
• EKSLOC = Adapted KSLOC × (1 − AT/100) × AAM
  – AT is the percentage of adapted KSLOC re-engineered by automatic translation
  – AAM is the Adaptation Adjustment Multiplier:
    • AAM = [AA + AAF × (1 + 0.02 × SU × UNFM)] / 100 for AAF ≤ 50
    • AAM = [AA + AAF + SU × UNFM] / 100 for AAF > 50
  – AAF (%) = 0.4 × DM + 0.3 × CM + 0.3 × IM
  – AA = Assessment and Assimilation increment (0 to 8)
  – SU = Software Understanding increment (10 to 50)
  – UNFM = programmer unfamiliarity with the software (0.0 to 1.0)
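A minimal sketch of the reuse-sizing arithmetic above. AAF is carried in percent, so the breakpoint is 50 here, and the example inputs (DM/CM/IM percentages, REVL, SU, UNFM) are illustrative values only.

```python
def aam(dm, cm, im, aa=0.0, su=30.0, unfm=0.4):
    """Adaptation Adjustment Multiplier per the COCOMO II reuse model above."""
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im      # % design / code / integration modified
    if aaf <= 50:
        return (aa + aaf * (1 + 0.02 * su * unfm)) / 100.0
    return (aa + aaf + su * unfm) / 100.0

def total_ksloc(new, adapted, dm, cm, im, revl=0.0, at=0.0, **kw):
    """Total KSLOC = (1 + REVL/100) * (New + EKSLOC), per the slide."""
    eksloc = adapted * (1 - at / 100.0) * aam(dm, cm, im, **kw)
    return (1 + revl / 100.0) * (new + eksloc)

# 20 KSLOC new, 30 KSLOC adapted (20% design / 30% code / 40% integration rework)
print(round(total_ksloc(20, 30, dm=20, cm=30, im=40, revl=10), 2))  # -> 33.87
```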

COCOMO II Scale Factors

• These factors determine the exponent (E in the equations above):
  – PREC – Precedentedness: newness of the project
  – FLEX – Development Flexibility: degree of requirements relaxation
  – RESL – Architecture/Risk Resolution: degree of risk present
  – TEAM – Team Cohesion: cooperation among team members
  – PMAT – Process Maturity

Scale Factor   VL     LO     NM     HI     VH
PREC           6.20   4.96   3.72   2.48   1.24
FLEX           5.07   4.05   3.04   2.03   1.01
RESL           7.07   5.65   4.24   2.83   1.41
TEAM           5.48   4.38   3.29   2.19   1.10
PMAT           7.80   6.24   4.68   3.12   1.56

COCOMO II Effort Multipliers

Multiplier   Description
RELY         Required software reliability
DATA         Database size, in bytes per KSLOC
DOCU         Documentation suitability to life cycle needs
CPLX         Product complexity (based on five operational areas)
RUSE         Additional effort for reuse on current or future programs
TIME         Execution time constraints; percentage of time used
STOR         Main storage constraints; percentage of storage used
PVOL         Volatility of platform used (hardware, OS, etc.)
ACAP         Analyst team capability, as a percentile rating
PCAP         Programmer team capability, as a percentile rating
APEX         Applications experience of software development team
PLEX         Platform experience of software development team
LTEX         Team experience with language and development tools
PCON         Personnel continuity, in terms of yearly turnover rate
TOOL         Level of sophistication of development tools used
SITE         (Average) degree of multiple-site co-location and communication
SCED         Relative percentage of schedule compression or expansion

COCOMO II Stage 3 Effort Multipliers

Factor   VL     LO     NM     HI     VH     XH
RELY     0.82   0.92   1.00   1.10   1.26   –
DATA     –      0.90   1.00   1.14   1.28   –
DOCU     0.81   0.91   1.00   1.11   1.23   –
CPLX     0.73   0.87   1.00   1.17   1.34   1.74
RUSE     –      0.95   1.00   1.07   1.15   1.24
TIME     –      –      1.00   1.11   1.29   1.63
STOR     –      –      1.00   1.05   1.17   1.46
PVOL     –      0.87   1.00   1.15   1.30   –
ACAP     1.42   1.19   1.00   0.85   0.71   –
PCAP     1.34   1.15   1.00   0.88   0.76   –
APEX     1.22   1.10   1.00   0.88   0.81   –
PLEX     1.19   1.09   1.00   0.91   0.85   –
LTEX     1.20   1.09   1.00   0.91   0.84   –
PCON     1.24   1.10   1.00   0.92   0.84   –
TOOL     1.17   1.09   1.00   0.90   0.78   –
SITE     1.22   1.09   1.00   0.93   0.86   0.80
SCED     1.43   1.14   1.00   1.00   1.00   –

COCOMO II Stage 2 Model

• Effort and schedule equations are the same as Stage 3, except:
  – Size is initially computed in unadjusted function points, then converted to KSLOC using Capers Jones ratios
  – Only seven effort multipliers are used, each based on sums of the values for the seventeen Stage 3 multipliers

COCOMO II Stage 2 Multipliers

Factor   XL     VL     LO     NM     HI     VH     XH
RCPX     0.49   0.60   0.83   1.00   1.33   1.91   2.72
RUSE     –      –      0.95   1.00   1.07   1.15   1.24
PDIF     –      –      0.87   1.00   1.29   1.81   2.61
PERS     2.12   1.62   1.26   1.00   0.83   0.63   0.50
PREX     1.59   1.33   1.12   1.00   0.87   0.74   0.62
FCIL     1.43   1.30   1.10   1.00   0.87   0.73   0.62
SCED     –      1.43   1.14   1.00   1.00   1.00   –

COCOMO II Stage 1 Model

• Equation: PM = NAP / PROD
• Where:
  – NAP is New Application Points = AP × (100 − %Reuse) / 100
  – AP is Application Points, computed from the number of screens, reports, and 3GL components, adjusted for complexity
  – PROD is a productivity measure based on the developer’s experience and capability and on Integrated CASE (ICASE) maturity and capability
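The Stage 1 arithmetic is a weighted count divided by a productivity rate. In this sketch the screen/report/3GL weights and the PROD value are illustrative placeholders; COCOMO II tabulates them by complexity rating and ICASE maturity.

```python
# Illustrative application-point weights (COCOMO II tabulates these by complexity)
AP_WEIGHTS = {"screens": 2, "reports": 5, "3gl": 10}

def stage1_pm(counts: dict, reuse_pct: float, prod: float) -> float:
    """PM = NAP / PROD, per the Stage 1 equation above."""
    ap = sum(AP_WEIGHTS[k] * n for k, n in counts.items())   # application points
    nap = ap * (100 - reuse_pct) / 100.0                     # new application points
    return nap / prod                                        # person-months

# 10 screens, 4 reports, 2 3GL components, 20% reuse, PROD = 13 AP/person-month
print(round(stage1_pm({"screens": 10, "reports": 4, "3gl": 2}, 20, 13), 1))
```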


Software Model Use

Model Selection and Use

• Selecting a Cost Model
• Proper Use of Models
• Software Metrics
• Future Directions

Model Selection – 4-Step Approach (Ferens, 1986)

1. Determine needs
   – General statement (e.g., “Must estimate software life cycle costs during concept definition”)
   – Accuracy requirements (e.g., “Within 25%”)
2. Select a family (or families) of models (e.g., parametric)
3. Select the best model(s)
   – Qualitative – perhaps a “weighted factors approach”: less subjective and quantifies the process, but time consuming

   Factor                  Weight   Model A   Model B   Model C
   Ease of Use             10       5         10        8
   Life Cycle Capability   8        5         2         8
   Data Availability       7        9         6         8

   – Quantitative – ensure models are calibrated; use internal data if possible
4. Periodically re-evaluate the choice (but changing models is expensive)
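The weighted-factors comparison is just a dot product of weights and scores. A sketch using the three factors and scores from the example table above:

```python
# Weights and model scores taken from the example table above
weights = {"Ease of Use": 10, "Life Cycle Capability": 8, "Data Availability": 7}
scores = {
    "A": {"Ease of Use": 5,  "Life Cycle Capability": 5, "Data Availability": 9},
    "B": {"Ease of Use": 10, "Life Cycle Capability": 2, "Data Availability": 6},
    "C": {"Ease of Use": 8,  "Life Cycle Capability": 8, "Data Availability": 8},
}

# Weighted total per model; higher is better
totals = {m: sum(weights[f] * s[f] for f in weights) for m, s in scores.items()}
print(totals)   # -> {'A': 153, 'B': 158, 'C': 200}
```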

Plan Your Project

• Plan the acquisition and use of an estimating model as you would the acquisition and execution of any other project with resources

Proper Use of Models – Input Data Issues

• Input data challenges
  – Inputs are often sensitive (e.g., CPLX in COCOMO II, Organizational Productivity in PRICE True S)
  – Some inputs are subjective (e.g., ACAP, PCAP in COCOMO II)
  – Even “hard” inputs (size, complexity) are sometimes hard to assess, especially early in the program
• Ideas for meeting the challenges
  – Use a team to do estimates; include technical personnel
  – For size, use a sizing model or method (or two or three)
  – For subjective inputs, make sure you understand what the model is asking
  – For subjective inputs, work with management personnel and, if necessary, perform Delphi (or similar) surveys
  – Perform an “iterative Pareto analysis” to determine which parameters have the most significant impact
  – When possible, calibrate and validate models

Proper Use of Models – Model Calibration

• Benefits
  – “Adjusts” models to your particular organization, market, and product
  – Bases the estimate on relevant historical data
  – Can greatly improve accuracy (Thibodeau 1981: a factor of five)
  – Most general-purpose models are designed to be calibrated
• Limitations
  – Database dependent
  – Only shows within-database accuracy
    • Accurate calibration shows robustness of the database
  – Not as likely to show how accurate a model will be for new programs

Proper Use of Models – Model Validation

• Assesses the accuracy of a model on programs not used in calibration
• Two possible methods
  – Use a holdout sample from the database (e.g., use half the points for calibration and the other half for validation)
  – Resampling – repeat calibration and validation a number of times and take the average (good for small samples)
• Creates a more realistic appraisal of accuracy for new programs
• Still database dependent
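A sketch of the holdout idea for a deliberately simple one-parameter model (person-months proportional to size), using a hypothetical project database; real validation would use the full cost model and report standard accuracy statistics.

```python
import random

# Hypothetical (project_ksloc, actual_person_months) pairs
db = [(10, 35), (25, 80), (40, 150), (15, 50), (60, 230),
      (30, 100), (20, 70), (50, 190), (12, 40), (45, 160)]

random.shuffle(db)
cal, val = db[:5], db[5:]          # half calibrates, half validates

rate = sum(pm for _, pm in cal) / sum(k for k, _ in cal)   # PM per KSLOC

for ksloc, actual in val:
    est = rate * ksloc
    print(f"{ksloc:>3} KSLOC: est {est:6.1f} PM vs actual {actual} "
          f"({100 * (est - actual) / actual:+.0f}%)")
```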

Challenges in Estimating Software

• System Definition
• Sizing and Tech
• Quality
• COTS
• Calibration
• Databases
• Growth and Demand

Challenges – System Definition

• Obtaining system definition
  – Must work with experts
  – Define a notional system based on known requirements, and include a risk assessment for unknowns
  – Definition is often at a high level
  – May include use of COTS software
    • Talk to commercial vendors for inputs
    • Multiple packages may be used
  – For custom code, look at similar systems for the functions that are required
  – Assess the need for both internal and external interfaces
  – Refine the definition over time as the system takes shape

Challenges – Sizing and Tech

• Sizing is an estimate too
  – Use standard estimating methods
• Rapid technology change
  – Changes during the development process may have to be addressed
• COTS upgrades
  – May have to reintegrate
  – Impact ranges from a simple retest to a complete redo, or no change at all – it depends on the COTS
• Development tool changes
  – Newer tools may simplify effort (but still require learning)
  – May force a change to the development process

Challenges – Quality

• Difficulty in assessing quality
  – “You don’t know how good it is until you’re done”
  – Good planning is impacted by tight schedules and other constraints
  – Software quality measures may help:
    • Defects identified
    • Defects fixed
    • Failed fixes
    • Severity of defects
    • Location of defects
    • Degree of cohesion and coupling
    • Requirements satisfied
    • Depth of testing
    • Adequacy of documentation
    • MTTD (mean time to detect) errors
    • McCabe’s cyclomatic complexity

Space Systems Cost Analysis Group, Software Methodology Handbook, Version 1.0, June 1995, https://sscag.saic.com/

Challenges – COTS

• Using Commercial Off-the-Shelf (COTS) software (Warning: COTS ≠ cheap!)
  – No code visibility
  – Difficult to customize – no source code
  – Effort is dependent on the software architecture
  – Might be too rigid to handle changing requirements
  – Assumes many users will find errors – additional testing is needed
  – Upgrades to COTS may force reintegration with custom code
  – Support costs for custom code may be affected, and vendor support will be needed for the COTS
  – Must still perform requirements definition, design, and test of the overall system
  – Must deal with licensing, royalties, incompatibilities between packages, lack of source code, and understanding the package
  – Estimation of COTS integration is not mature

Challenges – Calibration

• Calibration of models
  – Most models are built on industry averages, so calibration may increase accuracy
  – Adjusts relationships/values to reproduce representative known outcomes
  – Understand how your model calibrates
  – Must collect cost, technical, and programmatic data
  – Check the content of actual data vs. the content of the model
  – Models generally have a calibration mode, but you may need to tweak the model
• Calibration of models must be done with care, but it is generally an improvement over default values

Challenges – Growth and Demand

• Requirements and code growth
  – The delivered project is bigger than estimated
  – The increase is driven by:
    • Poor initial understanding of requirements
    • New requirements added during development (beware requirements creep!)
    • Underestimates of required SLOC
    • Code reuse optimism
  – The key is to know the track record and account for expected growth
• Supply and demand of labor
  – Affects personnel availability and the cost of qualified personnel

Software Cost Estimating Summary

• Understanding software cost estimation is critical, because software is part of almost every estimate
• Software cost estimating is in many ways similar to hardware estimating
• There are a variety of software development approaches that affect development cost and must be modeled accordingly
• Analogy and parametric methods are commonly used to estimate software development costs
• A number of commercial parametric models are available to estimate software costs
• Software poses a number of specific challenges for the estimator

Resources

• [Pressman] Software Engineering: A Practitioner’s Approach, Third Edition, Roger S. Pressman, McGraw Hill, Inc., 1992
• [Boehm 81] Software Engineering Economics, Barry W. Boehm, Prentice Hall, 1981
• [Boehm 2000] Software Cost Estimation with COCOMO II, Boehm et al., Prentice Hall PTR, 2000
• [ISPA 1999] Joint Industry/Government Parametric Estimating Handbook, Spring 2nd Edition, http://www.ispa-cost.org/PEIWeb/toc.htm
• [GSAM 2000] Guidelines for Successful Acquisition and Management of Software Intensive Systems: Weapon Systems, Command and Control Systems, Management Information Systems, Version 3.0, Dept. of the Air Force, Software Technology Support Center, 2000
• [AFIT] Quantitative Management of Software, Graduate School of Logistics and Acquisition Management, Air Force Institute of Technology, Air University, 1995
• [IFPUG] International Function Point Users Group, http://www.ifpug.org
• [Taylor] Object-Oriented Technology: A Manager’s Guide, David A. Taylor, Addison-Wesley, 1990
• [Cox] Object-Oriented Programming: An Evolutionary Approach, Brad J. Cox, Addison-Wesley, 1987
• SEER-SEM, Galorath Inc., http://www.galorath.com
• [STSC] Crosstalk – The Journal of Defense Software Engineering, http://www.stsc.hill.af.mil/CrossTalk/2003/07/index.html

Resources

• [Reifer] “Quantifying the Debate: Ada vs. C++,” Donald J. Reifer, Crosstalk: The Journal of Defense Software Engineering, Vol. 9, No. 7, July 1996
• [Jensen] “Software Estimating Model Calibration,” Randall W. Jensen, Crosstalk: The Journal of Defense Software Engineering, Vol. 14, No. 7, July 2001
• [Jones 1] Applied Software Measurement: Assuring Productivity and Quality, 2nd ed., Capers Jones, McGraw Hill, 1996
• [Jones 2] Estimating Software Costs, T. Capers Jones, McGraw Hill, 1998
• COCOMO II, http://sunset.usc.edu
• [PRICE S] PRICE S Users Manual, PRICE Systems, http://www.pricesystems.com
• [MIL-STD-498] Military Standard 498, “Software Development and Documentation,” December 1994
• [IEEE] IEEE/EIA 12207.2-1997, IEEE/EIA Guide, Industry Implementation of International Standard ISO/IEC 12207:1995, Standard for Information Technology, Software Life Cycle Processes – Implementation Considerations, April 1998
• [MIL-HDBK-881A] Department of Defense Handbook, Work Breakdown Structures for Defense Materiel Items, July 2005
• [SEI-CMM] Capability Maturity Model for Software, Version 1.1, Paulk, Mark C., et al., Software Engineering Institute, Carnegie Mellon University, February 1993
• [Schaaf] “Agility XL,” Schaaf, R.J., Systems and Software Technology Conference 2007, Tampa, FL, 2007
• [Minkiewicz] Are Parametric Techniques Relevant for Agile Development Projects?, Minkiewicz, Arlene, PRICE Systems, 2012

Advanced and Related Topics

SEI and CMMI

• In 1987, the Software Engineering Institute (SEI) at Carnegie Mellon University (CMU) developed a methodology for assessing organizations’ software capabilities (Paulk, 1993)
• This became the framework for the Software Capability Maturity Model (CMM), initially developed for the Government to evaluate an organization’s ability to perform software development and maintenance work on Government contracts
• SEI replaced the CMM in 2001 with a suite of CMM Integration (CMMI) models
• The CMMI-SE/SW has five levels of software process maturity, characterized by behaviors typically demonstrated by organizations at each level

CMMI Maturity Levels

An organization is expected to successfully perform all process areas at each level (and all lower levels) to attain that maturity level; however, tailoring is allowed in special circumstances.

Software Resources Data Reports (SRDRs)

• Provide software information across programs
• Provide size, effort, schedule, and other descriptive development data
• DoD’s only standard, centralized approach to software data collection
• Used to obtain both the estimated and actual characteristics of new software developments or upgrades
• Both the Government program office and, later on after contract award, the software contractor submit this report
• For contractors, this report constitutes a contract data deliverable that formalizes the reporting of software metric and resource data
• Not intended as a project management device to track software development progress
• The SRDR is divided into three reports: Initial/Final Government Report, Initial Developer Report, Final Developer Report
• The SRDR is required for:
  – All major contracts and subcontracts, regardless of contract type
  – Any element with a projected effort greater than $25M
  – Contractors developing/producing software elements within ACAT IA, ACAT IC, and ACAT ID programs

“Understanding the Software Resource Data Report Requirements”, 5 June 2012, http://dcarc.cape.osd.mil/Files/Training/CSDR_Training/DCARC%20Training%20X.%20SRDR%20102012.pdf

Rules of Thumb

• Develop your own metrics
• Use government or industry standards from the literature in the interim
• Make sure rules are applicable to your environment
  – Age of the rule
  – Software language used
  – Purpose of the software
• Example rules of thumb
  – Cost per SLOC
  – Cost per function point

Cost per SLOC

Application Domain                           Ada 83   Ada 95   C     C++   3GL   Domain Norm
Command and Control – Commercial             50       *        40    35    50    45
Command and Control – Military               75       *        75    70    100   80
Commercial Products                          35       30       25    30    40    40
Information Systems – Commercial             *        *        25    25    30    30
Information Systems – Military               30       35       25    25    40    35
Telecommunications – Commercial              55       *        40    45    50    50
Telecommunications – Military                60       *        50    50    90    75
Weapons Systems – Airborne and Spaceborne    150      *        175   *     250   200
Weapons Systems – Ground Based               80       *        65    50    100   75

* = not enough data available. Dollar cost per delivered source line of code (1995).

“Quantifying the Debate: Ada vs. C++,” Donald J. Reifer, Crosstalk: The Journal of Defense Software Engineering, Vol. 9, Number 7, July 1996

Cost per Function Point

Average Cost per Function Point (Salary + Burden), Dollars

Function Points   End User   MIS     Outsource   Commercial   Systems   Military   Average
1                 120        380     675         800          1,008     3,291      1,046
10                240        570     1,215       1,000        1,575     5,119      1,620
100               336        855     1,475       1,540        2,016     6,581      2,134
1,000             0          1,642   2,376       1,920        2,587     8,336      3,372
10,000            0          2,554   3,861       2,944        3,553     11,232     4,829
100,000           0          4,104   6,426       4,092        5,897     16,161     7,336
Average           232        1,684   2,671       2,049        2,773     8,453      3,389
Median            240        1,248   1,925       1,730        2,302     7,459      2,753
Mode              180        1,022   1,689       1,487        2,059     6,679      2,587

Applied Software Measurement: Assuring Productivity and Quality, 2nd ed, Capers Jones, McGraw Hill, 1996

Off-The-Shelf Models

• COCOMO II
• TruePlanning
• SEER-SEM

COCOMO II

• Constructive Cost Model (COCOMO)
  – By Barry Boehm and others at the University of Southern California (USC) Center for Software Engineering (CSE)
  – First presented in Software Engineering Economics in 1981
  – Updated in 2000 to COCOMO II in Software Cost Estimation with COCOMO II
• Website: http://sunset.usc.edu

COCOMO II Inputs/Outputs

• Inputs
  – Program size (SLOC or function points)
  – Similarity of the product to previous efforts
  – Flexibility allowed with respect to other requirements and interfaces
  – Thoroughness of the design effort
  – Risk elimination
  – Development team cohesion
  – Maturity of the software development process
  – Required product reliability
  – Size of the database
  – Complexity of software operations
  – Need for reuse
  – Degree of documentation required
  – Execution time constraints
  – Storage constraints
  – Volatility of associated hardware and software
  – Analyst capability
  – Programmer capability
  – Continuity of the personnel on the project
  – Personnel experience in the application
  – Personnel experience with the platform
  – Personnel experience with the language and tools
  – Use of software development tools
  – Development locations
  – Development schedules
• Outputs
  – Development effort in person-months and schedule in months

COCOMO II Adjustments

• Calibration
  – Consolidate or eliminate statistically redundant parameters
  – Add cost drivers not in the model
  – Calibrate to existing actuals
    • The constant A and exponent E can be calibrated
    • Recommend 5 data points to calibrate the constant only, and 10 if the exponent is also adjusted
    • Use regression analysis
• Pitfalls
  – Over-reliance on the model
• Warning: if coefficients are changed through calibration, the effort multipliers and the model as a whole need to be re-validated

“Software Estimating Model Calibration,” Randall W. Jensen, Crosstalk: The Journal of Defense Software Engineering, Vol. 14, No. 7, July 2001
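Because PM = A × Size^E is linear in log space, ordinary least squares on log-transformed actuals recovers both A and E. A minimal sketch with hypothetical history (ten points, per the rule of thumb above):

```python
import math

# Hypothetical (ksloc, actual person-months) history
data = [(8, 26), (12, 42), (20, 75), (33, 130), (47, 200),
        (60, 270), (75, 350), (90, 430), (110, 540), (130, 660)]

xs = [math.log(k) for k, _ in data]
ys = [math.log(pm) for _, pm in data]
n = len(data)
xbar, ybar = sum(xs) / n, sum(ys) / n

# OLS in log space: slope is the exponent, intercept (exponentiated) the coefficient
E = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys)) / \
    sum((x - xbar) ** 2 for x in xs)
A = math.exp(ybar - E * xbar)

print(f"calibrated A = {A:.2f}, E = {E:.3f}")
```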

TruePlanning

• Owned by PRICE Systems of Mt. Laurel, New Jersey
• Automated model purchased for an annual license fee
• TrueAnalyst – model builder
• Catalog – collection of models (True S, True IT, etc.)
• TruePlanning – creates the cost estimate
• Website: http://www.pricesystems.com

True S Inputs/Outputs

• Inputs
  – Program size
  – Language
  – Application
  – Degree of new design vs. reuse
  – Required reliability
  – Operating environment
  – Interface constraints
  – Developer productivity
  – Developer experience
  – Developer tools
  – Development methodology
  – Development schedule
  – Development locations
  – Labor rates
  – Inflation rates
  – Hours/month
  – Phases (if not nominal)
• Output
  – Development effort in person-months or dollars, and schedule in months (the model provides tailorable reports)

True S Adjustments

• Calibration
  – Uses the productivity variable as the basis for calibration
    • Provide actual inputs and cost, then run the model backwards
  – Tailor the model to fit development phases, hours/month, labor rates, etc.
• Pitfalls
  – Over-reliance on the model

SEER-SEM

• Owned by Galorath Inc. of El Segundo, California
• Automated model purchased for an annual license fee
• In use for over 20 years
• Website: http://www.galorath.com

SEER-SEM Inputs/Outputs

• Inputs
  – Program size, in either SLOC or function points
  – Complexity of the software, and thus the difficulty of adding personnel
  – Personnel capability
  – Development environment, including tools, practices, resource availability, and frequency of change in the environment
  – Product development requirements, such as quality, documentation, test, and frequency of requirements change
  – Development complexity, such as language, application, and host development system
  – Target environment, such as memory, displays, and security
  – Other factors, such as schedule constraints, integration requirements, and staffing constraints
• Output
  – Development effort in person-months or dollars, and schedule (reports are available in a variety of formats)

SEER-SEM Adjustments

• Calibration
  – Compute an Effective Technology Rating (ETR)
  – Tailorable for different labor rates, phases, etc.
• Pitfalls
  – Over-reliance on the model

Other Models

• SLIM, from Quantitative Software Management
  – SLIM has been in use for over 20 years
  – http://www.qsm.com
• Cost Xpert, from Cost Xpert Group, Inc.
  – Cost Xpert has been in use since 1992
  – http://www.costxpert.com
• VERA, from Technomics
  – Contact [email protected]
• A general listing and discussion of models is on the C2 Cost site
  – Sponsored by DoD for joint government-industry use

Software State of the Art

• Object-Oriented Programming
• Object Points for Software Sizing
• IT Estimating
• Software Buzzwords
• Estimating New Architectures

Object-Oriented Benefits and Pitfalls

• Benefits (Object-Oriented Technology: A Manager’s Guide, David A. Taylor, Addison-Wesley, 1990)
  – Faster product development
  – Higher quality
  – Easier maintenance
  – Reduced cost
  – Increased scalability
  – Better information structures
  – Increased adaptability
• Pitfalls (many of the pitfalls of OO are being overcome with time and use)
  – First-time increased costs
  – Maturity of the technology and tools
  – Need for standards
  – Speed of execution
  – Limited support for large-scale modularity
  – Increased need for discipline, management, and training
  – Allows development with insufficient analysis and design

Object Points

• Used to measure the size of software developed using Integrated Computer-Aided Software Engineering (ICASE) tools
  – CASE tools “provide the engineer with the ability to automate manual activities and to improve engineering insight” (Software Engineering: A Practitioner’s Approach, 3rd ed., Roger S. Pressman, McGraw Hill, Inc., 1992)
    • Includes Graphical User Interface (GUI) generators, design tools, repositories for managing reusable components, etc.
  – Integrated CASE tools provide a whole development environment
• Counts the number of screens, reports, and third-generation-language modules for basic sizing
• Each count is weighted for complexity, added up for a total count, then adjusted for reuse (Software Cost Estimation with COCOMO II, Boehm et al., Prentice Hall PTR, 2000)

Object Points Issues

• Advantages
  – User-interface oriented
  – Less subjective, easier calculations
  – Promising measure for ERP implementations
• Disadvantages
  – Cannot be counted until the end of design
  – Not widely utilized, hence validated productivity metrics are unavailable

Information Technology (IT) Estimating

• Automated Information Systems (AIS)
• Enterprise Resource Planning (ERP)

Unit IV - Module 12 97

© 2002-2013 ICEAA. All rights reserved. PresentedAutomated at the 2017 ICEAA Professional Development & TrainingInformation Workshop Systemswww.iceaaonline.com/portland2017 v1.2 (AIS) Estimating • An AIS is an acquisition program that acquires Information Technology – Excludes weapon systems and tactical communication systems • Primarily software “development” in nature – Development/modification of non-COTS and COTS – Integration of COTS • Little-to-no hardware development since COTS • Minimal contractor cost data reporting

– Some CPR-like info
– No CCDRs
(“Assessment of OSD Cost Estimating Capabilities,” Cheshire, Leonard, DODCAS 2001)
• Rapid technology advancement translates into rapid technical baseline (i.e., CARD) obsolescence
(“Life cycle cost (LCC) estimating for large management information system (MIS) software development projects,” T.M. Lesnoski, IEEE 1992 National Aerospace and Electronics Conference, vol. 3, 18-22 May 1992)

“Automated Information Systems: Cost Estimating Methods and Metrics,” Fersch, Geier, Rosa, Wallshein, SCEA 2012
Unit IV - Module 12 98

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Enterprise Resource Planning (ERP) Estimating
• An ERP system is a single business support system that provides for a variety of business functions
• An ERP estimate must include costs for:
– Gap Analysis
– Business Process Re-engineering (BPR)
– COTS Integration
– Custom Code Development
– Model Configuration
• However, ERP systems are still in their infancy relative to custom code development

• Typical cost drivers include (“ERP: An Emerging Paradigm,” Nethery, Wiley, SCEA, June 2005)
– Dollars spent on the ERP package
– Number of requirements
– RICE size: the number of Reports, Interfaces, Conversions, and Extensions (see the sizing sketch below)
• The cost drivers may also be “human” elements
– How happy is the user with the current system?
– How much pain would the user experience if the new system looked different?

“Demystifying Major Automated Information System Programs,” O’Brien, Cummings, SCEA, June 2002
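To illustrate how the RICE drivers above might feed a size input, here is a hedged sketch; the relative weights are hypothetical placeholders, since in practice they would come from calibrated ERP data or vendor-specific benchmarks:

```python
# Illustrative RICE-object roll-up for ERP sizing. The weights below
# are hypothetical placeholders, not published factors.
RICE_WEIGHTS = {"report": 1.0, "interface": 3.0, "conversion": 2.5, "extension": 4.0}

def rice_size(counts):
    """counts: dict mapping RICE object type to the number of objects."""
    return sum(RICE_WEIGHTS[obj] * n for obj, n in counts.items())

size = rice_size({"report": 40, "interface": 12, "conversion": 8, "extension": 15})
print(size)  # weighted RICE count, usable as the size input to an ERP CER
```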

Unit IV - Module 12 99

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 AIS vs. ERP
• The difference between AIS and ERP systems is quantifiable
• Comparison of costs associated with traditional AIS projects versus ERPs:

Cost Element                                  Trad    ERP    % Delta
Program Management                             15%    10%      -35%
Concept Exploration/BPR                         3%    13%      306%
Systems Engineering / System Implementation    52%    40%      -23%
System Procurement                             17%    17%        1%
Other                                          13%    20%       54%
Total                                         100%   100%

“Demystifying Major Automated Information System Programs,” O’Brien, Cummings, SCEA, June 2002
– Source: NCCA Factors
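A quick sketch of how these factor splits might be applied to reallocate a top-level estimate; the program total is hypothetical, and the computed deltas differ slightly from the table because the published shares are rounded:

```python
# Reallocating a top-level estimate across cost elements using the
# traditional-AIS vs. ERP percentage splits from the table above.
TRAD = {"Program Mgt": 0.15, "CE/BPR": 0.03, "SE/SI": 0.52,
        "Procurement": 0.17, "Other": 0.13}
ERP  = {"Program Mgt": 0.10, "CE/BPR": 0.13, "SE/SI": 0.40,
        "Procurement": 0.17, "Other": 0.20}

total = 100.0  # $M, hypothetical program total
for elem in TRAD:
    t, e = TRAD[elem] * total, ERP[elem] * total
    # Deltas are recomputed from the rounded shares, so they differ
    # slightly from the published % Delta column.
    print(f"{elem:12s} trad ${t:5.1f}M  ERP ${e:5.1f}M  delta {(e - t) / t:+.0%}")
```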

Unit IV - Module 12 100

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2

Software Buzzwords • Firmware • Agile Development • Software as a Service (SaaS) • Service Oriented Architecture (SOA) • Interoperability • Net-Centric Operations • Data-Centric Architectures • Open Source Software

Unit IV - Module 12 101

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Software Buzzwords – Firmware and Agile
• Firmware
– A computer program that is embedded in a hardware device, for example a microcontroller
– As its name suggests, firmware is somewhere between hardware and software
• Like software, it is a computer program executed by a microprocessor or a microcontroller
• But it is tightly linked to a piece of hardware and has little meaning outside of it
• Agile Development (AKA Scrum)
– There are many methods of Agile Development
– Most methods of Agile Development seek to minimize risk by developing software in short iterations
• Each iteration is a full software development cycle
• A typical iteration lasts between 2 and 4 weeks
• At the end of an iteration, the product can be reviewed and evaluated by the customer for feedback
– Agile development stresses teamwork and face-to-face communication
“Agility XL,” Schaaf, R.J., Systems and Software Technology Conference 2007, Tampa, FL, 2007
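For rough planning purposes, the iteration arithmetic implied above can be sketched as follows; the backlog size and team velocity are hypothetical:

```python
import math

def iterations_needed(backlog_points, velocity_per_iteration):
    """Simple planning arithmetic: iterations = backlog / velocity, rounded up."""
    return math.ceil(backlog_points / velocity_per_iteration)

backlog = 240   # story points remaining, hypothetical
velocity = 30   # points completed per iteration, hypothetical team velocity
n = iterations_needed(backlog, velocity)
print(f"{n} iterations -> {n * 2} to {n * 4} weeks at 2-4 weeks per iteration")
```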

Unit IV - Module 12 102

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Software Buzzwords – SaaS and SOA
• Software as a Service (SaaS) (AKA On-Demand Software)
– Hosting of software applications on a server that can be accessed by the user via the web
– Examples include:
• Web mail, mapping services, conferencing solutions, NetSuite, and QuickBooks
• Service-Oriented Architecture (SOA)
– The underlying structure, grouped by business process, supporting interoperability between services
– A standards-based (e.g., Extensible Markup Language (XML) messaging, Simple Object Access Protocol (SOAP), etc.) software architecture consisting of an application front-end, services, and an enterprise-level service bus
– Often characterized by:
• Rapid development and integration of new capabilities

• Flexible architecture
• Agile mission execution
• Governance
• Workflow
• Loosely coupled interfaces

“Service Oriented Architectures: How Is SOA Estimated?,” Snyder, McDonald, SCEA/ISPA, 2007
“SE/IT/PM Estimating Approaches for Service-Oriented Architecture Environments,” Snyder, Eckberg, SCEA/ISPA, 2008

Unit IV - Module 12 103

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Software Buzzwords – Interoperability and NCO
• Interoperability
– The capability to communicate, execute programs, or transfer data among various functional units in a manner that requires the user to have little or no knowledge of the unique characteristics of those units (ISO/IEC 2382-01, Information Technology Vocabulary)
• Network-Centric Operations (NCO)
– Seeks to translate an information advantage, enabled in part by information technology, into a competitive warfighting advantage through the robust networking of well-informed, geographically dispersed forces
• Combined with changes in technology, organization, processes, and people, this networking may allow new forms of organizational behavior
– Specifically, the theory contains the following four tenets in its hypotheses:
• A robustly networked force improves information sharing;
• Information sharing enhances the quality of information and shared situational awareness;
• Shared situational awareness enables collaboration and self-synchronization, and enhances sustainability and speed of command; and
• These, in turn, dramatically increase mission effectiveness

“The Implementation of Network-Centric Warfare,” Department of Defense, Washington D.C., 2005.

Unit IV - Module 12 104

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Software Buzzwords – Data-Centric Architectures
• Designed to maximize the usefulness and accessibility of data enterprise-wide
• The goal is to synchronize data, improve data quality, and deliver accurate, consistent data to transactional and operational systems
• Databases, COTS tools, ERPs, or other systems can be used to enable a data-centric architecture
• Pitfalls of non-data-centric architectures
– Erroneous, inconsistent, and obsolete data slow business processes and disrupt automation
– Data and information remain in isolated silos and do not reach across the organization or to important decision makers

“Data-Centric Architectures,” Doug Dineley, InfoWorld, March 11, 2005.

Unit IV - Module 12 105

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Software Buzzwords – Open-Source Software (AKA FOSS, Free and Open-Source Software)
• A development method characterized by the free redistribution of software, inclusion of source code, allowance of modifications and derived works, and non-restrictive licensing
• Promotes peer review and transparency of process to achieve better quality, higher reliability, more flexibility, and lower cost, and to prevent vendor lock-in
• Maintained by volunteer programmers
• Some common examples of open-source products are:
– Apache HTTP Server
– The Internet address system, Internet Protocol
– The Mozilla Firefox Internet browser
– GNU Emacs, an extensible, customizable text editor
– The Linux operating system
• One of the most successful open-source programs
• A Unix-like operating system designed to provide personal computer users a free or very low-cost operating system
• Known for efficiency and fast performance

www.OpenSource.org

Unit IV - Module 12 106

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2

Estimating Other Development Methodologies • Incremental • Evolutionary • Spiral

Unit IV - Module 12 107

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Incremental

• Software is built in increments; complete requirements for the entire system are defined up front and allocated to the increments
• Increments are normally sequential but can be concurrent
• Each increment includes design, code, and test for the requirements in that increment

[Figure: Incremental Method – system requirements analysis and system design feed Increments 1-3; each increment proceeds through preliminary design, detailed design, code and test, and CSCI integration and test, converging in system-level integration and system test]

Benefits:
• Increased communication
• More frequent and faster deliveries

Pitfalls:
• Requirements must be defined up front
• Need a sound architecture
• Only deliver a small part of a system at a time

Quantitative Management of Software, Graduate School of Logistics and Acquisition Management, Air Force Institute of Technology, Air University, 1995

Unit IV - Module 12 108

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Modeling Incremental
• Model as multiple Waterfalls (see the sketch below)
– Model each increment as a separate Waterfall; use the effort estimated from CSCI design through test
• For system costs, model the entire system as a single Waterfall and use only system-level costs such as requirements analysis and system test
• Increments may be at a lower level, with the CSCI treated as the system level
• If increments are sequential:
– May need to adjust productivity for later increments
– May need to estimate system test after each increment is delivered, including only those parts of the code being tested
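A minimal sketch of this multiple-run roll-up, assuming a COCOMO II-style CER of the form PM = A × KSLOC^E; the coefficients, the productivity penalty for later increments, and the system-level share are illustrative placeholders, not calibrated values:

```python
# One CER run per increment (CSCI design through test) plus one
# system-level run for requirements analysis and system test.
# A and E follow the COCOMO II form PM = A * KSLOC**E; all numeric
# values here are illustrative, not calibrated.
A, E = 2.94, 1.10

def run_cer(ksloc, productivity_adj=1.0):
    return A * (ksloc ** E) * productivity_adj

increments = [0.5, 0.5, 0.5]  # KSLOC of new/glue code per increment
csci_effort = sum(
    run_cer(size, productivity_adj=1.0 + 0.05 * i)  # later increments cost slightly more
    for i, size in enumerate(increments)
)
system_effort = run_cer(sum(increments)) * 0.30  # system-level share only (assumed)
print(f"Total effort = {csci_effort + system_effort:.1f} PM")
```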

We’ll look again at our example for Incremental

Unit IV - Module 12 109

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Incremental Example
• Recommendation of second consultant
– Run 1 – CSCI Increment 1
• COTS package 1
• COTS package 2
• COTS package 3
• “Glue” code, 500 SLOC
– Run 2 – CSCI Increment 2
• COTS module 1
• Customer Mgt code, 500 SLOC
– Run 3 – CSCI Increment 3
• COTS module 3
• Customer Mgt, 500 SLOC
(For Runs 1-3, use the output for CSCI design through test)
– Run 4 – Total system requirements analysis, design, and test
• CSCI Increment 1
• CSCI Increment 2
• CSCI Increment 3
(Use the output for system requirements analysis, design, and test)
• Add the CSCI output from Runs 1-3 to the system output from Run 4 to get total effort

Unit IV - Module 12 110

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Evolutionary

• Begins with a prototype containing core capability. Customer provides feedback; prototype is adjusted and additional capability added. Process is repeated until the system is complete

Evolutionary:
• Need general objectives, not requirements, to start
• Prototypes may be paper, a software model, a working product, or an existing product

[Figure: Evolutionary Method – system requirements and design feed repeated CSCI passes, each with requirements analysis, preliminary design, detailed design, code and CSU test, CSC integration and test, CSCI test, and system test]

Benefits:
• Gets a product to the customer quickly and encourages customer involvement

Pitfalls:
• More time-consuming than other methods for the final product
• Must have a plan for execution even without complete requirements

Quantitative Management of Software, Graduate School of Logistics and Acquisition Management, Air Force Institute of Technology, Air University, 1995

Unit IV - Module 12 111

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Modeling Evolutionary
• Model as multiple Waterfalls
– Model each pass as a separate Waterfall, including the previous pass as reused and/or adapted and deleted code
• Include all phases (system requirements through system test) in each pass, but make adjustments for reused and adapted code (see the adjustment sketch below)
– Passes are sequential; therefore, may need to adjust productivity for later passes
• Have to determine what will be done in each pass even though requirements are not complete

We’ll look again at our example for Evolutionary

Unit IV - Module 12 112
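The reused/adapted-code adjustment called for above is often handled with the classic COCOMO adaptation adjustment factor, AAF = 0.4·DM + 0.3·CM + 0.3·IM (fractions of design modified, code modified, and integration/test required). A hedged sketch with hypothetical inputs:

```python
# ESLOC for an Evolutionary pass: new code plus the previous pass
# carried forward as adapted code, discounted by the classic COCOMO
# adaptation adjustment factor. All inputs are hypothetical.
def esloc(new_sloc, adapted_sloc, dm, cm, im):
    aaf = 0.4 * dm + 0.3 * cm + 0.3 * im
    return new_sloc + adapted_sloc * aaf

# Pass 2: 500 new SLOC plus a 2,000-SLOC Pass-1 prototype as adapted code
print(esloc(new_sloc=500, adapted_sloc=2_000, dm=0.10, cm=0.20, im=0.50))
```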

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Evolutionary Example
• Recommendation of third consultant
– Run 1: First Evolution or Pass
• Customer Order Management System – Core Capabilities
• Prototype Interface
• COTS Package 1
(Adjust factors to reflect that this is only a prototype)
– Run 2: Second Evolution or Pass
• Reused code: Customer Order Mgt System, COTS Packages
• COTS Package 2
• Adapted code: Prototype Interface from Pass 1
• New code: Built-in double checks
(Adapted code – use ESLOC or make adjustments for reduced design, code, and test)
– Run 3: Third Evolution or Pass
• Re-tested code: Customer Order Mgt System, COTS Packages
• Adapted code: Prototype Interface from Pass 2
• New code: Auto-generated notification
(Reused code is treated like COTS – included for integration and re-test)

Unit IV - Module 12 113

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Spiral

• Breaks the effort into pre-defined spirals to allow for risk assessment
• Breaks the effort into 4 quadrants: (1) determine objectives, alternatives, and constraints; (2) evaluate alternatives and identify and resolve risks; (3) develop the next-level product; and (4) plan the next phase
• Uses other methods and adds risk review
– Waterfall
– Incremental
– Evolutionary

[Figure: Spiral Method – Boehm’s spiral diagram; successive spirals traverse the four quadrants with risk analysis and prototyping on each pass, progressing from concept of operation and requirements through design, code, integration and test, and training to IOC and FOC delivery]

Benefits:
• Emphasizes alternative analysis
• Risk-driven approach

Pitfalls:
• Hard to use contractually
• Takes longer to develop

Software Engineering, A Practitioner’s Approach, 3rd ed., Roger S. Pressman, McGraw-Hill, Inc., 1992

Unit IV - Module 12 114

© 2002-2013 ICEAA. All rights reserved. Presented at the 2017 ICEAA Professional Development & Training Workshop www.iceaaonline.com/portland2017 v1.2 Modeling Spiral
• Model as multiple passes, similar to Evolutionary (see the sketch below)
– Model each spiral as a separate pass, but include the previous spiral as reused and/or adapted code
• Include only those phases actually addressed in that spiral, and make adjustments for reused and adapted code
– For example, in the spiral diagram, the second spiral has only the software requirements specification and the system software specification
– Later spirals have just code and test
• Spirals are sequential; therefore, may need to adjust productivity for later passes
– If the model or CER doesn’t accommodate spiral development, may need to add effort for risk assessment, planning, and analysis of objectives

Unit IV - Module 12 115
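One hedged way to sketch the per-spiral bookkeeping described above; the phase effort shares, the risk-activity add-on, and the productivity adjustments for later spirals are illustrative placeholders, not calibrated values:

```python
# Each spiral is a separate pass that includes only the phases actually
# worked in that spiral, plus an add-on for risk assessment, planning,
# and analysis of objectives. All numeric values are illustrative.
PHASE_SHARE = {"rqmts": 0.10, "design": 0.25, "code": 0.35, "test": 0.30}

def spiral_effort(full_pass_pm, phases, risk_addon=0.05, productivity_adj=1.0):
    share = sum(PHASE_SHARE[p] for p in phases)
    return full_pass_pm * share * productivity_adj * (1.0 + risk_addon)

full_pass_pm = 20.0  # effort if one pass covered every phase, hypothetical
total = (
    spiral_effort(full_pass_pm, ["rqmts"])                              # specs only
    + spiral_effort(full_pass_pm, ["design"], productivity_adj=1.05)
    + spiral_effort(full_pass_pm, ["code", "test"], productivity_adj=1.10)
)
print(f"Total effort = {total:.1f} PM including the risk add-on")
```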

© 2002-2013 ICEAA. All rights reserved.