
Software Cost Estimation for the LADEE Mission

Howard Cannon
Intelligent Systems Division
NASA-Ames Research Center
Moffett Field, CA 94035
howard.n.cannon@.gov

Karen Gundy-Burlet
Intelligent Systems Division
NASA-Ames Research Center
Moffett Field, CA 94035
[email protected]

Abstract—The purpose of the Lunar Atmosphere Dust Environment Explorer (LADEE) mission was to measure the density, composition and time variability of the lunar dust environment. The ground-support and onboard flight software for the mission was developed using a "Model-Based Software" methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight Software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The auto-generated code is then integrated with the Core Flight Executive and Core Flight Services (cFE/cFS) packages, VxWorks and appropriate board support packages. The generated software is then tested in real-time Processor-in-the-Loop and Hardware-in-the-Loop test beds.

Software cost estimation for the mission was performed in three ways: 1) extrapolated from development of an Earth-based prototype hover-test vehicle, 2) estimated through the Goddard Space Flight Center "mission design center", and 3) through the use of COCOMO-based estimation spreadsheets. In this paper, we discuss the characteristics of each of the cost estimation methods, and how they were tuned for a model-based development effort rather than a traditional effort. The estimates are also compared with actual costing and trend data for the LADEE Flight Software effort.

Keywords—software cost estimation, model-based software development

TABLE OF CONTENTS

I. Introduction
II. Software Architecture
III. Software Development Process
IV. Software Size Estimation
V. COCOMO Based Effort Estimation
VI. Bottoms Up Level of Effort
VII. IMDC Effort
VIII. Results and Discussion
IX. Conclusions
References

I. INTRODUCTION

The Lunar Atmosphere Dust Environment Explorer (LADEE) [1] was a small NASA Explorer-class mission that launched on September 6, 2013 and was successfully deorbited on April 18, 2014. It orbited the Moon to determine the density, composition and variability of the lunar exosphere. To minimize fabrication and design costs, the "modular common bus" spacecraft was designed with common structural components that could be connected together to form the spacecraft. LADEE was formed of four such modules, housing the propulsion, avionics and three instruments for observation of the Moon.

LADEE's physical construction proceeded down a low-cost, rapidly prototyped product-line path, and a similar philosophy drove the development of the software base. With strict launch deadlines, budget limitations and a short 100-day mission plan, the flight software team surveyed successful examples of rapidly deployed small-spacecraft missions to discover best practices. Of these, the AFRL XSS-10 and XSS-11 spacecraft utilized model-based development methodologies, and we proceeded with a trade study between modeling languages using a hover test vehicle [2]. One advantage of model-based development is that controls engineers are able to rapidly prototype algorithms in their "native language" and software engineers can directly auto-code the models. This minimizes the opportunity for communication errors between algorithm designers and qualified software developers. The model-based methodology also enhances early prototyping of requirements, enables validation and verification during early stages of development, and provides a common platform for communication between subsystems, software engineers and stakeholders.

One difficulty with model-based software development is that while there were anecdotal cases like XSS-10/11, there was limited literature on how to modify cost estimation to take into account the reputed savings of the technique. An additional factor in the uncertainty of the mission cost was the background of the team: experienced software, systems and GN&C researchers and engineers, but without prior experience in developing onboard flight software for spacecraft. To minimize budget risk, we researched best practices in cost estimation, and found extensive literature on cost estimation

techniques [3]. We utilized three different cost estimation techniques, including model-based parametric techniques, expert judgment estimation, and analogy from internal prototyping efforts.

U.S. Government work not protected by U.S. copyright.

II. SOFTWARE ARCHITECTURE

For the LADEE mission, the flight software was developed utilizing a technique known as "Model-Based Software Development." In this technique, models of the spacecraft and high-level flight control software are developed in a graphical modeling language, such as Mathworks' Simulink [4]. Flight Software requirements are prototyped, refined and partially verified using the simulated models. After the model is shown to work as desired in this simulation framework, "C"-code software is automatically generated from the models. A Simulink Interface Layer then wraps the auto-generated code to communicate across the bus with the Core Flight Executive (cFE), Core Flight Software (cFS) and Operating System Abstraction Layer (OSAL) [5,6] GOTS components developed at NASA GSFC. These are then layered on VxWorks [7] and a Board Support Package (BSP) for the BroadReach Integrated Avionics Unit (IAU). The BSP was customized to be compliant with the Portable Operating System Interface (POSIX) standard defined by the IEEE Standards Association [8].

Figure 1 shows the integrated architecture for the LADEE mission. The yellow circles indicate control functions that were developed in Simulink and auto-coded for integration with the other elements. Blue circles indicate the tasks that were reused from the cFS package. While the cFS code was wholly reused, the tasks were customized for the LADEE spacecraft through modifications to tables and headers. The circles in green represent tasks that were newly developed hand-code, including hardware drivers for all sensors, actuators and payloads, as well as commanding, telemetry and memory scrub functions. The modules communicate across the bus using the cFE "publish/subscribe" paradigm. A feature of the LADEE software is the implementation of POSIX standards within the device driver layers.
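The publish/subscribe paradigm mentioned above can be illustrated with a minimal sketch. This is a generic illustration of the messaging pattern only, not the actual cFE Software Bus API; the message id and payload shown are hypothetical.

```python
from collections import defaultdict

class SoftwareBus:
    """Minimal publish/subscribe bus, illustrating the pattern only,
    not the actual cFE Software Bus API."""

    def __init__(self):
        self._subscribers = defaultdict(list)  # message id -> callbacks

    def subscribe(self, msg_id, callback):
        self._subscribers[msg_id].append(callback)

    def publish(self, msg_id, payload):
        # Deliver to every task subscribed to this message id; the
        # publisher does not know who (if anyone) is listening.
        for callback in self._subscribers[msg_id]:
            callback(payload)

# Hypothetical message id and tasks, for illustration.
bus = SoftwareBus()
received = []
bus.subscribe("HK_TLM", received.append)    # housekeeping consumer
bus.publish("HK_TLM", {"battery_v": 28.1})  # producer task
print(received)  # [{'battery_v': 28.1}]
```

The value of the pattern for an architecture like Figure 1 is that auto-generated, reused, and hand-written tasks can all be added or replaced without the producers knowing about the consumers.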

Figure 1. Flight Software Architecture. (Block diagram: auto-generated Simulink control tasks (Cmd & Mode Processor, Attitude Control, Thermal Control, Power Control, Battery Charge, Actuator Manager, State Estimator, Safe Mode Controller), reused cFS tasks (Scheduler, Stored Commands, Memory Manager, Memory Dwell, Limit Checker, Housekeeping, Health & Safety, File Manager, CCSDS File Delivery, Data Storage, Checksum), and hand-written tasks (Command Ingest, Telemetry Output, Memory Scrub, Hardware I/O), connected by the Software Bus and layered on GSFC GOTS components (OSAL, cFE, cFS, ITOS), Broad Reach drivers (MOTS), and Simulink/Matlab and VxWorks (COTS).)


Figure 2. Agile Software Development Process

III. SOFTWARE DEVELOPMENT PROCESS

Key to the software estimation effort was the implementation of the agile model-based software development process shown in Figure 2. For LADEE, the model-based development effort using the Workstation Simulation (WSIM) environment allowed early verification that the algorithms and software design met the detailed control requirements. Software would then be generated in the "C" programming language, integrated with the other software layers, and tested in real-time Processor-in-the-Loop (PIL) and Hardware-in-the-Loop (HIL) environments. One HIL test-bed was denoted the "Travelling Road Show", and could be taken to payload development sites so that early integration testing could be performed. Once the data was generated from all of the various test-beds, it would be integrated into an automated reporting system that provided bidirectional traceability between requirements, models or code, and the test results. The report system also automatically generated a requirements verification matrix that was used to focus the development effort. Verification of each level 4 requirement was assigned to one of the first four of the five major build development cycles as part of the concept of operations and software build specifications. The final build cycle was reserved for defect reduction for items found in spacecraft Integration and Test (I&T).

IV. SOFTWARE SIZE ESTIMATION

A key component of parametric cost estimation is an estimate of Source Lines of Code (SLOC). LADEE utilized the NASA Marshall (MSFC) "Source Lines of Code Estimator" spreadsheet, which quantifies the source lines of code to be developed given adjective descriptions of the size and complexity of each unit. An image of the detailed results from the spreadsheet is provided in Table 1. Comparing the software architecture in Figure 1 to the categories in the SLOC spreadsheet reveals that the LADEE architecture did not map one-to-one with the general categories in the MSFC spreadsheet. For instance, the Safe Mode Controller (SMC) and Attitude Control System (ACS) both map into "Attitude Control" and "Gyro Package Management", but "Sun Sensing" is part of the SMC system alone. The State Estimation Module (EST) has aspects of both "Star Tracker" and "Attitude Determination". Other systems map more closely, such as Command Ingest (CI) and Telemetry Output (TO) with "Command Uplink" and "Data Downlink". It can be seen that the Memory Scrub software was not included in the MSFC worksheet estimate at all, because it was not identified as a requirement at PDR.
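The adjective-sizing approach of a spreadsheet like the MSFC estimator can be sketched as a calibration table plus a summation. The table values below are illustrative assumptions, not the spreadsheet's actual calibration (only a few points, such as the 5000/7500 SLOC attitude determination figures and the roughly 1000/1500 SLOC uplink figures quoted elsewhere in this paper, come from the real tool); the one-size-up/one-size-down bracketing mirrors how the LADEE team generated high and low totals.

```python
SIZES = ["Small", "Medium", "Large", "X-Large"]

# Hypothetical calibration: category -> adjective size -> SLOC.
SLOC_TABLE = {
    "Attitude Determination": {"Small": 1500, "Medium": 3000,
                               "Large": 5000, "X-Large": 7500},
    "Command Uplink":         {"Small": 500,  "Medium": 750,
                               "Large": 1000, "X-Large": 1500},
}

def estimate(units):
    """Return (low, likely, high) total SLOC for a list of
    (category, adjective) units; low/high step each unit one
    size down/up, clamped at the ends of the scale."""
    def sloc(cat, adj):
        return SLOC_TABLE[cat][adj]

    def shifted(adj, step):
        i = min(max(SIZES.index(adj) + step, 0), len(SIZES) - 1)
        return SIZES[i]

    likely = sum(sloc(c, a) for c, a in units)
    low = sum(sloc(c, shifted(a, -1)) for c, a in units)
    high = sum(sloc(c, shifted(a, +1)) for c, a in units)
    return low, likely, high

units = [("Attitude Determination", "Large"), ("Command Uplink", "Large")]
print(estimate(units))  # (3750, 6000, 9000)
```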


Table 1. SLOC estimate from MSFC estimation spreadsheet.

While the mismatch in categories causes some difficulty for the analysis, a comparison between the actual software SLOC and the MSFC estimates can be performed for some units. Table 2 provides the software SLOC for the final flight software version for LADEE. The SLOC count has been computed using David Wheeler's "sloccount" [9]. Care had to be taken to not include auto-generated test functions for the Simulink modules. The MSFC spreadsheet predicted approximately 1000 lines of code for each of the hand-generated CI and TO modules, which represented a small overestimate for CI (807) and a moderate underestimate for TO (1405). This may be because the estimated complexity for the TO routine was too low; indeed, increasing the complexity to "X-Large" would have increased the SLOC estimate to 1500, very close to the final outcome. The basis for the SLOC count in the MSFC spreadsheet is also unknown, and a mismatch in counting methodology could have an impact on these comparisons.

Looking at a representative auto-generated unit, TCS was reasonably estimated at 1000 vs. 1139 actual SLOC. This unit was largely composed of embedded Matlab components, so it is an interesting case for comparing the size of the auto-code vs. the hand-coded part of the unit. In particular, the embedded Matlab scripts for the unit had only 120 SLOC, while the main Simulink function contained 680 SLOC, and the normally generated wrappers comprised the rest of the code. So, in these cases, even for auto-code, the estimates were quite good.

The main GN&C control functions were significantly underestimated for the flight software. Attitude determination was estimated as 5000 SLOC, while the actual was 8372. Had we categorized the function as "X-Large", the estimate would have been 7500, and in retrospect this would have been a better choice, but at PDR we assumed the operation of the star tracker would be straightforward, when indeed it became quite complex. The SMC and ACS modules would both be included in the estimate of attitude control functions, with actual SLOC of 8628 well in excess of the "X-Large" estimate of 6500 at PDR. The biggest discrepancy in the SLOC estimate was the underestimation of the hardware I/O drivers and the memory scrub module needed for the mission. They comprised about 12500 SLOC, which is a significant portion of the difference between the estimate of 23500 shown in Table 3 and the actual newly developed 41430 SLOC. The newly developed SLOC greatly exceeded even our high estimate of 29500. To generate the high and low estimates, we stepped the size of each unit one category up and one category down from the most likely values we chose.

V. COCOMO BASED EFFORT ESTIMATION

The SLOC estimates developed in the previous section were used as part of a COCOMO-based [10] effort estimation using the Jet Propulsion Laboratory (JPL) Software Cost Analysis Tool (SCAT) spreadsheet [11]. This spreadsheet utilizes the SLOC estimates along with other parameters such as software complexity, skill of the team, percentage of software reuse, and many more. The effort multipliers and scale factors are set qualitatively from low to high, with descriptive language accompanying each factor that is useful for a novice trying to tune the parameters appropriately. Once the parameters are tuned, one can then generate Monte-Carlo results for the total effort and cost likely needed for the mission. As seen in Figure 3, at PDR the 70% estimate of the effort to develop the Class B software was approximately 28 Full Time Equivalents (FTE), from development of the software requirements through the software I&T period. This did not include support for mission operations.

As a thought experiment on the impact of software reuse, we modified the numbers in Table 3 such that the adapted SLOC line was reduced by 40K and the new SLOC line was increased by 40K, leaving all other factors the same. This somewhat simulates the situation of re-developing the cFE/cFS/OSAL services, from our estimate of the software size at PDR. Surprisingly, the predicted work effort only increased to approximately 29.5 FTE. This is counter-intuitive, as we do not believe that we could achieve the same level of services and quality by redeveloping the executive services layer rather than reusing it. It more than likely reflects the


Table 2. Software SLOC for final flight software load.

CSU  | Description                        | Category                          | SLOC
TCS  | Thermal Control System             | Auto-generated, New               | 1139
PCS  | Power Control System               | Auto-generated, New               | 2201
BCS  | Battery Control System             | Auto-generated, New               | 2139
CMP  | Command Mode Processor             | Auto-generated, New               | 1655
ACT  | Actuator Manager                   | Auto-generated, New               | 2498
ACS  | Attitude Control System            | Auto-generated, New               | 4394
EST  | State Estimator                    | Auto-generated, New               | 8372
SMC  | Safe Mode Controller               | Auto-generated, New               | 4234
CI   | Command Ingest                     | Hand Generated, New               | 807
TO   | Telemetry Output                   | Hand Generated, New               | 1405
MS   | Memory Scrub                       | Hand Generated, New               | 2785
HWIO | Collection of Hardware I/O Drivers | Hand Generated, New               | 9801
GOTS | cFE/cFS/OSAL components            | Some Adaptation                   | 66771
OS   | VxWorks/Drivers                    | OS: Full Reuse, Drivers Modified  | 37694

New Development Total: 41430
Overall Total: 145895
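The totals row of Table 2 can be checked by summing the per-unit counts; the combined SMC+ACS figure of 8628 SLOC cited in Section IV falls out of the same data.

```python
# Per-unit SLOC from Table 2, split by development category.
new_sloc = {
    # Auto-generated, new:
    "TCS": 1139, "PCS": 2201, "BCS": 2139, "CMP": 1655,
    "ACT": 2498, "ACS": 4394, "EST": 8372, "SMC": 4234,
    # Hand generated, new:
    "CI": 807, "TO": 1405, "MS": 2785, "HWIO": 9801,
}
reused_sloc = {"GOTS cFE/cFS/OSAL": 66771, "OS VxWorks/Drivers": 37694}

new_total = sum(new_sloc.values())
dev_total = new_total + sum(reused_sloc.values())
print(new_total, dev_total)  # 41430 145895
```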

Table 3. Estimates of New, Modified and Adapted SLOC at PDR.

relatively high cost estimate of adapting and testing re-used software.

Another interesting experiment is to take the final flight software SLOC counts from Table 2, with all other factors remaining the same in the SCAT sheet, and redo the Monte-Carlo calculation. In this case, the 70% level of effort is estimated as 33 FTE. A final experiment is to evaluate the spreadsheet for the case where the entire software SLOC count is listed as new, with no reuse at all. In this case, the work effort is estimated at 70 FTE.

VI. BOTTOMS UP LEVEL OF EFFORT

In this approach, we developed a detailed Work Breakdown Structure (WBS) for the project based on our experience with the hover test vehicle rapid prototyping process. For

each Computer Software Component (CSC), we estimated the number of Computer Software Units (CSU) that would be required to implement the functionality needed. Based on our prototyping effort to develop the hover test vehicle, we estimated that it would take 10 hours to implement each basic unit in either Simulink or hand code. The 10 hours did not include the development of the requirements, design, integration, test or base-lining of the unit. We then took a poll across the project management, soliciting estimates of the multipliers applied to the base development to perform the rest of the required software activities. Estimates for the multiplier ranged from a low of 5 to a high of 20 hours per hour of actual software development. These were then calibrated against the hover test effort to settle on a multiplier of 7.7.

Project management activities were estimated separately from the sheer software development effort. For project management, including documentation, quality system setup and top-level system engineering, we estimated the hours of effort by each discipline lead for each Work Breakdown Structure (WBS) element on a per-build, per-document or as-necessary basis. For instance, we estimated 16-24 hours per discipline lead per major review (such as the Preliminary Design Review). This turned out to be a significant underestimate, as the effort usually turned out to be more than a week per person. Another major effort that was not captured in our estimates was the hundreds of hours required for certification and recertification at Capability Maturity Model (CMMI) [12] Level 2 over the course of the mission. The program management related estimates were integrated with the software effort estimates and added up in a master spreadsheet. Through this process, we estimated the total software development effort from the "Bottoms Up" estimate to require approximately 27 FTEs.

VII. IMDC EFFORT

The Flight Software team also participated in an early costing exercise for the overall LADEE mission that was conducted by the Integrated Mission Design Center (IMDC) at NASA Goddard Space Flight Center. The IMDC performs rapid concept studies for future missions using a skilled resident engineering team working closely with the customer team in a collaborative, concurrent design environment. As part of the costing exercise, IMDC personnel examined the types of functionality that needed to be provided for the mission by the Flight Software. The team elicited mission and spacecraft parameters, such as numbers of sensors and actuators, software reuse, CPU type and capabilities, test-bed requirements, etc. The spreadsheet is calibrated based on previous experience for the level of effort for each of these areas of functionality. The spreadsheet rollup breaks out the FTE effort into 6 functional areas and includes procurement estimates. The LADEE mission was estimated to take 38.5 FTEs

Figure 3. Estimation of Effort from JPL SCAT spreadsheet
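The SCAT spreadsheet's internals are not reproduced here, but the shape of a Monte-Carlo parametric estimate like Figure 3's can be sketched with a generic COCOMO II-style model. The coefficients A=2.94 and B=0.91 are the published COCOMO II.2000 nominal calibration; the scale-factor sum, the unity effort-multiplier product, and the low end of the SLOC bracket are illustrative assumptions (the paper gives only the 23.5K most-likely and 29.5K high figures for new code at PDR).

```python
import random

A, B = 2.94, 0.91          # COCOMO II.2000 nominal calibration
SCALE_FACTOR_SUM = 18.97   # illustrative sum of the five scale factors
EM_PRODUCT = 1.0           # product of effort multipliers (nominal here)

def effort_pm(ksloc):
    """COCOMO II effort in person-months: PM = A * Size^E * prod(EM),
    with E = B + 0.01 * (sum of scale factors)."""
    e = B + 0.01 * SCALE_FACTOR_SUM
    return A * ksloc ** e * EM_PRODUCT

def percentile_70(low_ksloc, likely_ksloc, high_ksloc, n=20000, seed=1):
    """Sample size from a triangular distribution and report the
    70%-confidence effort, analogous to the SCAT Monte-Carlo run."""
    rng = random.Random(seed)
    samples = sorted(
        effort_pm(rng.triangular(low_ksloc, high_ksloc, likely_ksloc))
        for _ in range(n))
    return samples[int(0.70 * n)]

# Most-likely and high new-code KSLOC from Section IV; the low value
# is a hypothetical bracket for illustration.
pm = percentile_70(18.0, 23.5, 29.5)
print(round(pm, 1), "person-months at 70% confidence")
```

Note this sketch covers only the core effort equation; SCAT also folds in reuse adjustments (equivalent SLOC for adapted code) and cost factors, which is why its FTE outputs are not directly reproducible from size alone.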

and $1.1M in procurement, from the start of development through pre-launch preparation. Post-launch estimates are rolled up into Mission Operations budgets. Details of the IMDC methodology can be found in [13].

VIII. RESULTS AND DISCUSSION

Table 4 shows the labor and overall cost that was estimated using the three methodologies previously described. As the table shows, the estimates were relatively similar, and closely match the actual costs that were recorded. One difference between the methods is that support from GSFC was counted as procurement dollars rather than FTE in the Bottoms Up and parametric estimations. It is apparent, though, that the vast majority of costs for flight software development are from personnel time rather than raw procurement. The IMDC methodology was closest, remarkably within 2% of the actual cost of the flight software effort. The higher level of accuracy for the IMDC method is somewhat expected because it was built on years of experience in cost estimation. Given perfect knowledge at PDR of the ultimate SLOC count of the software, the COCOMO parametric estimation method would have been remarkably close to the final actual effort (33 vs. 33.7 FTE). Even with the various mistakes, all three methods were relatively close and well within 10% of the ultimate cost of the software. This adds confidence that the early cost estimates were at least reasonable despite the inexperience of the team.

It should be noted that in the early stages of the project, the Flight Software and Project Management teams chose to utilize the lowest of the three estimates as an optimistic plan, while keeping sufficient reserve to meet the cost of the highest estimate should the cost of the Flight Software exceed that amount.

While the estimates are all quite close to the actuals, it should be noted that the flight software team for LADEE was very invested in the software and the well-being of the spacecraft, and worked many hours that are not reflected in the official totals. The internal software cost estimation techniques also generally under-predicted the totals, in part because we responded to management concerns about the size of the software budget. We also did not properly estimate the extraordinary amount of time used for official reviews, key decision points, and for the CMMI certification process that occurred multiple times during the development process.

IX. CONCLUSIONS

The software cost estimation process for the LADEE program produced good estimates despite the relative inexperience of the members involved. The use of the MSFC SLOC estimation spreadsheet could have been improved by making the software structure more closely aligned with the structure of the spreadsheet, but it is not appropriate to impose structure on the software architecture for the sake of the cost estimation process. The COCOMO parametric analysis is remarkably good given a decent estimate of the code size, even for a model-based development effort. This is surprising, because the effort of developing functionality in a graphical modeling environment is only loosely related to the actual SLOC count. The auto-coding template and methodology used can impact SLOC counts. There are many rumors of "software bloat" due to model-based techniques that were not borne out in this development effort, perhaps due to the templates used. In the end, the software was developed on schedule and relatively close to all of the estimates, with the IMDC clearly being the closest to actuals. Once calibrated for the particular type of development methodology used, parametric cost estimation such as COCOMO can be quick and effective. For an inexperienced team on a new-start project, it is recommended that multiple estimations be used to improve confidence in the proposed budget and schedule.

Table 4. Estimates versus Actual Costs

Method              | Estimated FTE | Estimated Procurements | Overall Cost*
Parametric Model    | 28            | $2.5 M                 | $9.5 M
Bottoms Up Analysis | 27            | $2.5 M                 | $9.3 M
IMDC Estimate       | 38.5          | $1.1 M                 | $10.7 M
Actuals             | 33.7          | $2.49 M                | $10.9 M

*Average cost for FTE assumed at $250K/yr.
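The Overall Cost column of Table 4 can be reproduced from the other two columns and the stated $250K/yr FTE rate, a quick consistency check which also suggests the FTE figures are effectively FTE-years:

```python
# (method, estimated FTE, procurement $M, overall cost $M) from Table 4.
rows = [
    ("Parametric Model",    28.0, 2.50,  9.5),
    ("Bottoms Up Analysis", 27.0, 2.50,  9.3),
    ("IMDC Estimate",       38.5, 1.10, 10.7),
    ("Actuals",             33.7, 2.49, 10.9),
]
FTE_RATE = 0.25  # $M per FTE-year (the table's $250K/yr assumption)

for name, fte, proc, overall in rows:
    computed = fte * FTE_RATE + proc
    # Each row reproduces the Overall Cost to within rounding.
    assert abs(computed - overall) < 0.1, name
    print(f"{name}: computed {computed:.2f} ~ listed {overall}")
```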


REFERENCES

[1] Delory, G.T.; Elphic, R.; Morgan, T.; Colaprete, T.; Horanyi, M.; Mahaffy, P.; Hine, B.; Boroson, D., "The Lunar Atmosphere and Dust Environment Explorer (LADEE)", 40th Lunar and Planetary Science Conference (Lunar and Planetary Science XL), March 23-27, 2009, The Woodlands, Texas, id. 2025.
[2] Hine, B.; Turner, M.; Marshall, W. S., "Prototype Common Bus Spacecraft: Hover Test Implementation and Results", NASA/TM-2009-214597.
[3] Trivailo, O., et al., "Review of hardware cost estimation methods, models and tools applied to early phases of space mission planning", Progress in Aerospace Sciences, 2012.
[4] Mathworks Corporation, http://www.mathworks.com.
[5] Wildermann, Charles P., "NASA/GSFC's Flight Software Core Flight System", Flight Software Workshop, http://flightsoftware.jhuapl.edu/, 2008.
[6] McComas, David, "NASA/GSFC's Flight Software Core Flight System", Flight Software Workshop, http://flightsoftware.jhuapl.edu/, 2012.
[7] Wind River Corporation, http://www.windriver.com/products/vxworks/.
[8] IEEE Standards Association, http://standards.ieee.org/develop/wg/POSIX.html.
[9] Wheeler, David, http://www.dwheeler.com/sloccount/.
[10] COCOMO website, http://sunset.usc.edu/csse/research/COCOMOII/cocomo_main.html.
[11] Lum, Karen, "Software Cost Analysis Tool User Document", Caltech/NASA Jet Propulsion Laboratory, 2005.
[12] Paulk, Mark C.; Weber, Charles V.; Curtis, Bill; Chrissis, Mary Beth (1995), The Capability Maturity Model: Guidelines for Improving the Software Process, SEI Series in Software Engineering, Reading, Mass.: Addison-Wesley, ISBN 0-201-54664-7.
[13] Karpati, G.; Martin, J.; Steiner, M.; Reinhardt, K., "The Integrated Mission Design Center (IMDC) at NASA Goddard Space Flight Center", Aerospace Conference, 2003, Proceedings, 2003 IEEE, vol. 8, pp. 8_3657-8_3667, March 8-15, 2003.

Biography

Howard Cannon is a Computer Engineer at NASA Ames Research Center. He served as the LADEE Flight Software Subsystem Lead throughout the formulation phase and most of the spacecraft development phase. He then served as a Flight Director in Mission Operations for the remainder of the mission. As the Flight Software Lead, he participated in the development of the estimates in order to negotiate and establish the software budget. He is currently leading the software development effort for the Resource Prospector mission. Cannon has a B.S. in Mechanical Engineering from Bradley University in Peoria, IL, and an M.S. in Robotics from Carnegie Mellon University.

Karen Gundy-Burlet is a Research Scientist at NASA Ames Research Center. She served as the LADEE Flight Software Subsystem Quality Engineer, became the Flight Software Lead, and served in Mission Operations. She led the development of the LADEE FSW V&V system and performed other tasks pertinent to this report, such as software cost estimation. She currently leads the Common Avionics and Software Technologies (CAST) team that is generalizing the LADEE FSW to other spacecraft missions. Dr. Gundy-Burlet graduated with a B.S. in Mechanical Engineering from U.C. Berkeley and obtained M.S. and Ph.D. degrees in Aerospace Engineering from Stanford.
