Software Size Measurement: a Framework for Counting Source Statements
Technical Report
CMU/SEI-92-TR-020
ESC-TR-92-020
September 1992

Software Size Measurement: A Framework for Counting Source Statements

Robert E. Park
with the Size Subgroup of the Software Metrics Definition Working Group
and the Software Process Measurement Project Team

Unlimited distribution subject to the copyright.

Software Engineering Institute
Carnegie Mellon University
Pittsburgh, Pennsylvania 15213

This report was prepared for the
SEI Joint Program Office
HQ ESC/AXS
5 Eglin Street
Hanscom AFB, MA 01731-2116

The ideas and findings in this report should not be construed as an official DoD position. It is published in the interest of scientific and technical information exchange.

FOR THE COMMANDER
(signature on file)
Thomas R. Miller, Lt Col, USAF
SEI Joint Program Office

This work is sponsored by the U.S. Department of Defense.

Copyright © 1996 by Carnegie Mellon University.

Permission to reproduce this document and to prepare derivative works from this document for internal use is granted, provided the copyright and "No Warranty" statements are included with all reproductions and derivative works. Requests for permission to reproduce this document or to prepare derivative works of this document for external and commercial use should be addressed to the SEI Licensing Agent.

NO WARRANTY

THIS CARNEGIE MELLON UNIVERSITY AND SOFTWARE ENGINEERING INSTITUTE MATERIAL IS FURNISHED ON AN "AS-IS" BASIS.
CARNEGIE MELLON UNIVERSITY MAKES NO WARRANTIES OF ANY KIND, EITHER EXPRESSED OR IMPLIED, AS TO ANY MATTER INCLUDING, BUT NOT LIMITED TO, WARRANTY OF FITNESS FOR PURPOSE OR MERCHANTABILITY, EXCLUSIVITY, OR RESULTS OBTAINED FROM USE OF THE MATERIAL. CARNEGIE MELLON UNIVERSITY DOES NOT MAKE ANY WARRANTY OF ANY KIND WITH RESPECT TO FREEDOM FROM PATENT, TRADEMARK, OR COPYRIGHT INFRINGEMENT.

This work was created in the performance of Federal Government Contract Number F19628-95-C-0003 with Carnegie Mellon University for the operation of the Software Engineering Institute, a federally funded research and development center. The Government of the United States has a royalty-free government-purpose license to use, duplicate, or disclose the work, in whole or in part and in any manner, and to have or permit others to do so, for government purposes pursuant to the copyright license under the clause at 52.227-7013.

This document is available through Research Access, Inc., 800 Vinial Street, Pittsburgh, PA 15212. Phone: 1-800-685-6510. FAX: (412) 321-2994. RAI also maintains a World Wide Web home page. The URL is http://www.rai.com

Copies of this document are available through the National Technical Information Service (NTIS). For information on ordering, please contact NTIS directly: National Technical Information Service, U.S. Department of Commerce, Springfield, VA 22161. Phone: (703) 487-4600.

This document is also available through the Defense Technical Information Center (DTIC). DTIC provides access to and transfer of scientific and technical information for DoD personnel, DoD contractors and potential contractors, and other U.S. Government agency personnel and their contractors. To obtain a copy, please contact DTIC directly: Defense Technical Information Center / 8725 John J. Kingman Road / Suite 0944 / Ft. Belvoir, VA 22060-6218. Phone: (703) 767-8222 or 1-800-225-3842.
Use of any trademarks in this report is not intended in any way to infringe on the rights of the trademark holder.

Table of Contents

List of Figures  v
Preface  ix
Acknowledgments  xi
1. Introduction  1
   1.1. Scope  1
   1.2. Objective and Audience  2
   1.3. Relationship to Other Work  2
   1.4. Endorsement of Specific Measures  3
2. A Framework for Size Definition  5
   2.1. Definition Checklists  7
   2.2. Supplemental Rules Forms  9
   2.3. Recording and Reporting Forms  10
3. A Checklist for Defining Source Statement Counts  13
   3.1. Describing Statement Counts  13
   3.2. Modifying the Checklist to Meet Special Needs  21
   3.3. Assigning Rules for Counting and Classifying Statement Types  21
4. Size Attributes and Values: Issues to Consider When Creating Definitions  23
   4.1. Statement Type  23
      4.1.1. Executable statements  24
      4.1.2. Declarations  24
      4.1.3. Compiler directives  26
      4.1.4. Comments  27
      4.1.5. Blank lines  28
      4.1.6. Other statement types  28
      4.1.7. Order of precedence  28
      4.1.8. General guidance  29
   4.2. How Produced  29
      4.2.1. Programmed  30
      4.2.2. Generated with source code generators  30
      4.2.3. Converted with automated translators  30
      4.2.5. Modified  32
      4.2.6. Removed  34
   4.3. Origin  34
      4.3.1. New work: no prior existence  35
      4.3.2. Prior work  35
   4.4. Usage  37
      4.4.1. In or as part of the primary product  38
      4.4.2. External to or in support of the primary product  38
   4.5. Delivery  39
      4.5.1. Delivered as source  39
      4.5.2. Delivered in compiled or executable form, but not as source  40
      4.5.3. Not delivered  41
   4.6. Functionality  42
      4.6.1. Operative statements  42
      4.6.2. Inoperative statements (dead, bypassed, unused, unreferenced, or unaccessed)  42
   4.7. Replications  45
      4.7.1. Master source statements (originals)  45
      4.7.2. Physical replicates of master statements, stored in the master code  46
      4.7.3. Copies inserted, instantiated, or expanded when compiling or linking  46
      4.7.4. Postproduction replicates  47
   4.8. Development Status  48
   4.9. Language  49
   4.10. Clarifications (general)  50
      4.10.1. Nulls, continues, and no-ops  51
      4.10.2. Empty statements  51
      4.10.3. Statements that instantiate generics  52
      4.10.4. Begin…end and {…} pairs used as executable statements  52
      4.10.5. Begin…end and {…} pairs that delimit (sub)program bodies  52
      4.10.6. Logical expressions used as test conditions  53
      4.10.7. Expression evaluations used as subprogram arguments  53
      4.10.8. End symbols that terminate executable statements  53
      4.10.9. End symbols that terminate declarations or (sub)program bodies  54
      4.10.10. Then, else, and otherwise symbols  54
      4.10.11. Elseif and elsif statements  54
      4.10.12. Keywords like procedure division, interface, and implementation  54
      4.10.13. Labels (branching destinations) on lines by themselves  54
   4.11. Clarifications for Specific Languages  55
      4.11.1. Stylistic use of braces  55
      4.11.2. Expression statements  56
      4.11.3. Format statements  56
5. Using the Checklist to Define Size Measures—Nine Examples  59
   5.1. Physical Source Lines—A Basic Definition  60
   5.2. Logical Source Statements—A Basic Definition  66
   5.3. Data Specifications—Getting Data for Special Purposes  72
      5.3.1. Specifying data for project tracking  72
      5.3.2. Specifying data for project analysis  78
      5.3.3. Specifying data for reuse measurement  82
      5.3.4. Combining data specifications  85
      5.3.5. Counting dead code  88
      5.3.6. Relaxing the rule of mutual exclusivity  88
      5.3.7. Asking for lower level details—subsets of attribute values  89
   5.4. Other Definitions and Data Specifications  90
   5.5. Other Measures  91
6. Defining the Counting Unit  93
   6.1. Statement Delimiters  93
   6.2. Logical Source Statements  93
   6.3. Physical Source Lines  97
   6.4. Physical and Logical Measures are Different Measures  99
   6.5. Conclusion and Recommendation  101
7. Reporting Other Rules and Practices—The Supplemental Forms  103
   7.1. Physical Source Lines  104
   7.2. Logical Source Statements  105
   7.3. Dead Code  106
8. Recording Measurement Results  107
   8.1. Example 1—Development Planning and Tracking  108
   8.2. Example 2—Planning and Estimating Future Products  109
   8.3. Example 3—Reuse Tracking, Productivity Analysis, and Quality Normalization  110
   8.4. Example 4—Support for Alternative Definitions and Recording Needs  112
9. Reporting and Requesting Measurement Results  115
   9.1. Summarizing Totals for Individual Attributes  115
   9.2. Summarizing Totals for Data Spec A (Project Tracking)  119
   9.3. A Structured Process for Requesting Summary Reports  120
   9.4. Requesting Arrayed Data  120
   9.5. Requesting Individual (Marginal) Totals  126
10. Meeting the Needs of Different Users  129
11. Applying Size Definitions to Builds, Versions, and Releases  133
12. Recommendations for Initiating Size Measurement  135
   12.1. Start with Physical Source Lines  135
   12.2. Use a Checklist-Based Process to Define, Record, and Report Your Measurement Results  137
   12.3. Report Separate Results for Questionable Elements  138
   12.4. Look for Opportunities to Add Other Measures  138
   12.5. Use the Same Definitions for Estimates that You Use for Measurement Reports  139
   12.6. Dealing with Controversy  139
   12.7. From Definition to Action—A Concluding Note  140
References  141
Appendix A: Acronyms and Terms  143
   A.1. Acronyms  143
   A.2. Terms Used  144
Appendix B: Candidate Size Measures  147
Appendix C: Selecting Measures for Definition—Narrowing the Field  153
   C.1. The Choices Made  153
   C.2. The Path Followed  155
Appendix D: Using Size Measures—Illustrations and Examples  157
   D.1.