Software Testing Tools: Analyses of Effectiveness on Procedural and Object-Oriented Source Code

Calhoun: The NPS Institutional Archive
Theses and Dissertations, Thesis Collection
2001-09

Software testing tools: analyses of effectiveness on procedural and object-oriented source code
Snyder, Byron B.
Monterey, California. Naval Postgraduate School
http://hdl.handle.net/10945/1938

NAVAL POSTGRADUATE SCHOOL
Monterey, California

THESIS

SOFTWARE TESTING TOOLS: METRICS FOR MEASUREMENT OF EFFECTIVENESS ON PROCEDURAL AND OBJECT-ORIENTED SOURCE CODE

by
Bernard J. Bossuyt
Byron B. Snyder
September 2001

Thesis Advisor: J. Bret Michael
Second Reader: Richard H. Riehle

Approved for public release; distribution is unlimited.

REPORT DOCUMENTATION PAGE (Form Approved, OMB No. 0704-0188)

1. Agency Use Only: (Leave blank)
2. Report Date: September 2001
3. Report Type and Dates Covered: Master's Thesis
4. Title and Subtitle: Software Testing Tools: Analyses of Effectiveness on Procedural and Object-Oriented Source Code
5. Funding Numbers: (blank)
6. Author(s): Bernard J. Bossuyt and Byron B. Snyder
7. Performing Organization Name(s) and Address(es): Naval Postgraduate School, Monterey, CA 93943-5000
8. Performing Organization Report Number: (blank)
9. Sponsoring/Monitoring Agency Name(s) and Address(es): N/A
10. Sponsoring/Monitoring Agency Report Number: (blank)
11. Supplementary Notes: The views expressed in this thesis are those of the authors and do not reflect the official policy or position of the Department of Defense or the U.S. Government.
12a. Distribution/Availability Statement: Approved for public release; distribution is unlimited.
12b. Distribution Code: (blank)
13. Abstract (maximum 200 words): See Abstract, below.
14. Subject Terms: software testing tool metrics, procedural, object-oriented, software testing tools, metrics, testing tool evaluation, testing tool selection
15. Number of Pages: 209
16. Price Code: (blank)
17. Security Classification of Report: Unclassified
18. Security Classification of This Page: Unclassified
19. Security Classification of Abstract: Unclassified
20. Limitation of Abstract: UL

NSN 7540-01-280-5500. Standard Form 298 (Rev. 2-89), prescribed by ANSI Std. 239-18.

ABSTRACT

The levels of quality, maintainability, testability, and stability of software can be improved and measured through the use of automated testing tools throughout the software development process. Automated testing tools assist software engineers in gauging the quality of software by automating the mechanical aspects of the software-testing task. Automated testing tools vary in their underlying approach, quality, and ease of use, among other characteristics. Evaluating available tools and selecting the most appropriate suite of tools can be a difficult and time-consuming process. In this thesis, we propose a suite of objective metrics for measuring tool characteristics, as an aid in systematically evaluating and selecting automated testing tools. Future work includes further research into the validity and utility of this suite of metrics, conducting similar research using a larger software project, and incorporating a larger set of tools into similar research.

TABLE OF CONTENTS

I. INTRODUCTION
   A. PROBLEM STATEMENT
   B. RESEARCH ISSUES
      1. Identifying Metrics
      2. Testing of Procedural versus Object-Oriented Source Code
      3. Evaluating Tools
   C. CASE STUDY: CSMA/CD LAN DISCRETE-EVENT SIMULATION PROGRAM
II. RELATED WORK
   A. IEEE STANDARD 1175 WORKING GROUP'S TOOL-EVALUATION SYSTEM
      1. Analyzing User Needs
      2. Establishing Selection Criteria
      3. Tool Search
      4. Tool Selection
      5. Reevaluation
      6. Summary
   B. INSTITUTE FOR DEFENSE ANALYSES REPORTS
   C. SOFTWARE TECHNOLOGY SUPPORT CENTER'S SOFTWARE TEST TECHNOLOGIES REPORT
III. METHODOLOGY
   A. TOOL SEARCH
      1. BoundsChecker
         a. Summary
         b. Features
      2. C-Cover
         a. Summary
         b. Features
      3. CTC++ (Test Coverage Analyzer for C/C++)
         a. Summary
         b. Features
      4. Cantata++
         a. Summary
         b. Features
      5. ObjectChecker/Object Coverage/ObjectDetail
         a. Summary
         b. Features
      6. Panorama C/C++
         a. Summary
         b. Features
      7. TCAT C/C++
         a. Summary
         b. Features
   B. TOOLS SELECTED FOR EVALUATION
      1. LDRA TESTBED
         a. Summary
         b. Static Analysis Features
         c. Dynamic Analysis Features
      2. Parasoft Testing Products
         a. Summary
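The excerpt ends with the table of contents. As context for the thesis's proposal of objective tool-selection metrics (see the abstract above and the IEEE 1175 selection-criteria and tool-selection material in Chapter II), the following is a minimal, hypothetical sketch of criteria-weighted scoring, one common way such evaluations are made objective. The criteria names, weights, tool names, and ratings are invented for illustration; they are not the metric suite the thesis defines.

    // Hypothetical sketch: criteria-weighted scoring for comparing testing tools.
    // All criteria, weights, tools, and ratings below are invented for
    // illustration; they are not the metrics defined in the thesis.
    #include <algorithm>
    #include <cstddef>
    #include <iostream>
    #include <string>
    #include <vector>

    struct Criterion {
        std::string name;
        double weight;  // relative importance; weights sum to 1.0
    };

    struct Candidate {
        std::string tool;
        std::vector<double> ratings;  // one rating per criterion, on a 0-10 scale
    };

    // Weighted sum over all criteria: score = sum_i (weight_i * rating_i).
    double score(const Candidate& c, const std::vector<Criterion>& criteria) {
        double total = 0.0;
        for (std::size_t i = 0; i < criteria.size(); ++i)
            total += criteria[i].weight * c.ratings[i];
        return total;
    }

    int main() {
        const std::vector<Criterion> criteria = {
            {"coverage reporting", 0.4},
            {"ease of use",        0.3},
            {"OO-code support",    0.3},
        };
        std::vector<Candidate> candidates = {
            {"Tool A", {8.0, 6.0, 9.0}},
            {"Tool B", {7.0, 9.0, 5.0}},
        };
        // Rank candidates by descending weighted score.
        std::sort(candidates.begin(), candidates.end(),
                  [&](const Candidate& a, const Candidate& b) {
                      return score(a, criteria) > score(b, criteria);
                  });
        for (const auto& c : candidates)
            std::cout << c.tool << ": " << score(c, criteria) << '\n';
        return 0;
    }

With these made-up numbers, Tool A scores 0.4*8.0 + 0.3*6.0 + 0.3*9.0 = 7.7 and Tool B scores 7.0, so Tool A ranks first. In a real evaluation the weights would come from a user-needs analysis such as the one described in Chapter II, not from the evaluator's guess.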
