Title: Automated Testing of Evolving Software
Name: Hazel Anne Shaw

University of Bedfordshire

This is a digitised version of a dissertation submitted to the University of Bedfordshire. It is available to view only. This item is subject to copyright.

AUTOMATED TESTING OF EVOLVING SOFTWARE

Hazel Anne Shaw

A thesis submitted to the University of Luton, in partial fulfilment of the requirements for the degree of Doctor of Philosophy

October 2005

ABSTRACT: AUTOMATED TESTING OF EVOLVING SOFTWARE

Hazel Anne Shaw

Computers and the software they run are pervasive, yet released software is often unreliable, which has many consequences. Loss of time and earnings can be caused by application software (such as word processors) behaving incorrectly or crashing. Serious disruption can occur, as in the 14th August 2003 blackouts in North East USA and Canada [1], and serious injury or death can be caused, as in the Therac-25 overdose incidents [2].

One way to improve the quality of software is to test it thoroughly. However, software testing is time consuming, the resources, capabilities and skills needed to carry it out are often not available, and the time required is often curtailed because of pressure to meet delivery deadlines [3]. Automation should allow more thorough testing in the time available and improve the quality of delivered software, but there are some problems with automation that this research addresses.

Firstly, it is difficult to determine whether the system under test (SUT) has passed or failed a test. This is known as the oracle problem [4] and is often ignored in software testing research. Secondly, many software development organisations use an iterative and incremental process, known as evolutionary development, to write software. Following release, software continues evolving as customers demand new features and improvements to existing ones [5]. This evolution means that automated test suites must be maintained throughout the life of the software.

A contribution of this research is a methodology that addresses automatic generation of the test cases, execution of the test cases and evaluation of the outcomes from running each test. "Predecessor" software is used to solve the oracle problem. This is software that already exists, such as a previous version of the evolving software, or software from a different vendor that solves the same, or similar, problems. However, the resulting oracle is assumed not to be perfect, so rules are defined in an interface, which are used by the evaluator in the test evaluation stage to handle the expected differences. The interface also specifies functional inputs to and outputs from the SUT. An algorithm has been developed that creates a Markov Chain Transition Matrix (MCTM) model of the SUT from the interface. Tests are then generated automatically by making a random walk of the MCTM. This means that instead of maintaining a large suite of tests, or a large model of the SUT, only the interface needs to be maintained.

1) NERC Steering Group (2004) Technical Analysis of the August 14, 2003, Blackout: What Happened, Why, and What Did We Learn? July 13th 2004. Available from: ftp://www.nerc.com/pub/sys/all_updl/docs/blackout/NERC_Final_Blackout_Report_07_13_04.pdf
2) Leveson, N. G., Turner, C. S. (1993) An investigation of the Therac-25 accidents. IEEE Computer, Vol 26, No 7, Pages 18-41.
3) LogicaCMG (2005) Testing Times for Board Rooms. Available from: http://www.logicacmg.com/pdf/tracked/testingTimesBoardRooms.pdf
4) Bertolino, A. (2003) Software Testing Research and Practice. ASM 2003, Lecture Notes in Computer Science, Vol 2589, Pages 1-21.
5) Sommerville, I. (2004) Software Engineering, 7th Edition. Addison Wesley. ISBN 0-321-21026-3.
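The abstract describes building a Markov Chain Transition Matrix (MCTM) model of the SUT from the interface and generating tests by making a random walk of that matrix. The sketch below is purely illustrative and is not taken from the thesis or its Alltest tool: the state names, transition probabilities and the random_walk helper are all invented for the example, and Python is used only as a convenient notation for the idea.

    # Illustrative sketch only (not the thesis's Alltest implementation):
    # generating one test sequence by a random walk over a hypothetical
    # Markov Chain Transition Matrix. States and probabilities are invented.
    import random

    # States might correspond to units of functionality exposed by the interface.
    STATES = ["start", "open_file", "edit", "save", "close"]

    # Row i gives the probability of moving from STATES[i] to each state; rows sum to 1.
    MCTM = [
        [0.0, 1.0, 0.0, 0.0, 0.0],   # start     -> open_file
        [0.0, 0.0, 0.7, 0.2, 0.1],   # open_file -> edit / save / close
        [0.0, 0.0, 0.5, 0.4, 0.1],   # edit      -> edit / save / close
        [0.0, 0.1, 0.3, 0.0, 0.6],   # save      -> open_file / edit / close
        [0.0, 0.0, 0.0, 0.0, 1.0],   # close     -> close (absorbing terminal state)
    ]

    def random_walk(max_steps=20, seed=None):
        """Return one test case: a sequence of state names chosen by a random walk."""
        rng = random.Random(seed)
        state = 0                          # begin in the "start" state
        sequence = [STATES[state]]
        for _ in range(max_steps):
            # Pick the next state using the current state's row as transition weights.
            state = rng.choices(range(len(STATES)), weights=MCTM[state])[0]
            sequence.append(STATES[state])
            if STATES[state] == "close":   # stop once the terminal state is reached
                break
        return sequence

    if __name__ == "__main__":
        print(random_walk(seed=42))        # e.g. ['start', 'open_file', 'edit', ..., 'close']

In the methodology itself, each state transition would map onto functionality described in the interface; a generated sequence would then be executed against both the SUT and the predecessor oracle, with the rules defined in the interface used by the evaluator to allow for the expected differences between the two.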
TABLE OF CONTENTS

Abstract: Automated Testing of Evolving Software
Table of Contents
List of Tables
List of Figures
Preface
Author's Declaration
List of Abbreviations

Chapter 1 Introduction
  1.1 Commercial Software Testing and Quality
    1.1.1 Software Failures
    1.1.2 Software Development Practice
    1.1.3 Testing Resources
    1.1.4 Completeness of Testing
    1.1.5 Commercial Software Development Case Study
  1.2 Overview of Research
  1.3 Current Knowledge
    1.3.1 Alternative Quality Assurance Techniques
    1.3.2 Test Case Selection
    1.3.3 Evaluating Test Results
    1.3.4 Test Frameworks
  1.4 Implications of Literature Review on Research
    1.4.1 The Oracle Problem
    1.4.2 Test Case Generation using Models
    1.4.3 An Holistic Automated Testing Methodology
    1.4.4 Maintenance
    1.4.5 Automated Software Testing for Industry
    1.4.6 Black Box or White Box?
  1.5 Research Aims
    1.5.1 Criteria for a Successful Test Methodology
    1.5.2 Objectives for the Methodology Developed
    1.5.3 The Research Questions
  1.6 Novel Features of the Research
  1.7 Thesis Structure

Chapter 2 Automated Testing Methodology
  2.1 The Interface
    2.1.1 Unit of Functionality (UOF)
    2.1.2 TCD files
    2.1.3 Rules
    2.1.4 Test configuration
    2.1.5 Creating the Interface
  2.2 The Oracle
    2.2.1 Reverse Engineered Specification
    2.2.2 Software Predecessors
  2.3 The Generation Phase
    2.3.1 Stochastic Test Case Generation
    2.3.2 Markov Chain Transition Matrix
    2.3.3 Calculating the States
  2.4 The Execution Phase
  2.5 The Evaluation Phase
  2.6 Implementation
    2.6.1 Choosing a Language for Test Cases
    2.6.2 Producing the Test Cases
  2.7 Management of the Test Process
    2.7.1 Static and Dynamic Test Management
    2.7.2 Management of the Test Data Repository
    2.7.3 The Bug Tracking Database
  2.8 How to use Alltest

Chapter 3 Evaluation and Discussion
  3.1 Experimental Design
    3.1.1 Manager
    3.1.2 NTFS
  3.2 Checking Alltest
  3.3 Description of the Experiment
  3.4 Experiment and Results
  3.5 Effort
  3.6 Effectiveness
  3.7 Choosing an Oracle
  3.8 Applying the Methodology in Practice
  3.9 Improvements to the Implementation of Alltest
  3.10 Research Aims Reviewed
    3.10.1 Oracle Problem
    3.10.2 Maintainability
    3.10.3 Automation
    3.10.4 Procedures
    3.10.5 Implementation
