Automated fault localization in external C code of Eiffel programs

Reto Ghioldi
Master Thesis

Supervising Professor: Bertrand Meyer
Chair of Software Engineering, ETH Zürich

Supervisors: Ilinca Ciupa, Andreas Leitner

March 2006


Masters of War

Come you masters of war
You that build all the guns
You that build the death planes
You that build the big bombs
You that hide behind walls
You that hide behind desks
I just want you to know
I can see through your masks

You that never done nothin’
But build to destroy
You play with my world
Like it’s your little toy
You put a gun in my hand
And you hide from my eyes
And you turn and run farther
When the fast bullets fly

Like Judas of old
You lie and deceive
A world war can be won
You want me to believe
But I see through your eyes
And I see through your brain
Like I see through the water
That runs down my drain

You fasten the triggers
For the others to fire
Then you set back and watch
When the death count gets higher
You hide in your mansion
As young people’s blood
Flows out of their bodies
And is buried in the mud

You’ve thrown the worst fear
That can ever be hurled
Fear to bring children
Into the world
For threatening my baby
Unborn and unnamed
You ain’t worth the blood
That runs in your veins

How much do I know
To talk out of turn
You might say that I’m young
You might say I’m unlearned
But there’s one thing I know
Though I’m younger than you
Even Jesus would never
Forgive what you do

Let me ask you one question
Is your money that good
Will it buy you forgiveness
Do you think that it could
I think you will find
When your death takes its toll
All the money you made
Will never buy back your soul

And I hope that you die
And your death’ll come soon
I will follow your casket
In the pale afternoon
And I’ll watch while you’re lowered
Down to your deathbed
And I’ll stand o’er your grave
’Til I’m sure that you’re dead

Bob Dylan, 1963


Abstract

AutoTest is a fully automatic testing tool for Eiffel classes and is based on the ideas
of Design by Contract. External code (such as external C/C++ code), however, is usually not equipped with contracts. This means that the manifestation of a bug often occurs far from the actual location of the fault in the software, and there is no indication of the type of error that occurred. In this thesis we evaluate different approaches for gaining additional information about faults in external code and for tracing them back to the Eiffel code. The solution is based on a technique called dynamic binary analysis (DBA). We chose Valgrind’s Memcheck as the tool on which our work is based, designed and implemented a universal framework for automated DBA, and finally built an extension to AutoTest that allows it to use this automated analysis. We successfully detected the real error causes in over 93% of our Eiffel test applications containing different kinds of bugs in external code, and could provide the programmer with useful debugging information.


Acknowledgment

This master thesis was written during the winter of 2005/2006 at the Chair of Software Engineering, ETH Zürich. I am deeply thankful for the open Swiss educational system.

I would like to thank my supervisors, Ilinca Ciupa and Andreas Leitner, for their great support during these months, their comments and helpful suggestions. Further, I would like to thank Prof. Dr. Bertrand Meyer, who made this thesis possible.

I also thank Hans Dubach for his support during my time at ETH Zürich. Without him in the student administration, life would have been much more stressful and complicated.

I will always be grateful to my parents and siblings for their love and for their support of my education right from the very beginning. Thank you Beatrice, Rinaldo, Andrea, Daniela and Stefan.

Last but not least, I would like to thank my girlfriend Sandra Käser for her endless love, support and patience.


Contents

1 Introduction & Overview
  1.1 Testing
    1.1.1 Manual testing
    1.1.2 Automated testing
    1.1.3 Black and white box testing
  1.2 Design by Contract
    1.2.1 AutoTest: The Eiffel approach
    1.2.2 Korat: The Java approach
  1.3 Eiffel compilation process
  1.4 Scope of this work
    1.4.1 Theoretical part
    1.4.2 Practical part
    1.4.3 Test results
    1.4.4 Conclusion & Outlook

2 Related Work
  2.1 Testing Eiffel programs in general
    2.1.1 Test input
    2.1.2 Test results
  2.2 TestStudio
    2.2.1 Overview
    2.2.2 Advantages
    2.2.3 Limitations
  2.3 AutoTest
    2.3.1 Overview
    2.3.2 Advantages
    2.3.3 Limitations

3 Dynamic binary analysis
  3.1 Testing binary code in general
    3.1.1 Problems of testing binary code
    3.1.2 Scope of dynamic binary analysis
  3.2 Solutions and limitations
    3.2.1 Replace memory allocators and de-allocators
    3.2.2 Dynamic binary compilation and instrumentation
  3.3 Available tools
    3.3.1 Valgrind
    3.3.2 DynamoRIO
    3.3.3 DynInst
    3.3.4 Pin
    3.3.5 DIOTA
    3.3.6 Strata
    3.3.7 Commercial and closed source tools
  3.4 A closer look at the Valgrind Framework
    3.4.1 Why Valgrind?
    3.4.2 Glossary
    3.4.3 The basic idea
    3.4.4 Example translation
    3.4.5 Example analysis
    3.4.6 Limitations

4 Enspect
  4.1 Overview
  4.2 Intended results
    4.2.1 Scope
    4.2.2 Architecture
    4.2.3 AutoTest integration
    4.2.4 Use cases
  4.3 Execution model
    4.3.1 Normal execution
    4.3.2 Execution under enspect
  4.4 Core - Plugin concept
    4.4.1 Core
    4.4.2 Deferred Plugin classes
  4.5 Further design aspects
    4.5.1 Structure of a bug situation
    4.5.2 Command line tool
  4.6 Example plugin for AutoTest
    4.6.1 Instruction execution
    4.6.2 Analysis of bugs

5 Results
  5.1 Synthetic test cases
    5.1.1 Bad free
    5.1.2 Bad address value
    5.1.3 Bad jump
    5.1.4 Bad loop
    5.1.5 Bad read/write
    5.1.6 Bad C library calls to brk() and sbrk()
    5.1.7 Double free
    5.1.8 Custom allocators
    5.1.9 File descriptors
    5.1.10 Read/write using pointers
    5.1.11 Leak check
    5.1.12 Too long function match
    5.1.13 Overlapping src and dst
    5.1.14 Allocator errors
    5.1.15 Buffer overflow
    5.1.16 NULL pointers
    5.1.17 Results
  5.2 Real world test case in Eiffel Media

6 Conclusion and Outlook
  6.1 Conclusion
  6.2 Limitations
  6.3 Further work & Outlook

Bibliography

A Source Code
  A.1 enspect: ESP_AUTO_TEST_ANALYZER
  A.2 AutoTest: AUT_C_EIFFEL_LOOKUP_TABLE

B Documentation
  B.1 Requirements
  B.2 Installation notes
  B.3 Usage

C Project Plan
  C.1 Project Description
    C.1.1 Overview
    C.1.2 Scope of the work
    C.1.3 Intended results
  C.2 Background Material
    C.2.1 Reading list
    C.2.2 Tools
  C.3 Project Management
    C.3.1 Objectives and priorities
    C.3.2 Criteria for success
    C.3.3 Method of work
    C.3.4 Quality management
  C.4 Plan with Milestones
    C.4.1 Project steps
    C.4.2 Deadline
    C.4.3 Tentative