Software Analytics As a Learning Case in Practice: Approaches and Experiences


Dongmei Zhang, Yingnong Dang, Jian-Guang Lou, Shi Han, and Haidong Zhang (Microsoft Research Asia, Beijing, China); Tao Xie (North Carolina State University, Raleigh, NC, USA)
{dongmeiz,yidang,jlou,shihan,haizhang}@microsoft.com, [email protected]

ABSTRACT

Software analytics enables software practitioners to perform data exploration and analysis in order to obtain insightful and actionable information for data-driven tasks around software and services. In this position paper, we advocate that when applying analytic technologies in the practice of software analytics, one should (1) incorporate a broad spectrum of domain knowledge and expertise, e.g., management, machine learning, large-scale data processing and computing, and information visualization; and (2) investigate how practitioners take actions on the produced information, and provide effective support for such information-based action taking. Our position is based on our experiences of successful technology transfer on software analytics at Microsoft Research Asia.

Categories and Subject Descriptors
D.2.5 [Software Engineering]: Testing and Debugging—Debugging aids, diagnostics; D.2.9 [Software Engineering]: Management—Productivity, software quality assurance (SQA)

General Terms
Experimentation, Management, Measurement

Keywords
Software analytics, machine learning, technology transfer

1. INTRODUCTION

A huge wealth of data exists in the software development process, and hidden in that data is information about the quality of software and services as well as the dynamics of software development. With various analytic technologies (e.g., data mining, machine learning, and information visualization), software analytics enables software practitioners (1) to perform data exploration and analysis in order to obtain insightful and actionable information for data-driven tasks around software and services (2).

Insightful information conveys meaningful and useful understanding or knowledge towards performing the target task; typically, it is not easily attainable by directly investigating the raw data without the aid of analytic technologies. Actionable information is information upon which software practitioners can come up with concrete solutions (better than existing solutions, if any) towards completing the target task.

Developing a software analytic project typically goes through iterations of a life cycle with four phases: task definition, data preparation, analytic-technology development, and deployment and feedback gathering. Task definition is to define the target task to be assisted by software analytics. Data preparation is to collect the data to be analyzed. Analytic-technology development is to develop the problem formulation, algorithms, and systems needed to explore, understand, and gain insights from the data. Deployment and feedback gathering involves two typical scenarios: either we, as researchers, obtain some insightful information from the data and ask domain experts to review and verify it, or we ask domain experts to use the analytic tools we have developed to obtain insights by themselves. Most of the time, it is the second scenario that we want to enable.

Among various analytic technologies, machine learning is a well-recognized technology for learning hidden patterns or predictive models from data, and it plays an important role in software analytics. In this position paper, we argue that when applying analytic technologies in the practice of software analytics, one should

• incorporate a broad spectrum of domain knowledge and expertise, e.g., management, machine learning, large-scale data processing and computing, and information visualization;
• investigate how practitioners take actions on the produced insightful and actionable information, and provide effective support for such information-based action taking.

Our position is based on a number of software analytic projects conducted at the Software Analytics (SA) group (3) at Microsoft Research Asia (MSRA) in recent years (in the rest of this paper, we refer to members of the software analytic projects at MSRA as the SA project teams). These projects have undergone successful technology transfer within Microsoft, enabling informed decision making and improving the quality of software and services. We expect that our position will provide useful guidelines for other researchers and practitioners in successfully carrying out software analytics research for technology transfer.

Footnotes: (1) Software practitioners typically include software developers, testers, usability engineers, and managers. (2) We coin the term software analytics to expand the scope of previous work [1,6] on analytics for software development and previous work on software intelligence [5]. (3) http://research.microsoft.com/groups/sa/

MALETS '11, November 12, 2011, Lawrence, Kansas, USA. Copyright 2011 ACM 978-1-4503-1022-2/11/11.

2. PROJECT OBJECTIVES

The main objectives of the MSRA software analytic projects are to advance the state of the art in software analytics and to help improve the quality of Microsoft products as well as the productivity of Microsoft product development. The target customers (in short, customers) are primarily practitioners from Microsoft product teams. This emphasis also benefits the broader research and industrial communities, because Microsoft product development is an important representative of modern software development along dimensions such as team scale, software complexity, and software types. Some of the MSRA project outcomes can also eventually reach broader communities through their integration with Microsoft development tools. For example, the code-clone detection tool [2] that resulted from the first example project discussed in Section 4 has been integrated into Microsoft Visual Studio vNext.

3. CHALLENGES

There are two main categories of challenges to overcome in order to achieve the stated objectives. The first category is rooted in the characteristics of the data being analyzed with analytic technologies.

Data scale. Typical data in software analytics is of large scale, e.g., due to the large scale of the software being developed and the large size of software development teams. Some tasks require analyzing division-wide or even company-wide code bases, far beyond the scope of a single code base (e.g., when conducting code-clone detection [2]). Other tasks require analyzing a large quantity of (likely noisy) data samples within or beyond a single code base (e.g., when conducting runtime-trace analysis [3]). Although a lack of data samples may not be an issue in this machine-learning context, the large scale of the data poses challenges for data processing and analysis, including learning-algorithm design and system building.

… when conducting code-clone detection, we should not stop at measuring the precision and recall of detected clones; rather, we should push further so that the detected clones can effectively help address ultimate tasks such as refactoring and defect detection, and we should measure such benefits in evaluations.

Engagement of customers during the development of a software analytic project. It is well recognized that engaging customers is challenging, especially in the context of software engineering tools. Customers may resist proposed changes (due to analytic-tool adoption) to their existing way of carrying out a task. In addition, due to tight development schedules, they may not be able to invest time in understanding the best and worst scenarios for applying an analytic tool. However, developing a software analytic project typically needs the engagement of customers in iterations of the four phases of the project life cycle, e.g., to gain a better understanding of the tasks and domain knowledge. Across the phases, and especially during deployment and feedback gathering, it is crucial for the produced analytic tools to have good usability, e.g., providing effective visualization and manipulation of analysis results.

4. EXPERIENCES

We next discuss our experiences in different phases of developing two example MSRA analytic tools that have been successfully adopted by various Microsoft product teams. XIAO [2] is a tool for detecting code clones in source-code bases, targeting tasks such as refactoring and defect detection. StackMine is a tool for learning performance bottlenecks from call-stack traces collected from real-world usage, targeting tasks such as performance analysis.

4.1 Task Definition

Task definition is to define the target task to be assisted by software analytics. There are two models (or a mixture of the two) for initiating a software analytic project at MSRA: the pull model and the push model.

StackMine primarily follows the pull model. Before the StackMine project was started, during one of the meetings with the SA team, a member of a Microsoft …
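The evaluation point in Section 3 — that clone detection should be judged beyond precision and recall — still presupposes those two baseline metrics. For reference, here is a minimal sketch of computing them over labeled clone pairs; the data and names are invented for illustration and are not tooling from the paper:

```python
# Precision/recall of a clone detector against hand-labeled ground truth.
# Clone pairs are frozensets so that (a, b) and (b, a) count as one pair.

def precision_recall(detected: set, ground_truth: set) -> tuple:
    """Return (precision, recall) for detected clone pairs."""
    true_positives = detected & ground_truth
    precision = len(true_positives) / len(detected) if detected else 0.0
    recall = len(true_positives) / len(ground_truth) if ground_truth else 0.0
    return precision, recall

# Hypothetical example: files A/B and A/C are true clones; the tool reported
# A/B (correct) and B/D (spurious).
truth = {frozenset({"A", "B"}), frozenset({"A", "C"})}
found = {frozenset({"A", "B"}), frozenset({"B", "D"})}
print(precision_recall(found, truth))  # (0.5, 0.5)
```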
Recommended publications
  • Information Needs for Software Development Analytics
    Raymond P.L. Buse (The University of Virginia, USA) and Thomas Zimmermann (Microsoft Research)
    Abstract—Software development is a data rich activity with many sophisticated metrics. Yet engineers often lack the tools and techniques necessary to leverage these potentially powerful information resources toward decision making. In this paper, we present the data and analysis needs of professional software engineers, which we identified among 110 developers and managers in a survey. We asked about their decision-making process, their needs for artifacts and indicators, and scenarios in which they would use analytics. The survey responses lead us to propose several guidelines for analytics tools in software development, including: engineers do not necessarily have much expertise in data analysis, so tools should be easy to use, fast, and produce concise output; and engineers have diverse analysis needs and consider most indicators to be important, so tools should support many different types of artifacts and many indicators at the same time.
    … decision making will likely only become more difficult. Analytics describes the application of analysis, data, and systematic reasoning to make decisions. Analytics is especially useful for helping users move from only answering questions of information like "What happened?" to also answering questions of insight like "How did it happen and why?" Instead of just considering data or metrics directly, one can gather more complete insights by layering different kinds of analyses that allow for summarizing, filtering, modeling, and experimenting, typically with the help of automated tools. The key to applying analytics to a new domain is understanding the link between available data and the information needed to make good decisions, as well as the analyses …
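The entry above frames analytics as layered summarizing and filtering of development data. As a toy illustration only (the records and fields below are invented, not from the survey), a minimal sketch of those two layers:

```python
# Minimal sketch, assuming bug reports as plain dicts (illustrative data):
# "filtering" then "summarizing", two of the analysis layers named above.
from collections import Counter

bugs = [
    {"component": "ui", "severity": "high", "days_open": 12},
    {"component": "ui", "severity": "low", "days_open": 3},
    {"component": "core", "severity": "high", "days_open": 40},
]

high = [b for b in bugs if b["severity"] == "high"]    # filter
per_component = Counter(b["component"] for b in high)  # summarize
mean_days = sum(b["days_open"] for b in high) / len(high)

print(per_component, mean_days)  # Counter({'ui': 1, 'core': 1}) 26.0
```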
  • Software Analytics for Incident Management of Online Services: An Experience Report
    Jian-Guang Lou, Qingwei Lin, Rui Ding, Qiang Fu, and Dongmei Zhang (Microsoft Research Asia, Beijing, P. R. China); Tao Xie (University of Illinois at Urbana-Champaign, Urbana, IL, USA)
    {jlou, qlin, juding, qifu, dongmeiz}@microsoft.com
    Abstract—As online services become more and more popular, incident management has become a critical task that aims to minimize service downtime and to ensure high quality of the provided services. In practice, incident management is conducted through analyzing a huge amount of monitoring data collected at the runtime of a service. Such data-driven incident management faces several significant challenges, such as the large data scale, complex problem space, and incomplete knowledge. To address these challenges, we carried out two years of software-analytics research in which we designed a set of novel data-driven techniques and developed an industrial system called the Service Analysis Studio (SAS), targeting real scenarios in a large-scale online service of Microsoft.
    … incident management needs to be efficient and effective in order to ensure high availability and reliability of the services. A typical procedure of incident management in practice (e.g., at Microsoft and other service-provider companies) goes as follows. When the service monitoring system detects a service violation, the system automatically sends out an alert and makes a phone call to a set of On-Call Engineers (OCEs) to trigger the investigation of the incident in order to restore the service as soon as possible. Given an incident, OCEs need to understand what the problem is and how to resolve it. In ideal cases, OCEs can identify the root cause of the incident and fix …
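The detect-then-page procedure this entry describes is easy to picture in code. The following is a deliberately simplified sketch; the threshold, metric, and notification function are assumptions for illustration, not details of Microsoft's monitoring systems or of SAS:

```python
# Simplified sketch of the alerting step: when a monitored error rate
# violates the (assumed) service-level threshold, page the OCEs.
ERROR_RATE_SLO = 0.05  # hypothetical service-level objective

def notify_oces(message: str, on_call_engineers: list) -> None:
    # Stand-in for the alert/phone-call system the paper describes.
    for oce in on_call_engineers:
        print(f"ALERT -> {oce}: {message}")

def check_window(requests: int, errors: int) -> None:
    rate = errors / requests if requests else 0.0
    if rate > ERROR_RATE_SLO:
        notify_oces(f"error rate {rate:.1%} exceeds SLO", ["oce-1", "oce-2"])

check_window(requests=10_000, errors=800)  # 8.0% -> triggers an alert
```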
  • Patterns for Implementing Software Analytics in Development Teams
    Joelma Choma (National Institute for Space Research - INPE), Eduardo Martins Guerra (National Institute for Space Research - INPE), Tiago Silva da Silva (Federal University of São Paulo - UNIFESP)
    Software development activities typically produce a large amount of data. Using a data-driven approach to decision making, such as software analytics, software practitioners can achieve higher development-process productivity and improve many aspects of software quality based on insightful and actionable information. This paper presents a set of patterns describing steps to encourage the integration of analytics activities into development teams, supporting them in making better decisions and promoting the continuous improvement of software and its development process. Before any procedure to extract data for software analytics, the team first needs to define its questions about what will need to be measured, assessed, and monitored throughout the development process. Once the key issues to be tracked are defined, the team can select the most appropriate means of extracting data from software artifacts that will be useful in decision making. The tasks to set up the development environment for software analytics should be added to the project planning along with the regular tasks. The software analytics activities should be distributed throughout the project in order to add information about the system in small portions. By defining reachable goals from the software analytics findings, the team turns insights from software analytics into actions that incrementally improve the software's characteristics and/or its development process.
    Categories and Subject Descriptors: D.2.8 [Software and its engineering]: Software creation and management—Metrics
    General Terms: Software Analytics
    Additional Key Words and Phrases: Software Analytics, Decision Making, Agile Software Development, Patterns, Software Measurement, Development Teams
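The first pattern above — define the questions before extracting any data — can be captured as a lightweight written plan. A loose sketch under assumed questions and data sources (none of them come from the paper):

```python
# Hypothetical question-to-metric plan, written down before any extraction,
# in the spirit of the first pattern described above.
analytics_plan = {
    "Are code reviews keeping up with change volume?": {
        "metrics": ["review_latency_days", "unreviewed_change_ratio"],
        "sources": ["version control", "code-review tool"],
    },
    "Which modules accumulate the most defects?": {
        "metrics": ["defects_per_module"],
        "sources": ["issue tracker"],
    },
}

for question, plan in analytics_plan.items():
    print(f"{question} -> {', '.join(plan['metrics'])}")
```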
  • Evaluating and Improving Software Quality Using Text Analysis Techniques - A Mapping Study
    Faiz Shah and Dietmar Pfahl (Institute of Computer Science, University of Tartu, J. Liivi 2, Tartu 50490, Estonia), {shah, dietmar.pfahl}@ut.ee
    Abstract: Improvement and evaluation of software quality is a recurring and crucial activity in the software development life-cycle. During software development, software artifacts such as requirement documents, comments in source code, design documents, and change requests are created containing natural language text. For analyzing natural text, specialized text analysis techniques are available. However, a consolidated body of knowledge about research using text analysis techniques to improve and evaluate software quality still needs to be established. To contribute to the establishment of such a body of knowledge, we aimed at extracting relevant information from the scientific literature about data sources, research contributions, and the usage of text analysis techniques for the improvement and evaluation of software quality. We conducted a mapping study by performing the following steps: define research questions, prepare a search string and find relevant literature, apply a screening process, classify, and extract data. Bug classification and bug severity assignment activities within defect management are frequently improved using the text analysis techniques of classification and concept extraction. The non-functional requirements elicitation activity of requirements engineering is mostly improved using the technique of concept extraction. The quality characteristic most frequently evaluated for the product quality model is operability. The most commonly used data sources are bug reports, requirement documents, and software reviews. The dominant types of research contributions are solution proposals and validation research.
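Bug classification and severity assignment, the activities this study finds most often improved by text analysis, reduce to standard text classification. A minimal sketch using scikit-learn (a common choice; the study does not prescribe a library, and the training data below is a toy stand-in):

```python
# Toy bug-severity classifier: TF-IDF features plus logistic regression.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

reports = [
    "app crashes on startup with null pointer",
    "data loss after saving the project",
    "typo in the settings dialog label",
    "button slightly misaligned on small screens",
]
severity = ["high", "high", "low", "low"]

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(reports, severity)
print(model.predict(["crash when opening corrupted file"]))  # likely 'high'
```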
  • Software Analytics in Practice
    FOCUS: THE MANY FACES OF SOFTWARE ANALYTICS
    Dongmei Zhang, Shi Han, Yingnong Dang, Jian-Guang Lou, and Haidong Zhang, Microsoft Research Asia; Tao Xie, University of Illinois at Urbana-Champaign
    // The StackMine project produced a software analytics system for Microsoft product teams. The project provided several lessons on applying software analytics technologies to make an impact on software development practice. //
    The software development process produces various types of data such as source code, bug reports, check-in histories, and test cases. Over the past … software domain, we formed Microsoft Research Asia's Software Analytics Group (http://research.microsoft.com/groups/sa) in 2009. … Software practitioners could conduct software analytics by themselves; researchers from academia or industry could collaborate with software practitioners from software companies or open source communities and transfer their analytics technologies to real-world use; or practitioners could organically adopt analytics technologies developed by researchers. However, no matter how this is pursued, it remains a huge challenge. We've worked on a number of software analytics projects that have had a high impact on software development practice [7–10]. To illustrate the lessons we've learned from them, we describe our experiences developing StackMine, a scalable system for postmortem performance debugging [7].
    The Promise and Challenges of Software Analytics. As we mentioned before, software analytics aims to obtain insightful and actionable information. Insightful information conveys knowledge that's meaningful and useful for practitioners performing a specific task. Actionable information is information with which practitioners can devise concrete ways …
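StackMine, as described above, mines performance bottlenecks from large volumes of call-stack traces. The sketch below is not StackMine's algorithm (which mines costly patterns at scale); it only shows the simplest form of the underlying idea, ranking frames by how many sampled stacks contain them:

```python
# Rank stack frames by the fraction of samples they appear in -- a crude
# first cut at spotting bottleneck candidates in call-stack traces.
from collections import Counter

stacks = [  # hypothetical sampled call stacks, outermost frame first
    ["main", "render", "layout", "measure_text"],
    ["main", "render", "layout", "measure_text"],
    ["main", "io_wait"],
    ["main", "render", "paint"],
]

frame_hits = Counter(frame for stack in stacks for frame in set(stack))
for frame, hits in frame_hits.most_common(3):
    print(f"{frame}: in {hits}/{len(stacks)} samples")
```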
  • Software Analytics to Software Practice: a Systematic Literature Review
    2015 IEEE/ACM 1st International Workshop on Big Data Software Engineering
    Tamer Mohamed Abdellatif, Luiz Fernando Capretz, and Danny Ho (Department of Electrical & Computer Engineering, Western University, London, Ontario, Canada)
    Abstract—Software Analytics (SA) is a new branch of big data analytics that has recently emerged (2011). What distinguishes SA from direct software analysis is that it links data mined from many different software artifacts to obtain valuable insights. These insights are useful for the decision-making process throughout the different phases of the software lifecycle. Since SA is currently a hot and promising topic, we have conducted a systematic literature review, presented in this paper, to identify gaps in knowledge and open research areas in SA. Because many researchers are still confused about the true potential of SA, we had to filter the available research papers to obtain the most SA-relevant work for our review. This filtration yielded 19 studies out of 135.
    … suitable and supportive insights and facts to software industry stakeholders to make their decision making easier, faster, more precise, and more confident. The main difference between SA and direct software analysis is that rather than just providing straightforward insight extraction, SA performs additional advanced steps. As clarified by Hassan [1], SA must provide visualization and useful interpretation of insights in order to facilitate decision making. This paper is organized as follows: in Section II, we illustrate our review methodology; our review results are presented in Section III; Section IV presents the limitations of …
  • How to Catch 'Em All: WatchDog, a Family of IDE Plug-Ins to Assess Testing
    Moritz Beller,* Igor Levaja,* Annibale Panichella,* Georgios Gousios,# and Andy Zaidman* (*Delft University of Technology, The Netherlands; #Radboud University Nijmegen, The Netherlands)
    {m.m.beller, a.panichella, a.e.zaidman}@tudelft.nl
    ABSTRACT: As software engineering researchers, we are also zealous tool smiths. Building a research prototype is often a daunting task, let alone building an industry-grade family of tools supporting multiple platforms to ensure the generalizability of our results. In this paper, we give advice to academic and industrial tool smiths on how to design and build an easy-to-maintain architecture capable of supporting multiple integrated development environments (IDEs). Our experiences stem from WatchDog, a multi-IDE infrastructure that assesses developer testing activities in vivo and that over 2,000 registered developers use. To these software engineering practitioners, WatchDog provides real-time and aggregated feedback in the form …
    … individual, users of WatchDog benefit from immediate testing and development analytics, a feature that neither Eclipse nor IntelliJ supports out-of-the-box. After the introduction of the testing reports, even accomplished software engineers were surprised by their own recorded testing behavior, as a reaction from a software quality consultant and long-term WatchDog user exemplifies: "Estimated time working on tests: 20%, actual time: 6%. Cool statistics!" The key contributions of this paper are: 1) the introduction of WatchDog for IntelliJ, an instantiation of WatchDog's new multi-IDE framework; and 2) the description of a set of experiences and suggestions for crafting a multi-IDE plug-in infrastructure.
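The "estimated 20%, actual 6%" reaction above comes from exactly the kind of statistic WatchDog aggregates from IDE events. A minimal sketch of that computation, with an invented event format (WatchDog's real data model is richer):

```python
# Fraction of recorded IDE time spent on testing, from a toy session log.
events = [  # (activity, seconds) pairs -- hypothetical format
    ("edit_production", 1800),
    ("edit_test", 200),
    ("run_tests", 60),
    ("debug", 400),
]

total = sum(sec for _, sec in events)
testing = sum(sec for kind, sec in events if kind in ("edit_test", "run_tests"))
print(f"testing share: {testing / total:.0%}")  # ~11% of recorded time
```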
  • Software Runtime Analytics for Developers: Extending Developers’ Mental Models by Runtime Dimensions
    Jürgen Cito. Software Runtime Analytics for Developers: Extending Developers' Mental Models by Runtime Dimensions. Dissertation (published version), University of Zurich, Faculty of Economics, Department of Informatics. Approved in February 2018 at the request of Prof. Dr. Harald C. Gall (University of Zurich, Switzerland), Dr. Philipp Leitner (Chalmers University of Technology, Sweden), and Prof. Dr. Nenad Medvidovic (University of Southern California, USA).
    Posted at the Zurich Open Repository and Archive (ZORA), University of Zurich. ZORA URL: https://doi.org/10.5167/uzh-152966
    Abstract: Software systems have become instrumental in almost every aspect of modern society. The reliability and performance of these systems play a crucial role in the everyday lives of their users. The performance of a software system is governed by its program code, executed in an environment …
  • Practical Experiences and Value of Applying Software Analytics to Manage Quality
    Anna Maria Vollmer (Fraunhofer IESE, Kaiserslautern, Germany), Silverio Martínez-Fernández (Fraunhofer IESE, Kaiserslautern, Germany), Alessandra Bagnato (Softeam, Paris, France), Jari Partanen (Bittium, Oulu, Finland), Lidia López (UPC-BarcelonaTech, Barcelona, Spain), and Pilar Rodríguez (University of Oulu, Oulu, Finland; pilar.rodriguez@oulu.fi)
    Abstract—Background: Despite the growth in the use of software analytics platforms in industry, little empirical evidence is available about the challenges that practitioners face and the value that these platforms provide. Aim: The goal of this research is to explore the benefits of using a software analytics platform for practitioners managing quality. Method: In a technology transfer project, a software analytics platform was incrementally developed between academic and industrial partners to address their software quality problems. This paper focuses on exploring the value provided by this software analytics platform in two pilot projects. Results: Practitioners emphasized major benefits including the improvement of product quality and process performance and an increased awareness of product readiness.
    … more actors via one or more media so that the technology recipient sustainably adopts the object in the recipient's context in order to evidently achieve a specific purpose" [8]. We report on the industrial scenario and describe our experiences of the industry-academia journey in two pilot projects that used a software analytics platform to solve their quality management problems. Our previous work focused on incrementally developing a software analytics platform in a previous formative stage [9]. This study reports our experiences with the software analytics platform Q-Rapids (hereafter referred to as …
  • Software Analytics: Opportunities and Challenges
    Olga Baysal (School of Computer Science, Carleton University; olgabaysal.com, @olgabaysal) and Latifa Guerrouj (Département de GL et des TI, École de Technologie Supérieure; latifaguerrouj.ca, @Latifa_Guerrouj)
    Slide deck. Recoverable outline: development (big) data; mining software repositories as the detection and analysis of hidden patterns and trends across artifacts (source code, tests, build/config, version control, usage data, docs, issues, mailing lists, crash repositories, field logs, runtime data); the problem that development artifacts and data are not, by themselves, actionable; how decisions drive development (How do I fix this bug? Are we ready to release? How effective is our test suite? When do we release?) for developers, release engineers, managers, and QA, and are often based on intuition or experience; analytics as the solution, as practiced in industry; the benefits of software analytics; and the software artifacts amenable to analytics (source code, execution traces, development history, bug reports, code reviews, developer activities, software forums, software microblogs). Data-driven decision making yields fact-based views of projects.
  • Mastering Software Testing with JUnit 5: Comprehensive Guide to Develop High Quality Java Applications
    Mastering Software Testing with JUnit 5: Comprehensive guide to develop high quality Java applications. Boni García. Birmingham - Mumbai.
    Copyright © 2017 Packt Publishing. All rights reserved. No part of this book may be reproduced, stored in a retrieval system, or transmitted in any form or by any means, without the prior written permission of the publisher, except in the case of brief quotations embedded in critical articles or reviews. Every effort has been made in the preparation of this book to ensure the accuracy of the information presented. However, the information contained in this book is sold without warranty, either express or implied. Neither the author, nor Packt Publishing, nor its dealers and distributors will be held liable for any damages caused or alleged to be caused directly or indirectly by this book. Packt Publishing has endeavored to provide trademark information about all of the companies and products mentioned in this book by the appropriate use of capitals. However, Packt Publishing cannot guarantee the accuracy of this information.
    First published: October 2017. Production reference: 1231017. Published by Packt Publishing Ltd., Livery Place, 35 Livery Street, Birmingham B3 2PB, UK. ISBN 978-1-78728-573-6. www.packtpub.com
    Credits — Author: Boni García; Reviewers: Luis Fernández Muñoz, Ashok Kumar S; Commissioning Editor: Smeet Thakkar; Acquisition Editor: Nigel Fernandes; Content Development Editor: Mohammed Yusuf Imaratwale; Technical Editor: Ralph Rosario; Copy Editor: Charlotte Carneiro; Project Coordinator: Ritika Manoj; Proofreader: Safis Editing; Indexer: Aishwarya Gangawane; Graphics: Jason Monteiro; Production Coordinator: Shraddha Falebhai.
    About the Author: Boni García received a PhD in information and communications technology from the Technical University of Madrid (UPM), Spain, in 2011.
  • Developer Testing in the IDE: Patterns, Beliefs, and Behavior
    IEEE TRANSACTIONS ON SOFTWARE ENGINEERING, VOL. 14, NO. 8, AUGUST 2015
    Moritz Beller, Georgios Gousios, Annibale Panichella, Sebastian Proksch, Sven Amann, and Andy Zaidman, Members, IEEE
    Abstract—Software testing is one of the key activities to achieve software quality in practice. Despite its importance, however, we have a remarkable lack of knowledge on how developers test in real-world projects. In this paper, we report on a large-scale field study with 2,443 software engineers whose development activities we closely monitored over 2.5 years in four integrated development environments (IDEs). Our findings, which largely generalized across the studied IDEs and the programming languages Java and C#, question several commonly shared assumptions and beliefs about developer testing: half of the developers in our study do not test; developers rarely run their tests in the IDE; most programming sessions end without any test execution; only once they start testing do developers do it extensively; a quarter of test cases is responsible for three quarters of all test failures; 12% of tests show flaky behavior; Test-Driven Development (TDD) is not widely practiced; and software developers only spend a quarter of their time engineering tests, whereas they think they test half of their time. We compile these practices of loosely guiding one's development efforts with the help of testing in an initial summary on Test-Guided Development (TGD), a behavior we argue to be closer to the development reality of most developers than TDD.
    Index Terms—Developer Testing, Unit Tests, Testing Effort, Field Study, Test-Driven Development (TDD), JUnit, TestRoots WatchDog, KaVE FeedBag++.