QUALITY ASSURANCE
Michael Weintraub, Fall 2015

Unit Objective
• Understand what quality assurance means
• Understand QA models and processes

Definitions According to NASA
• Software Assurance: The planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures.
• Software Quality: The discipline of software quality is a planned and systematic set of activities to ensure quality is built into the software. It consists of software quality assurance, software quality control, and software quality engineering. As an attribute, software quality is (1) the degree to which a system, component, or process meets specified requirements, and (2) the degree to which a system, component, or process meets customer or user needs or expectations [IEEE 610.12, IEEE Standard Glossary of Software Engineering Terminology].
• Software Quality Assurance: The function of software quality that assures that the standards, processes, and procedures are appropriate for the project and are correctly implemented.
• Software Quality Control: The function of software quality that checks that the project follows its standards, processes, and procedures, and that the project produces the required internal and external (deliverable) products.
• Software Quality Engineering: The function of software quality that assures that quality is built into the software by performing analyses, trade studies, and investigations on the requirements, design, code, and verification processes and results to assure that reliability, maintainability, and other quality factors are met.
• Software Reliability: The discipline of software assurance that (1) defines the requirements for software-controlled system fault/failure detection, isolation, and recovery; (2) reviews the software development processes and products for software error prevention and/or controlled change to reduced-functionality states; and (3) defines the process for measuring and analyzing defects and defines/derives the reliability and maintainability factors.
• Verification: Confirmation by examination and provision of objective evidence that specified requirements have been fulfilled [ISO/IEC 12207, Software life cycle processes]. In other words, verification ensures that "you built it right."
• Validation: Confirmation by examination and provision of objective evidence that the particular requirements for a specific intended use are fulfilled [ISO/IEC 12207, Software life cycle processes]. In other words, validation ensures that "you built the right thing."
From: http://www.hq.nasa.gov/office/codeq/software/umbrella_defs.htm

Software Quality Assurance
Technology objective: designing a quality system and writing quality software. The tech team aims to deliver a correctly behaving system to the client; Software Quality Assurance is about assessing whether the system meets expectations.
Доверяй, но проверяй (Russian proverb, "Doveryay, no proveryay"): Trust, but verify.

Validation Versus Verification
• Validation: Are we building the right product or service?
• Verification: Are we building the product or service right?
Both involve testing, done at every stage, but "testing can only show the presence of errors, not their absence" (Dijkstra).

Validation
Typically a client-leaning activity; after all, they are the ones who asked for the system. Typical activities: product trials and user-experience evaluation.

Verification
The optimist's view: it is about showing correctness/goodness. The pessimist's view: it is about identifying defects.
[Diagram: a system under test receives good and bad inputs and produces good or bad outputs; verification asks whether the observed outputs are the ones the specification calls for.]
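As an added illustration of verification, and of Dijkstra's caveat, the minimal C++ sketch below checks a hypothetical absoluteValue function against its stated requirement for a few inputs. A passing run shows only that these particular cases behave as specified, not that the function is correct for every input; the function and its requirement are illustrative assumptions, not part of the original slides.

    #include <cassert>
    #include <iostream>

    // Hypothetical unit under test: specified to return the absolute value of x.
    int absoluteValue(int x) {
        return x < 0 ? -x : x;
    }

    int main() {
        // Verification: check the implementation against its specified requirement
        // for a handful of inputs ("did we build it right?").
        assert(absoluteValue(5) == 5);
        assert(absoluteValue(-5) == 5);
        assert(absoluteValue(0) == 0);

        // Dijkstra's caveat: passing checks show that these cases behave as
        // specified, not that the function is correct for all possible inputs
        // (e.g., absoluteValue(INT_MIN) overflows and is not covered here).
        std::cout << "Sampled requirements hold for the tested inputs\n";
        return 0;
    }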
Quality versus Reliability
• Quality assurance: assessing whether a software component or system produces the expected/correct/accepted behavior or output for a given set of inputs; assessing the features of the software.
• Reliability: the probability of failure-free software operation for a specified duration in a particular environment.
Cool phrases: "five 9's" (99.999% availability, roughly five minutes of downtime per year) or "no down-time."

Fun Story: The First Computer Bug (1947)
A moth was found trapped between the points at Relay #70, Panel F, of the Mark II Aiken Relay Calculator while it was being tested at Harvard University on 9 September 1947. The operators affixed the moth to the computer log with the entry "First actual case of bug being found" and put out the word that they had "debugged" the machine, thus introducing the term "debugging a computer program." In 1988 the log, with the moth still taped by the entry, was in the Naval Surface Warfare Center Computer Museum at Dahlgren, Virginia. It is now housed at the Smithsonian Institution's National Museum of American History, which has corrected the date from 1945 to 1947. Courtesy of the Naval Surface Warfare Center, Dahlgren, VA, 1988. NHHC Photograph Collection, NH 96566-KN (Color). From https://www.facebook.com/navalhistory/photos/a.77106563343.78834.76845133343/10153057920928344/

Testing is Computationally Hard
The input space is huge, and it is generally infeasible to test anything completely. Assessing quality is therefore an exercise in establishing confidence in a system, or in minimizing risks. Other factors include the quality of the process, the quality of the team, and the quality of the environment.
[Diagram: a deployment stack of App1 on OS1 on a VM on a host OS on hardware; each layer introduces risk.]
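To make "the space is huge" concrete, here is a small added sketch that counts the input space of a trivial function; the add function is a hypothetical example, not from the slides, and the arithmetic is back-of-the-envelope.

    #include <cstdint>
    #include <iostream>

    // Hypothetical unit under test: adds two 32-bit integers.
    std::int64_t add(std::int32_t a, std::int32_t b) {
        return static_cast<std::int64_t>(a) + b;
    }

    int main() {
        // Each 32-bit argument has 2^32 possible values, so exhaustive testing
        // of add() means 2^32 * 2^32 = 2^64 input combinations.
        const double combinations = 18446744073709551616.0;  // 2^64

        // Even at one billion test executions per second, running them all
        // would take on the order of 585 years.
        const double seconds = combinations / 1e9;
        const double years = seconds / (365.25 * 24 * 3600);
        std::cout << "Input combinations: 2^64 (~1.8e19)\n"
                  << "At 1e9 tests/second: ~" << years << " years\n";
        return 0;
    }

Even this toy function cannot be tested exhaustively in any practical amount of time, which is why the slides frame QA as building confidence and minimizing risk rather than proving correctness.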
Lots to Consider
• Component behavior
• Interactions between components
• System and sub-system behavior
• Interactions between sub-systems
• Negative paths
• Behavior under load
• Behavior over time
• Usability

Two Approaches
• Static evaluations: making judgments without executing the code.
• Dynamic evaluations: executing the code and judging its performance.

Static Technique: Reviews
A fundamental QA technique: one or more peers review an artifact for correctness and clarity, often as a formal process. The value is finding issues at design/definition time rather than waiting for the results of the step to complete. Reviews apply to requirements, test plans, architecture, and implementation & design. They are highly effective, but do not replace the need for dynamic techniques.

One Extreme: Jury/Peer Reviews
Before anything is accepted, someone other than the creator must review it and approve it.
• Single-reviewer model: usually a "certified" or senior person
• Panel model: highly structured reviews that can take significant preparation
• Usually done at the design or development stage
• May introduce delay between when code is written and when it gets reviewed

Reviews
Models exist for either the reviewer or the author to lead the discussion. The author usually provides participants with materials to study in advance. Reviews require preparation and positive, open attitudes. A review meeting typically involves a moderator, a scribe, the author, and a review panel of peers, experts, and client(s).
Value:
• A second opinion on clarity, effectiveness, and efficiency
• Learning from others
• Avoiding "board blindness" in seeing flaws
• Peer pressure to be neat and tie up loose ends

Paired Programming
A lightweight form of peer review: one person drives while the other watches and reviews. Derived from Extreme Programming, and currently a favorite in agile.
• Continuous review
• Shared problem solving
• Better communications
• Learning from a peer
• Social!
• Peer pressure
When compared to solo development models, pairing MAY cause a higher initial cost per module created (time and resources), BUT it yields higher quality and lower overall cost. See as an example http://collaboration.csc.ncsu.edu/laurie/Papers/XPSardinia.PDF

What Do Reviews Look For?
• Clarity: can the reader easily and directly understand what the artifact is doing?
• Correctness: analysis of the algorithm used.

Common Code Faults
1. Data: initialization, value ranges, and type mismatches
2. Control: are all the branches really necessary (are the conditions properly and efficiently organized)? Do loops terminate?
3. Input: are all parameters or collected values used?
4. Output: is every output assigned a value?
5. Interface faults: parameter numbers, types, and order; structures and shared memory
6. Storage management: memory allocation, garbage collection, inefficient memory access
7. Exception handling: what can go wrong, what error conditions are defined, and how are they handled?
List adapted from W. Arms: http://www.cs.cornell.edu/Courses/cs5150/2015fa/slides/H2-testing.pdf

Examples
You are asked to sort an array, and you are not going to use a library function, so you have to write it yourself. Many algorithms exist; suppose you are deciding between bubble sort, quicksort, and merge sort. All will work (sort the array), but which will be the better code? (A merge sort sketch appears at the end of this section.)
• Bubble sort is very easy to write: two loops. Slow on average, O(n^2) time (how big will n be?), O(n) memory.
• Quicksort is complicated to write: O(n log n) on average, O(n^2) worst case; requires little extra memory (O(n) in total). Very effective on in-memory data; most implementations are very fast.
• Mergesort is moderate to write: O(n log n) worst case; the memory required is a function of the data structure. Very effective on data that requires external access.

Expressively Logical…
Two ways of expressing the same logic:

Single exit point:

    bool SquareRoot(double dValue, double &dSquareRoot)
    {
        bool bRetValue = false;
        if (dValue < 0) {
            dSquareRoot = 0.0;
            bRetValue = false;
        }
        else {
            dSquareRoot = pow(dValue, 0.5);   // pow from <cmath>
            bRetValue = true;
        }
        return bRetValue;
    }

Early return:

    bool SquareRoot(double dValue, double &dSquareRoot)
    {
        dSquareRoot = 0.0;
        if (dValue < 0)
            return false;
        dSquareRoot = pow(dValue, 0.5);
        return true;
    }

Static Program Analyzers
Evaluate code modules automatically, looking for errors or odd constructs, for example:
• Loops or programs with multiple exits (more common) or multiple entries (less common)
• Undeclared, uninitialized, or unused variables
• Unused functions/procedures; parameter mismatches
• Unassigned pointers
• Memory leaks
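To make the fault classes above concrete, here is a small added C++ sketch deliberately seeded with defects of the kinds a reviewer or static analyzer should flag; the function and its parameters are hypothetical and not from the course material.

    #include <cstdlib>

    // Deliberately faulty sketch: each comment names the fault class it illustrates.
    int scaleAndSum(const int *values, int count, int scale)        // input fault: 'scale' is never used
    {
        int total;                                                  // data fault: 'total' is never initialized
        int *scratch = (int *)std::malloc(count * sizeof(int));     // storage fault: allocated but never freed

        for (int i = 0; i <= count; ++i)                            // control fault: off-by-one, reads past the end
            total += values[i];

        int remainder = 0;                                          // data fault: assigned but never used

        return total;                                               // output fault: result depends on uninitialized 'total'
    }

A compiler with warnings enabled and a typical static analyzer can catch several of these without executing the code, which is exactly the static-evaluation approach described above.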
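Returning to the sorting example earlier in this section, here is a minimal merge sort sketch, added for illustration: it shows the divide-and-conquer structure behind the O(n log n) worst case and the auxiliary buffer behind merge sort's memory cost. It is one straightforward way to write it under these assumptions, not a definitive implementation.

    #include <vector>

    // Merge two adjacent sorted ranges [lo, mid) and [mid, hi) of 'a'
    // using an auxiliary buffer, then copy the result back in place.
    static void merge(std::vector<int> &a, int lo, int mid, int hi) {
        std::vector<int> buf;
        buf.reserve(hi - lo);
        int i = lo, j = mid;
        while (i < mid && j < hi)
            buf.push_back(a[i] <= a[j] ? a[i++] : a[j++]);  // stable: prefer left on ties
        while (i < mid) buf.push_back(a[i++]);
        while (j < hi)  buf.push_back(a[j++]);
        for (int k = 0; k < (int)buf.size(); ++k)
            a[lo + k] = buf[k];
    }

    // Top-down merge sort over the half-open range [lo, hi): O(n log n) worst case.
    static void mergeSort(std::vector<int> &a, int lo, int hi) {
        if (hi - lo < 2) return;          // 0 or 1 elements: already sorted
        int mid = lo + (hi - lo) / 2;
        mergeSort(a, lo, mid);            // sort the left half
        mergeSort(a, mid, hi);            // sort the right half
        merge(a, lo, mid, hi);            // merge the two sorted halves
    }

Calling mergeSort(v, 0, (int)v.size()) sorts a std::vector<int> in place, at the cost of the temporary buffer allocated during each merge.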