
Validation, SAS, and the Systems Development Life Cycle: An Oxymoron?

Neil Howard, Parke-Davis/Warner-Lambert, Ann Arbor, MI
Michelle Gayari, Parke-Davis/Warner-Lambert, Ann Arbor, MI

Abstract

The elements of the validation process, the features and tools of SAS software, and the structure of the Systems Development Life Cycle (SDLC) are indeed compatible. This paper will address the definitions of verification, validation, testing, and debugging, as well as the structure of the SDLC. It will illustrate how the SAS system can be used to help satisfy the requirements of your SDLC and accomplish the tasks of verification and validation.

Since as much as 80% of a programmer's time is invested in testing and validation, it is important to focus on tools that facilitate the correction of syntax, data, and logic errors in SAS programs. The presentation focuses on a wide variety of SAS features, tips, techniques, tricks, and system tools that can become part of your routine testing methodology.

Introduction

[...Overheard at an interview for a SAS programming position: "But you don't have to test SAS programs!!!"...]

As the interviewers quickly escort the confused candidate out the door, they recall how often it is assumed that a fourth-generation language "does so much for you" that you don't have to test the code. The SAS system is easy to use, and the learning curve to productivity is relatively short. But SAS is just as easy to ABUSE. Programmers and analysts must not lose sight of the indisputable facts: data is seldom clean, logic is too often faulty, and fingers walk clumsily over keyboards. Condition codes are not an accurate indicator of successful programs.

There are traditional methodologies for preventative pest control, but there is no PROC TEST-MY-CODE or TEST-MY-CONCEPT or READ-MY-MIND. The SAS system does, however, offer many features for identifying syntax, logic, and data errors. The results of using them will undoubtedly include reduced stress and bigger raises for SAS programmers, satisfied clients, accurate output, and quality programs that are reliable and maintainable. This supports the business need to deliver results to the FDA and to minimize time to approval and time to market.

Handout (and disclaimer)

The text of this paper is somewhat theoretical and a tad philosophical, perhaps dry. But that doesn't detract from the importance of the topic. It should also be noted that there will be an additional handout at the presentation that will focus primarily on SAS code and SAS solutions to validation issues.

Definition of Terms

The responsibility for ensuring program quality remains with the programmer. Today's competitive environment demands that we discuss testing methods and useful SAS system tools that will help us meet the challenges of verification, validation, testing, and debugging. The following definitions were provided by the validation maven at Parke-Davis and serve as the benchmarks for this paper.

VERIFICATION: Checking (often visual) of a result based on predetermined criteria, e.g., checking that a title is correct.

VALIDATION: The process of providing documented evidence that a computerized system performs its functions as intended and will continue to do so in the future.

TESTING: Expected results are predetermined, so actual results can be either accepted or rejected by one or more of the testing types: unit, integration, worst case, valid case, boundary value, alpha, beta, black box, white box, regression, functional, structural, performance, stability, etc.

DEBUGGING: The process of finding and correcting the root cause of an unexpected result.

Terms are Relative

The primary author conducted a brief informal survey: 1) within Parke-Davis, among the Clinical Reporting Systems management team, selected senior programmers and systems analysts, clinical team leaders, developers, and biometricians; and 2) beyond the company, within a selected network of colleagues in the SAS community. The intent of the survey was to see how well our baseline definitions held up against greater scrutiny, perception, and application.

IEEE on Terms

One survey respondent follows the Teri Stokes school of validation, based on IEEE standards. Std 1012-1986, "IEEE Standard for Software Verification and Validation Plans," states:

VERIFICATION is "the process of determining whether or not the products of a given phase of the software development cycle fulfill the requirements established during the previous phase." Informally, making sure your program is doing what you think it does.

VALIDATION is "the process of evaluating software at the end of the software development process to ensure compliance with software requirements." Informally, making sure the client is getting what they wanted.

TESTING is "the process of analyzing a software item to detect the differences between existing and required conditions (that is, bugs), and to evaluate the features of the software item."

IEEE Standard 610.12-1990, "IEEE Standard Glossary of Software Engineering Terminology," offers DEBUG: "To detect, locate, and correct faults in a computer program. Techniques include use of breakpoints, desk checking, dumps, inspection, reversible execution, single-step operation, and traces."

Survey Results

For the most part, the survey respondents were consistent in their definitions, especially for validation and debugging. Verification and testing were murkier subjects, often assumed to be the same thing.

Comments on verification included: 1) a quick and dirty check, 2) the systems testing portion of the SDLC, 3) it's definitely a Department of Defense word, not commonly used, 4) identifying that something is correct, 5) creation of test data for the test plan, 6) making sure a program does what [it] says, and that the results are accurate and stand up to the specs.

Validation was consistently related to the SDLC: it "is" requirements, specifications, development, verification, and user acceptance. Respondents said validation was: 1) thorough documentation of the SDLC, 2) formal, accomplishing the specs for inclusion in the research reports, 3) inclusive of change control and retirement, 4) ensuring compliance with regulations, making it legal, 5) formal test plan, test data, system verification, creation of a valid protocol, and user acceptance. Recurring words were: reproducible, efficacious.

Testing yielded the vaguest responses. Some felt it was synonymous with verification or that it is part of validation. Other comments: informal testing during the development phase; putting the program through the wringer; comes in many disguises (unit, integration, systems, etc.). "If someone asks me to test something, I ask them if they want it verified or validated."

Respondents said a program wasn't ready for validation if it still had bugs in it. Debugging was said to be "testing of logical systems to ensure they're not subject to errors of certain a priori identified types." Generally, debugging was felt to be the fixing of errors in the development phase of the SDLC.

Zero Defect Programs

[...Lubarsky's Law of Cybernetic Entomology: There's always one more bug....]

How are errors introduced into a program? Naturally, no one intends to produce flawed code. "Bugs" just turn up mysteriously to wreak havoc when we least expect it. However, we should recognize "bugs" for the errors that they are and attempt to produce code with zero defects. It is not enough to assume without verification that the work of good programmers will be correct. The slightest typo in the code or misunderstanding of the user requirements can cause serious errors during later execution.

The purpose of testing is to identify any errors and inconsistencies that exist in a system. Debugging is the art of locating and removing the source of these errors. Together, the testing and debugging tasks are thought to account for 50% to 80% of the total cost of developing the first working version of a system. Clearly, any techniques that facilitate these tasks will lead to improved productivity and reduced costs of system development.

We must broaden our philosophy of testing to go beyond the syntax check. As shown in Figure 1, the cost of correcting errors increases exponentially with the stage of detection. An error in a large system discovered after release of the product to the user can cost over two hundred times as much to correct as the same error discovered at the beginning of the system's development. Costs to fix these later errors include changes in documentation, re-testing, and possible changes to other programs affected by the one in error. While the costs of correcting errors in smaller systems or segments of code do not increase so dramatically over time, early testing can still reduce total costs substantially and improve system accuracy and reliability.

[Figure 1. Relative Cost of Correcting Errors: relative cost (0 to 200) by stage of error detection, from Requirements, Design, and Coding through Development Testing, Acceptance Testing, and Operation. Source: Farley]

Testing and debugging are recommended at all stages of the software development life cycle: determination of functional requirements; systems or program design; coding, unit testing, and implementation; and validation of results. Many features of the SAS System can facilitate testing and debugging, bringing time-saving, programmer-efficient, cost-effective program design into your daily routine. Incorporation of these techniques into the design, from the smallest module to the overall system, will result in a higher quality product delivered in a shorter time, not to mention reduced stress and increased self-esteem among the programming staff.

The life cycle begins with a need expressed by a future user of the system. These requirements specifications become a statement of needs from the user's point of view, in the form of a detailed statement of what the system is supposed to do. Obviously, the successful preparation of the requirements document depends on clear communication between the requester and the programmer. For example, does the manager want a separate listing by patient ID and by treatment group, or a single listing with totals by patient ID within each treatment group? Include frequent, detailed,
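As the introduction warns, a clean log and a zero condition code are not proof of a correct program. A minimal sketch of one defensive habit (the data set and variable names here are invented for illustration): count what a step was expected to keep, and flag the run when the count is implausible.

```sas
/* A WHERE clause with mistyped codes drops every observation  */
/* without producing any ERROR or WARNING in the log.          */
data work.demog_clean;
   set work.demog;
   where trtgrp in ('A' 'B');
run;

/* Check the result against a predetermined expectation. */
proc sql noprint;
   select count(*) into :nkept from work.demog_clean;
quit;

data _null_;
   if &nkept = 0 then
      put 'PROBLEM: zero observations kept - check the WHERE clause.';
run;
```

The point is not this particular check but the habit: every step has an expected effect on the data, and that expectation can be stated in code rather than assumed.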
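The definition of testing given earlier turns on predetermined expected results. One way to act on that in SAS, sketched here with invented data set names, is to store a reviewed copy of the output once and compare each subsequent run against it with PROC COMPARE, a simple form of regression testing:

```sas
/* EXPECTED.AE_SUMMARY was produced once, reviewed, and frozen. */
/* Each new run of the program writes WORK.AE_SUMMARY.          */
proc compare base=expected.ae_summary
             compare=work.ae_summary
             criterion=1e-8     /* tolerance for numeric fuzz */
             listall;
run;

/* PROC COMPARE sets the automatic macro variable SYSINFO to 0 */
/* only when the two data sets agree within the criterion.     */
data _null_;
   if &sysinfo ne 0 then
      put 'PROBLEM: output differs from the expected results.';
run;
```

Because the verdict is accept or reject, this fits the paper's definition of testing exactly: the expected result was fixed in advance, and the actual result either matches it or does not.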
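The debugging techniques in the IEEE DEBUG definition (traces, breakpoints, single-step operation) all have SAS analogues. A sketch, again with invented names: PUT statements give a trace of the program data vector, and recompiling a DATA step with the /DEBUG option, in an interactive SAS session, provides breakpoints and single-stepping.

```sas
data work.derived;
   set work.raw;
   ratio = dose / weight;
   /* trace: dump all variables whenever the result is suspect */
   if ratio = . then put 'Unexpected missing value: ' _all_;
run;

/* Interactive sessions only: invoke the DATA step debugger    */
/* for breakpoints, single-step execution, and examination of  */
/* variable values while the step runs.                        */
data work.derived / debug;
   set work.raw;
   ratio = dose / weight;
run;
```

The trace version can stay in production code at negligible cost; the /DEBUG version is a development-time tool to be removed once the root cause is found.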