An Ideal Architecture for Model-Based Verification & Test Systems


David Gelperin ([email protected])
Software Quality Engineering (stickyminds.com, sqe.com)
Draft 1.8, July 12, 2001

This position statement records my biases and is meant to catalyze reflection, debate, discussion, and research in order to develop some consensus on the characteristics of an ideal system. It is hoped that this consensus will stimulate and influence the development of both research and commercial systems. Suggestions for change are solicited and appreciated.

1. System Architecture

The following diagram shows an "ideal" architecture for an MbV&T system:

[Architecture diagram: (1) a Requirements Specifier produces essential models of software usage, behavior, and structure in Test-Essential UML; (2) a Model Checker turns these into checked software specs, which also feed a Software Designer; (3) an Automatic Test Designer derives functional and usage test specs in a "4th generation" test language; (4) an Automatic Scripter turns those specs into scripts (complete and incomplete) for a test execution engine. Additional test specs in the 4th generation test language go to a Manual Scripter, which also produces scripts (complete and incomplete) for the execution engine. Non-functional, design-based, and code-based test objectives feed a Manual Test Designer, who produces manual scripts.]

Each of the numbered boxes is a piece of automation, possibly interactive. The primary results of each automated process become the primary input to its succeeding automated process.

This architecture supports "all at once" development of a test suite. Test suites can also be developed incrementally, based on both a model and execution-outcome information about the partial suite --- in particular, the outcome of the most recently produced test case [TA2]. Initial model development and checking would be the same for an incremental suite, but the development strategy would be different.

2. Assumptions

- In important situations, effective software development strategies entail the development of precise models of behavior, structure, and usage. This is true at both product and component levels.

- In these situations, a Model-based Verification and Test (MbV&T) System can be very cost-effective at supporting the development and debugging of these models as well as automatically generating executable function and usage tests.

- Widespread use of model-based test approaches requires a strong collection of commercial offerings, which in turn requires a market sufficient to ensure adequate profits for cost-effective systems.

- Achieving such a market depends on:
  (1) being applicable to a broad range of software
  (2) inclusion of mainstream modeling techniques
  (3) seamless integration with software design systems
  (4) user-friendly interfaces and ultra-understandable outputs
  (5) spectacular system operation and results, including reduced development costs and increased reliability

- Data flow between the components would be in XML Metadata Interchange (XMI) format. [see http://www-4.ibm.com/software/ad/standards/xmi.html]

3. Criteria for Ideal

The description of this system architecture provides a framework for specifying the characteristics of an ideal MbV&T system. Characteristics are ideal based on psychological (e.g., usability) and economic factors. Some of the economic factors are influenced by psychological ones, e.g., unusable systems are a poor investment.

Psychological factors include ease of learning and use, helpfulness of results, and clarity of individual payoff.

The major economic factor from the customer perspective is payoff, i.e., is it sufficient to justify the customer's investment? From the supplier perspective, the major factor is profit, i.e., will pricing and demand allow sufficient profit?

4. General Characteristics

Some ideal characteristics apply to more than one component, but our intent is not to create a list of characteristics that apply to components in general, but rather to list those that have special import for a cost-effective MbV&T system. These general characteristics include:

GC1. The association between each test and its precursors in the essential models must be maintained to provide traceability in both directions. This means that changes either to tests or to essential models require updating of the associations.

GC2. Each automated component requires control/constraint (secondary) input [not shown in the architecture diagram] as well as the primary input of software or test specifications. Secondary inputs should be separated from primary inputs.

GC3. Messages and reports should be very clear and helpful in suggesting ways to fix and/or avoid problems.

GC4. All major outputs should be controlled by an integrated configuration management (CM) system that "understands" their nature and relationships. This may be a custom CM system or a smart wrapper on one or more existing CM, requirements management, and test management system(s).

5. Component Characteristics

We provide a description, a set of component-specific characteristics, and some implementation options for each system component.

5.1. Requirements Notation & Specifier

Description --

For mainstream acceptance, the basic specification notation should be grounded in the UML (the mainstream modeling notation) but limited to those elements that support Essential Modeling [RN9, RN12, RN1]. Because the UML is not test adequate, additional elements must be added. Only additions that are essential to automatic testing are necessary.
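To make "test adequate" more concrete: one kind of behavioral information such additions supply is machine-checkable pre- and post-conditions, the sort of data the Automatic Test Designer (Section 5.3) consumes. The Python sketch below is a hypothetical illustration of the idea only; `OpContract`, `withdraw`, and every other name is invented and is not part of any defined Test-Essential UML notation.

```python
from dataclasses import dataclass
from typing import Callable

# Hypothetical sketch: an operation annotated with machine-checkable
# pre- and post-conditions -- the kind of behavioral information a
# test designer could turn into test setup and result checks.
# All names here are invented for illustration.

@dataclass
class OpContract:
    name: str
    pre: Callable   # predicate over (state, args): must hold before the call
    post: Callable  # predicate over (state_before, state_after, args)

withdraw_contract = OpContract(
    name="withdraw",
    pre=lambda s, amt: 0 < amt <= s["balance"],
    post=lambda before, after, amt: after["balance"] == before["balance"] - amt,
)

def run_checked(contract: OpContract, op, state: dict, amt: int) -> None:
    """Check the pre-condition, run the operation, check the post-condition."""
    assert contract.pre(state, amt), "pre-condition violated"
    before = dict(state)  # snapshot for the post-condition
    op(state, amt)
    assert contract.post(before, state, amt), "post-condition violated"

def withdraw(state: dict, amt: int) -> None:
    state["balance"] -= amt

account = {"balance": 100}
run_checked(withdraw_contract, withdraw, account, 30)
print(account["balance"])  # 70
```

A designer with access to such contracts could, for example, pick boundary values of `amt` around `s["balance"]` when generating test cases.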
This Test-Essential UML would include Action Contracts [RN4], Interface Structures [RN5], and Precise Use Cases [RN6].

The specifier is a process support system that assists in the development and modification of annotated specifications of software behavior, structure, and usage. Annotations supplement a basic model with information about usage, failure risk, and possibly test execution outcomes. The specifier provides specification patterns and templates along with strategic advice to support all aspects of specification modeling.

Ideal characteristics of notation & specifier --

Spec notation must:
RNC1. be mainstream (i.e., based on UML)
RNC2. include elements for essential modeling of usage, behavior, and structure
RNC3. support automated verification and automated test design, i.e., be test adequate
RNC4. be ultra-understandable, to enable reliable validation via review
RNC5. be a seamless precursor to mainstream software design

Specifier must:
RSC1. be a true process support system, including abilities to --
  - specify test-adequate models
  - add layers of supplementary information about usage, failure risk, and execution outcomes to the models
  - flag violations of modeling guidelines specified by the client
  - translate any special client terminology into normal-form logical expressions
  - expand abstract terminology (e.g., derived attributes) into its associated basic expressions
RSC2. verify the syntactic correctness, consistency, and relative completeness of a spec
RSC3. have the ability to simulate product usage by "animating" models.

Implementation Options --

1. Custom-built specifier
2. MetaCase-based specifier (e.g., MetaEdit+ [www.metacase.com/index.html] or ObjectMaker TDK [www.markv.com/markv.com/objectmaker.htm])
3. Restricted UML modeler incorporating test-essential extensions

5.2. Model Checker

Description --

Supports the specification of required properties or constraints (e.g., safety-oriented operational constraints such as "a traffic light cannot be green in all directions") that are then shown by the checker to be provable from the current model, or unprovable via the generation of a counter-example.

Ideal characteristics of checker --

Checker must:
MCC1. proactively support (with examples and advice) the process of constraint specification and result determination.
MCC2. trace the derivation of each proof, so that incorrect information in the spec can be quickly located when a property that should be unprovable is proven.

Implementation Options --

1. Custom-built checker
2. Existing model checker (e.g., Spin or FormalCheck [productized SMV - www.cadence.com/datasheets/formalcheck.html]) with a wrapper

5.3. Automatic Test Designer

Description --

The Test Designer uses --
(1) information about behavior (e.g., pre- and post-conditions), structure, and usage (including probabilities) from the specification model,
(2) knowledge of a 4th Generation Test Language (4GTL) that uses the terminology of the specification model, and
(3) knowledge of sound test design principles
to design 4GTL test cases and procedures that create pre-conditions and check post-conditions. 4GTL refers to any test specification language (analogous to 4GLs for software development, e.g., report writers) that specifies the essential aspects of a test in high-level abstractions.
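The exact content and format of a 4GTL spec remains an open question (Section 7), but action-word approaches [SC1] suggest one possible shape: steps written in the model's own terminology, bound to concrete code by a translation table. The Python sketch below is a hypothetical illustration; the action names, table, and account example are all invented.

```python
# Hypothetical sketch of a 4GTL test spec: each step names an action in
# the specification model's own vocabulary, with its arguments; nothing
# here is tied to a particular execution engine. All names are invented.

test_spec = [
    ("create_account", {"owner": "alice", "balance": 100}),
    ("withdraw",       {"owner": "alice", "amount": 30}),
    ("check_balance",  {"owner": "alice", "expect": 70}),
]

# A toy stand-in for the scripter's terminology translation table:
# it maps each abstract action word to concrete code. A real scripter
# would emit engine-specific scripts instead of executing directly.
accounts: dict[str, int] = {}
results: list[str] = []

def create_account(a):
    accounts[a["owner"]] = a["balance"]

def withdraw(a):
    accounts[a["owner"]] -= a["amount"]

def check_balance(a):
    results.append("pass" if accounts[a["owner"]] == a["expect"] else "fail")

table = {"create_account": create_account,
         "withdraw": withdraw,
         "check_balance": check_balance}

for action, args in test_spec:
    table[action](args)

print(results)  # ['pass']
```

Because the steps use model terminology, such a spec stays reviewable by the people who validated the model, which is the point of ADC4 below.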
The products of the automatic test designer are of three types (in any mix):
(1) sufficient for an automated scripter to produce cost-effective, automated test scripts that are complete (i.e., set up all pre-conditions and check all post-conditions) or incomplete (i.e., set up all pre-conditions and possibly check some post-conditions),
(2) sufficient for a manual scripter to produce cost-effective, automated scripts that are complete or incomplete, or
(3) sufficient for a manual scripter to produce cost-effective, manual scripts.

Ideal characteristics of automatic designer & test specs --

Automatic designer must:
ADC1. provide guidance (e.g., estimates) in the selection of parameters that determine the number and character of the tests to be generated
ADC2. trace the derivation of each test, so that an incorrect test can quickly be traced back to incorrect information in the spec or to bugs in the derivation routines

Generated test specs must:
ADC3. be easily supplemented, since no automated designer generates all tests that need to be run
ADC4. be ultra-understandable (i.e., specified in a 4GTL), to enable reliable validation via review

Implementation Options --

1. Custom-built automatic test designer

5.4. Automatic Scripter

Description --

The scripter creates scripts for specific execution engines (e.g., Mercury, Rational, Segue) by using application interface maps and terminology translation tables to translate the 4GTL specs into the concrete scripts required by execution engines.

Ideal characteristics of scripter --

The scripter must:
ASC1. proactively support the creation of terminology translation tables via test spec scans that prime the tables by loading in the terminology and identifying the type of each entry.

Implementation Options --

1. Custom-built scripter

5.5. Downstream Components

Description --

The system should have the ability to instrument design-level (e.g., components) and implementation-level (e.g., lines of code) elements, monitor their execution, and report those that are not executed.

The report of unexecuted elements would be passed to a manual test designer, who would translate these design- and implementation-based test objectives into scenario- or function-based objectives by asking "what scenario or variant of a function must be tested in order to execute these uncovered elements?".

This translation may uncover defects in the design or implementation by revealing elements that can never be executed. If execution is possible, the translation may uncover defects in the specs or weaknesses in the test design process. When execution is possible, new objectives would lead to new test specs, either manually designed [see the bottom right corner of the architecture diagram] or automatically designed.

6. Business History of Automatic Test Generation Tools

We can distinguish levels of success:

Level 1 - Commercial Product: generates (a) a substantial or (b) a marginal profit as a stand-alone product. Examples: (a) none; (b) T-Vec, TestMaster, ToolSet_Certify.
Level 2 - Commercial Component: generates sufficient profit as a component in, or add-on to, a tool suite. Examples: StP/T?, SoftTest?, 5 use-case products?, 2 UI products?
Level 3 - Strategic: adds value to a consulting practice. Example: TestFrame.
Level 4 - Research: demonstrates the feasibility of a technical approach. Examples: many state-machine based systems.

History suggests that incorporating a test generation tool into a suite significantly increases the likelihood of commercial success.

7. Knowns and Unknowns

We know:

- there are over 2 dozen commercial, UML-compliant modeling systems [RS3].
(Some will not survive.)

- essential models are adequate for model checking and logical test design [RS2, RS4, MC2].

- state-based, tabular-format research tools can be successful specifiers [RS5], linked to model checkers [MC2] and to execution-level script designer/generators [TD2].

- SCR (1978) has been around for over 20 years and still has a very small user base.

- a 4GTL can be very effective for test specification [SC1, SC2].

We do NOT know:

- the exact composition of a test-adequate UML

- if a test-adequate UML can effectively model a broad range of systems, seamlessly integrate with software design systems, and provide ultra-understandable results for review

- the exact content and format of a 4GTL spec

- if a test designer can produce 4GTL results that can be easily translated into executable scripts

- if an MbV&T system with these (or any other) characteristics can attract a substantial number of users

8. Acknowledgements

Thanks to Ian Craggs, Alan Jorgensen, and Vera Johnson for careful review and thoughtful suggestions.

Thanks to Alan Hartman and his co-authors for providing an alternative architecture [TA3].

9. References

Testing Architectures --

TA1. Blackburn, Mark R., Busser, Robert, and Nauman, Aaron, "Removing Requirement Defects and Automating Test," Proceedings of STAR East 2001, Software Quality Engineering.

TA2. Craggs, Ian, and Griffiths, Ian, "Model-Based Testing for Large-Scale Industrial Software," IBM Technical Report.

TA3. Gronau, Ilan, Hartman, Alan, Kirshin, Andrei, Nagin, Kenneth, and Olvovsky, Sergey, "A Methodology and Architecture for Automated Software Testing," IBM Research Lab in Haifa, Technical Report. Available at http://www.haifa.il.ibm.com/projects/verification/gtcb/publications.html

Requirements Notation --

RN1. Constantine, Larry L., and Lockwood, Lucy A. D., Software for Use: A Practical Guide to the Models and Methods of Usage-Centered Design, ACM Press, 1999. [Essential Modeling]

RN2. Douglass, Bruce Powel, Real-Time UML: Developing Efficient Objects for Embedded Systems, Addison-Wesley, 2000. [UML]

RN3. Faulk, S. R., Brackett, J., Ward, P., and Kirby, J., Jr., "The CoRE Method for Real-Time Requirements," IEEE Software, Vol. 9, No. 5 (Sept. 1992), pp. 22-33. [CoRE]

RN4. Gelperin, David, "Specifying Consequences with Action Contracts." Available in the U3 Modeling Resource Kit at http://www.stickyminds.com — use search term U3. [Action Contracts]

RN5. Gelperin, David, "Specifying Software with If-Then Tables: What, Why, & How." Available in the U3 Modeling Resource Kit at http://www.stickyminds.com — use search term U3. [Tables]

RN6. Gelperin, David, "Ultra-Understandable Usage (U3) Modeling Resource Kit." Available at http://www.stickyminds.com — use search term U3. [Usage Modeling]

RN7. Janicki, R., Parnas, D. L., and Zucker, J., "Tabular Representations in Relational Documents," CRL Report No. 313, 1995. Available at http://www.crl.mcmaster.ca/SERG/serg.publications.html [Tables]

RN8. Leveson, Nancy G., Heimdahl, Mats P. E., and Reese, Jon Damon, "Designing Specification Languages for Process Control Systems: Lessons Learned and Steps to the Future," SIGSOFT FOSE '99 (Foundations of Software Engineering), Toulouse, September 1999. Available at http://sunnyday.mit.edu/papers.html [Modeling]

RN9. McMenamin, Stephen, and Palmer, John, Essential Systems Analysis, Prentice-Hall, 1984. [Essential Modeling]

RN10. Paterno, Fabio, Model-Based Design and Evaluation of Interactive Applications, Springer Applied Computing, 1999. [Task Modeling]

RN11. Schmuller, Joseph, Sams Teach Yourself UML in 24 Hours, Sams Press, 1999. [UML]

RN12. Ward, P. T., and Mellor, S. J., Structured Development for Real-Time Systems, Vol. 2: Essential Modeling Techniques, Yourdon Press, 1985. [Essential Modeling]

Requirements Specifier --

RS1. Bharadwaj, R., and Heitmeyer, C., "Developing High Assurance Avionics Systems with the SCR Requirements Method," Proc. 19th Digital Avionics Systems Conference, 7-13 October 2000, Philadelphia, PA. Available at http://chacs.nrl.navy.mil/personnel/heitmeyer.html

RS2. Chechik, Marsha, "SC(R)3: Towards Usability of Formal Methods," Proceedings of CASCON '98, pp. 177-189, November 1998. Available at http://www.cs.toronto.edu/~chechik/pubs/cascon98.ps

RS3. Gelperin, David, "Commercial Tools for Software/System Modeling." Available in the U3 Modeling Resource Kit at http://www.stickyminds.com — use search term U3.

RS4. Heitmeyer, Constance L., Jeffords, Ralph D., and Labaw, Bruce G., "Automated Consistency Checking of Requirements Specifications," ACM Transactions on Software Engineering and Methodology, Vol. 5, No. 3, July 1996, pp. 231-261. Available at http://chacs.nrl.navy.mil/personnel/heitmeyer.html

RS5. Heitmeyer, C., Kirby, J., Labaw, B., and Bharadwaj, R., "SCR*: A Toolset for Specifying and Analyzing Software Requirements," Proc. Computer-Aided Verification, 10th Ann. Conf. (CAV '98), Vancouver, Canada, 1998. Available at http://chacs.nrl.navy.mil/personnel/heitmeyer.html

Model Checker --

MC1. Chan, William, Anderson, Richard J., Beame, Paul, Burns, Steve, Modugno, Francesmary, Notkin, David, and Reese, Jon D., "Model Checking Large Software Specifications," IEEE Transactions on Software Engineering, Vol. 24, No. 7, pp. 498-520, July 1998.

MC2. Heitmeyer, Constance L., Kirby, James Jr., Labaw, Bruce, Archer, Myla, and Bharadwaj, Ramesh, "Using Abstraction and Model Checking to Detect Safety Violations in Requirements Specifications," IEEE Transactions on Software Engineering, Vol. 24, No. 11, November 1998. Available at http://chacs.nrl.navy.mil/personnel/heitmeyer.html

MC3. Holzmann, Gerard J., "The Model Checker SPIN," IEEE Transactions on Software Engineering, Vol. 23, No. 5, May 1997, pp. 279-295.

Test Designer --

TD1. Ammann, Paul E., Black, Paul E., and Majurski, William, "Using Model Checking to Generate Tests from Specifications." Available at http://hissa.nist.gov/~black/Papers/icfem98.ps.gz

TD2. Blackburn, Mark, "Automatic Generation of Test Vectors for SCR-style Specifications," Proceedings of COMPASS 1997. Available in Publications at www.t-vec.com.

TD3. Blackburn, Mark, "Using Models for Test Generation and Analysis," Proceedings of the Digital Avionics Systems Conference, 1998. Available in Publications at www.t-vec.com.

TD4. Gargantini, A., and Heitmeyer, C., "Using Model Checking to Generate Tests from Requirements Specifications," Proc. Joint 7th Eur. Software Engineering Conf. and 7th ACM SIGSOFT Intern. Symp. on Foundations of Software Eng. (ESEC/FSE '99), Toulouse, France, Sept. 6-10, 1999. Available at http://chacs.nrl.navy.mil/personnel/heitmeyer.html

TD5. Statezni, David, "A Look at T-Vec's Test Vector Generation System," STQE, Vol. 3, No. 3, May/June 2001.

Script Generator --

SC1. Buwalda, Hans, "Testing with Action Words," in Fewster, Mark, and Graham, Dorothy, Software Test Automation, Addison-Wesley, 1999.

SC2. Buwalda, Hans, and Kasdorp, Maartje, "Getting Automated Testing Under Control," Software Testing & Quality Engineering, Vol. 1, No. 6, Nov./Dec. 1999.

SC3. Nagle, Carl, Interest Group on Data Driven Testing at http://groups.yahoo.com/group/RobotDDEUsers
