Appendix A – Checklist: Sanity Check on the Design
A.1 Conclusion

Conclusion (Y/N): The specifications (test base) describe the system clearly and precisely. They are therefore sufficient as the basis for a structured test project.

If the conclusion is negative, the high-level errors that were recorded during the sanity check are displayed below. For each error, indicate the risks for the test design and the measures needed to rectify them (Error – Risk – Measure).
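A minimal sketch of how such findings could be recorded, assuming they are tracked in a small script or tool rather than in the document itself; the SanityCheckFinding structure and the example entry are hypothetical illustrations, not part of the checklist.

```python
from dataclasses import dataclass

@dataclass
class SanityCheckFinding:
    """One high-level error recorded during the sanity check (hypothetical structure)."""
    error: str    # what is wrong or missing in the test base
    risk: str     # consequence for the test design if it is not fixed
    measure: str  # action needed to rectify the error

# Hypothetical example entry; real findings come from the review itself.
findings = [
    SanityCheckFinding(
        error="Interface specifications lack permitted values for key fields",
        risk="Boundary and negative tests cannot be designed reliably",
        measure="Have the author complete the domain descriptions before test design starts",
    ),
]

for finding in findings:
    print(f"Error:   {finding.error}")
    print(f"Risk:    {finding.risk}")
    print(f"Measure: {finding.measure}")
```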
For each item in the checklists below, record Y/N or Not applicable, and note a solution where needed.

A.2 Result
The functionality, the processes and their coherence are described sufficiently
The functionality, the processes and their coherence concur with the anticipated goal
All of the relevant quality requirements have been sufficiently taken care of
The result of the risk analysis can be traced in the test base

A.3 Control
Contains history log (including version administration)
Contains approval/distribution list
Contains document description (including author)
The document has a status
Contains reference list

A.4 Structure
Contains executive (or management) summary
Contains table of contents (current)
Contains introduction
Contains chapter(s) with an overview or description of the functionality (and related screens/reports and messages)
Contains chapters with data models and data descriptions
Contains chapters with functional specifications
Contains chapters with non-functional specifications
Contains chapters with interface specifications

A.5 Content

Concerned functionality
Contains system overview
• System description
• Purpose of the system: what is the use for the users and how it supports their business processes (high level)
• Scope diagram/system diagram
• System environment and dependencies
• System owners, both functional and technical
• Sites where the system will be implemented
The system contains various subsystems
• For every subsystem: a short description with basic functionality, processes and related data
• Boundaries of and between subsystems
• Relations between subsystems
Quality attributes
• Quality attributes are defined in the functional descriptions
• Relevant quality attributes (e.g. performance, security) are defined for the whole system
• Relevant quality attributes (e.g. performance, security) are defined for each subsystem
Contains workflow/process flow/state transition or other diagrams
• Description
• Lists actions, conditions and situations
• Describes actions, conditions and situations
Contains use cases (UC)
• Use case description
• Use case diagram (UCD)
• Connection between UCD and relevant business processes
• UCD contains boundaries
• UCD contains primary and secondary UC
• Connection between UC and functionality
• UC contains primary scenario, alternative scenario and fallback scenario
Contains screen information
• Screen layout
• Screen/system navigation
• I/O information
Contains report information
• Report description
• Parameters
• Fields
• Grouping and sorting
• Report layout
• Contents

Data model and data description
Contains graphic model (ERD)
Contains description of the entities
• Entity definition: a clear and unambiguous definition of the notion of the entity
• Attribute definition: a clear and unambiguous definition of every attribute of the entity, including a reference to a domain or a description of the type of data, length and permitted values
• Primary and foreign keys: the attributes that identify the (related) entities
• Quantities: minimum, average and maximum number of entities and growth rate per period
Contains description of relations
• Name: of the relation and the inverse relation
• Cardinality: how many entities are linked by the relation
• Option: the extent to which the relation is mandatory or optional for the entities of the relation
• Constraints: have the regulations been established that determine how the system should react according to the cardinality and possibilities within the relation?
Contains description of domains
• Name
• Data type and length
• Permitted values
Contains description of constraints
Contains description of regulations
Contains description of stored procedures
Contains description of triggers
Contains description of defaults
(A sketch of what such an entity description might look like is given at the end of this appendix.)

System interfaces
Contains the name of the interface, for example <interface System A – System B>
Contains description of the purpose
Contains workflow/scheme/diagram
Contains details about the process
Contains description of the type of interface: operational/data/batch/online
Contains detailed information about the format of the interface: structure, data elements, data types, length, permitted values
Contains description of error handling
Contains description of error logging
Contains description of constraints

Functional specifications
Contains name of the function
Contains purpose of the function
Contains process information
Contains I/O information
Contains description of data processing
Contains description of calculation functionality
Contains description of constraints (input checks)
Contains description of error handling

Authorization
Description of the required authorization (profiles)
Contains authorization matrix

Technical architecture
Description and/or schemes of the infrastructure, for example network, servers, DBMS, operating systems, middleware
Description of technical aspects of interfaces
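As an illustration of the kind of entity description the data model checklist above asks for, below is a minimal sketch in Python covering definition, attributes (with domain/type, length and permitted values), keys and quantities; the Entity/Attribute structure and the Customer example are hypothetical illustrations, not part of the checklist.

```python
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class Attribute:
    """One attribute of an entity, as the checklist asks it to be described."""
    name: str
    data_type: str                # domain reference or description of the type of data
    length: Optional[int] = None  # maximum length, where applicable
    permitted_values: str = ""    # permitted values or value range

@dataclass
class Entity:
    """One entity from the data model, with the fields the checklist asks for."""
    name: str
    definition: str                                       # clear, unambiguous definition
    attributes: List[Attribute] = field(default_factory=list)
    primary_key: List[str] = field(default_factory=list)
    foreign_keys: List[str] = field(default_factory=list)
    quantities: str = ""                                  # min/avg/max numbers and growth rate

# Hypothetical example: a Customer entity as it might appear in a test base.
customer = Entity(
    name="Customer",
    definition="A person or organization that has placed at least one order.",
    attributes=[
        Attribute("customer_id", "numeric", 10, "positive integers, system generated"),
        Attribute("status", "code", 1, "'A' (active) or 'I' (inactive)"),
    ],
    primary_key=["customer_id"],
    quantities="min 1,000 / avg 50,000 / max 200,000 entities; growth about 5% per year",
)
```

Relations (name, inverse name, cardinality, optionality and constraints) could be captured in the same way, one record per relation between two entities.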
Appendix B – Checklist: Sanity Check on the Testware

B.1 Conclusion

Conclusion (Y/N): The testware is described accurately enough and/or works well enough for reuse.

If the conclusion is negative, the high-level errors that were recorded during the sanity check are displayed below. For each error, indicate the risks for the test design and the measures needed to rectify them (Error – Risk – Measure).

For each item in the checklists below, record Y/N or Not applicable, and note a solution where needed.

B.2 Result
The functionality that is to be tested and the processes and their coherence are sufficiently described
The functionality and the processes and their coherence concur with the anticipated goal
All of the relevant quality attributes have been sufficiently taken care of
The outcome of the risk analysis is traceable in the test design
Test results from the last time the testware was used are available and provide insight into the quality of the tests
Errors that were found after the last time the testware was used are available and indicate on which points the coverage of the testware is insufficient

B.3 Logical tests
The test base on which the logical test is based is clearly specified
The logical test design is up to date; there are no major outstanding issues in the design against the specified test base
The structure of the test design is clear
Logical tests can be traced back to the related specification elements, so changes in the test base can be efficiently implemented
Logical tests can be traced back to the related risk category in the TRA, so changes in the TRA can be efficiently implemented
Logical tests are set up on the basis of test techniques
Where needed, test techniques are applied correctly and completely
Choices that have been made in the logical test design have been clearly indicated (for example, deviations from the technique)

B.4 Physical tests
The test base on which the physical test cases are based is clearly specified
The physical test design is up to date; there are no major outstanding issues in the design against the specified test base
The structure of the test design is clear
Physical tests can be traced back to the related specification elements, so changes in the test base can be efficiently implemented
Physical tests can be traced back to the related risk category in the TRA, so changes in the TRA can be efficiently implemented
Physical tests can be traced back to the related logical test design, so changes in the logical test design can be efficiently implemented

B.5 Test cases
Test cases have a unique ID
Test cases contain all the required fields
Test cases are detailed: it is clear what the purpose of the test is and when the test is successful
Test cases clearly describe which test actions will be executed
Test cases describe the input data that is used and the expected outcome

B.6 Test tools
The test base on which the tooling is based is clearly indicated
The tooling is up to date; there are no major outstanding issues in the design against the specified test base
Known errors in the tooling have a clear status, so it can be estimated which bug fixes/changes will have to be performed on the tools
Documentation for the tooling is available, so changes can be efficiently implemented
The infrastructure is such that the tooling can be tested before the actual test run starts

Appendix C – Checklist: Smoke Test System

C.1 Conclusion

Conclusion (Y/N):
The system can enter the planned test phase; it is expected that the tests will run without too many obstacles or problems
The required products are available and their content is sufficient
The system is