
D. Meenakshi et al. / (IJCSIT) International Journal of Computer Science and Information Technologies, Vol. 5 (3), 2014, 3729-3731

Software Testing Techniques in the Software Development Life Cycle

D. Meenakshi, J. Swami Naik, M. Raghavendra Reddy

Computer Science Department, G. Pulla Reddy College, Kurnool, Andhra Pradesh, India

Abstract: Software testing provides a means to minimize errors, cut maintenance costs and decrease overall software costs. Different software development and testing methodologies, tools, and techniques have emerged to enhance software quality. At each and every phase of the software development life cycle, testing is performed with different tools and techniques. One of the major problems within the software testing area is how to obtain a suitable set of test cases for a software product: the chosen set should assure maximum effectiveness with the least possible number of test cases. At present there are many testing techniques available for generating test cases.

Keywords: Software Testing, Level of Testing, Testing Technique, Testing Process

I. INTRODUCTION
Software testing is as old as the hills in the history of digital computers. Testing the software is the only means for assessing the quality of software. In view of the fact that testing typically consumes 40-50% of development effort, and that systems which require more reliability demand even more testing effort, it is an important part of software development. Contemporary software systems must be more reliable and correct. Automatic methods used for ensuring software correctness range from static techniques, such as (software) model checking or static analysis, to dynamic techniques, such as testing. All of these techniques have strengths and weaknesses: model checking (with abstraction) is automatic and exhaustive, but may suffer from scalability issues; static analysis, on the other hand, scales to very large programs but may give too many false warnings; testing alone may miss significant errors, since it is intrinsically incomplete.

Software Testing:
Software testing is not only error detection. Testing software means operating the software under controlled conditions, (1) to check whether it behaves "as specified", (2) to detect errors, and (3) to validate that what has been specified is what the user actually wanted.

[Figure: Activities in software testing - Verification, Error Detection, Validation]

1. Verification is defined as checking or testing of items, including software, for conformance and reliability by evaluating the results against the specified requirements. [Verification: Are we building the product right?]
2. Error Detection: Testing makes an attempt to make things go wrong and to determine whether things happen when they shouldn't, or don't happen when they should.
3. Validation looks at system correctness, i.e. it is defined as checking that what has been specified is what the user actually wanted. [Validation: Are we building the right product?]

The purpose of testing is verification, validation and error detection in order to find problems, and the purpose of finding those problems is to get them fixed. The most generic software problems are: inadequate software performance, searches that yield incorrect results, incorrect data entry and unsuccessful data edits, wrong coding or implementation of rules, incorrect calculation, ineffective data edits, incorrect processing of data relationships, incorrect or inadequate interfaces with other programs, inadequate performance and security controls, incorrect handling, insufficient support of business needs, unreliable results or performance, confusing or misleading data, poor software usability for end users, obsolete software, and inconsistent processing.

Terminology:
- Mistake - A human action that produces an incorrect result.
- Fault [or Defect] - An incorrect step, process, or data definition in a program.
- Failure - The inability of a system or component to perform its required function within the specified performance requirements.
- Error - The difference between a computed, observed, or measured value or condition and the true, specified, or theoretically correct value or condition.
- Specification - A document that specifies, in a complete, accurate, confirmable manner, the requirements, design, performance, or other features of a system or component, and often the procedures for determining whether these provisions have been satisfied.

We observe errors, which can often be connected with failures, but the decisive cause of the fault is often very hard to find; the short sketch after this list illustrates the distinction.
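To make these terms concrete, here is a small hypothetical Python sketch (our own illustration, not taken from the paper): a single fault in the code stays hidden for most inputs and only shows up as a failure on a boundary input, where the computed result differs from the specified one (the error).

    # Hypothetical illustration of mistake, fault, failure and error.
    def is_passing(score, pass_mark=50):
        """Spec: a score equal to or above the pass mark is a pass."""
        # Mistake: the programmer typed ">" instead of ">=".
        # The resulting incorrect statement is the fault (defect).
        return score > pass_mark

    print(is_passing(80))   # True  - behaves as specified, no failure observed
    print(is_passing(20))   # False - behaves as specified, no failure observed
    print(is_passing(50))   # False - failure: the required function is not performed;
                            # the difference from the specified value (True) is the error

A good test case, in the sense of Section II below, is one that exercises exactly such a boundary value and therefore has a high probability of revealing the fault.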


II. OBJECTIVES:
- To verify that the system is "fit for function".
- To confirm that the system meets the requirements and can be executed successfully in the intended environment.
- Executing a program with the intention of finding an error.
- To check whether the system does what it is expected to do.
- A good test case is one that has a high probability of detecting an error.
- A superior test is not redundant.
- A successful test is one that uncovers an as-yet-undiscovered error.
- A good test should be "best of breed".

The levels of testing can be summarised as follows (a black box versus white box sketch follows the table):

Testing     | Specification                   | Scope                        | Opacity             | Who does it?
Unit        | Low level design, actual code   | Classes                      | White box           | Programmer
Integration | Low level and high level design | Multiple classes             | White and black box | Programmer
Function    | High level design               | Whole product                | Black box           | Independent test group
System      | Requirement analysis            | Whole product in environment | Black box           | Independent test group
Acceptance  | Requirement analysis            | Whole product in environment | Black box           | Customer
Beta        | Ad hoc                          | Whole product in environment | Black box           | Customer
Regression  | Changed software                | Whole product in environment | White and black box | Programmer / independent test group
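The "Opacity" column distinguishes black box from white box testing. The following hypothetical sketch (our own example, using Python's standard unittest module, which the paper does not prescribe) shows the same function tested from both perspectives: the black box tests are derived from the specification alone, the white box tests from the internal branch structure.

    import unittest

    def shipping_cost(weight_kg):
        """Spec: delivery is free up to 1 kg; above that, 5 per additional kg."""
        if weight_kg <= 1:
            return 0
        return 5 * (weight_kg - 1)

    class BlackBoxTests(unittest.TestCase):
        # Written from the specification alone, without looking at the code.
        def test_light_parcel_is_free(self):
            self.assertEqual(shipping_cost(0.5), 0)

        def test_heavy_parcel_is_charged(self):
            self.assertEqual(shipping_cost(3), 10)

    class WhiteBoxTests(unittest.TestCase):
        # Written with knowledge of the internal structure: one case per branch,
        # plus the boundary where the two branches meet.
        def test_free_branch(self):
            self.assertEqual(shipping_cost(0), 0)

        def test_charged_branch(self):
            self.assertEqual(shipping_cost(2), 5)

        def test_branch_boundary(self):
            self.assertEqual(shipping_cost(1), 0)

    if __name__ == "__main__":
        unittest.main()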

III. SOFTWARE TESTING LIFECYCLE - PHASES

- Requirement study: The testing cycle starts with the study of the user's requirements. Understanding of the requirements is very essential for testing the product.
- Design: Component identification, test specification design, test specification review.
- Construction / test execution: Test execution and evaluation, performance and simulation testing.
- Deployment / test closure: Test summary report, project de-brief, project documentation.
- Test process analysis: Analysis done on the reports, and improving the application's performance by implementing new technology and additional features.

IV. LEVELS OF TESTING:

Unit testing:
- The initial level of testing.
- Tests are done on particular methods or components.
- Requires knowledge of the internal program design and code.
- Tests are done by developers, not by testers (a short sketch follows).
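As a minimal sketch of this level, assuming Python and its standard unittest module (the paper does not mandate any particular framework): a developer exercises one function in isolation, including its exception conditions and error handling, which the table below lists among the unit testing objectives.

    import unittest

    def divide(numerator, denominator):
        """Unit under test: a single, isolated function."""
        if denominator == 0:
            raise ZeroDivisionError("denominator must be non-zero")
        return numerator / denominator

    class DivideUnitTests(unittest.TestCase):
        def test_normal_path(self):
            self.assertEqual(divide(10, 4), 2.5)

        def test_exception_condition(self):
            # Error handling is part of the unit's contract, so it is tested too.
            with self.assertRaises(ZeroDivisionError):
                divide(1, 0)

    if __name__ == "__main__":
        unittest.main()

Statement and path coverage tools such as coverage.py can be run over such a suite (for example `coverage run -m unittest` followed by `coverage report`) to measure how much of the unit's code the tests exercise.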


The unit testing level can be summarised as:

Objectives | To test the function of a program or unit of code such as a program or module; to test internal logic; to verify internal design; to test path and conditions coverage; to test exception conditions and error handling
When       | After modules are coded
Who        | Developer
Input      | Internal application design, Master Test Plan, Unit Test Plan
Output     | Unit Test Report
Methods    | White box testing techniques, test coverage techniques
Tools      | Debug, re-structure, code analyzers, path/statement coverage tools

Integration testing:

Incremental integration testing:
- It is done based on the design of an application, as and when new functionality is added.
- The application's functionality aspects are required to be independent enough to work separately before completion of development.
- Done by programmers or testers.

Top-down integration:
- Develop the skeleton of the system and integrate it with components.

Bottom-up integration:
- Integrate infrastructure components, then add functional components.

To simplify error localization, systems should be integrated incrementally (see the sketch after the table below).

Objectives | To technically verify proper interfacing between modules and within sub-systems
When       | After modules are unit tested
Who        | Developer
Input      | Internal and external application design, Master Test Plan, Integration Test Plan
Output     | Integration Test Report
Methods    | White and black box techniques, problem/configuration management
Tools      | Debug, re-structure, code analyzers
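As a hedged sketch of top-down integration (a hypothetical example, assuming Python's standard unittest.mock module), the high-level component is integrated and tested first, while a lower-level component that is not yet ready is replaced by a stub:

    import unittest
    from unittest import mock

    class PaymentGateway:
        """Lower-level component; assume it is not implemented yet."""
        def charge(self, amount):
            raise NotImplementedError

    class OrderService:
        """Higher-level component integrated first in a top-down strategy."""
        def __init__(self, gateway):
            self.gateway = gateway

        def place_order(self, amount):
            receipt = self.gateway.charge(amount)
            return {"status": "placed", "receipt": receipt}

    class TopDownIntegrationTest(unittest.TestCase):
        def test_order_uses_gateway(self):
            # The missing component is replaced by a stub so the interface
            # between the two modules can be verified early.
            stub_gateway = mock.Mock(spec=PaymentGateway)
            stub_gateway.charge.return_value = "receipt-001"

            service = OrderService(stub_gateway)
            result = service.place_order(25)

            stub_gateway.charge.assert_called_once_with(25)
            self.assertEqual(result["status"], "placed")

    if __name__ == "__main__":
        unittest.main()

In a bottom-up strategy the roles are reversed: the real lower-level component would be tested first, and a driver would stand in for the higher-level one.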
System testing:
- To verify that the system components perform control functions.
- To perform inter-system tests.
- To demonstrate that the system performs both functionally and operationally as specified.
- To perform the appropriate types of tests relating to transaction flow, installation, reliability, regression etc.

Acceptance testing:
To check whether the software meets the customer's requirements or not.

Beta testing:
Testing the software in the customer's environment.

Regression testing:
Testing done on the software after making changes or enhancements.

V. TEST PLAN

Purpose of preparing a Test Plan:
- Validate the acceptability of a software product.
- Help people outside the test group to understand the 'why' and 'how' of product validation.
- A Test Plan should be thorough enough to give overall coverage of the tests to be conducted.

Scope:
- The areas to be tested by the QA team.
- Specify the areas which are out of scope (screens, mainframe processes etc.).

Test Approach:
- Details on how the testing is to be performed.
- Any specific strategy to be followed for testing.

VI. CONCLUSION
Testing can show the presence of faults in a system; it cannot prove that there are no remaining faults. Component developers are responsible for component testing; system testing is the responsibility of a separate team. Integration testing is testing increments of the system; release testing involves testing a system to be released to a customer. Use experience and guidelines to design test cases in defect testing. Interface testing is designed to discover defects in the interfaces of composite components. Equivalence partitioning is a way of discovering test cases: all cases in a partition should behave in the same way (a small sketch follows). Structural testing relies on analyzing a program and deriving tests from this analysis. Test automation reduces testing costs by supporting the test process with a range of software tools.
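As a hypothetical illustration of equivalence partitioning (our own sketch, not part of the paper), the input domain of a function is divided into partitions whose members should behave the same way, and one representative test case is chosen from each partition, plus the boundaries between them:

    import unittest

    def classify_age(age):
        """Spec: 0-12 child, 13-19 teenager, 20 and above adult; negative ages invalid."""
        if age < 0:
            raise ValueError("age cannot be negative")
        if age <= 12:
            return "child"
        if age <= 19:
            return "teenager"
        return "adult"

    class EquivalencePartitionTests(unittest.TestCase):
        # One representative value per partition; all members of a partition
        # are expected to behave the same way, so one test each is enough.
        def test_invalid_partition(self):
            with self.assertRaises(ValueError):
                classify_age(-3)

        def test_child_partition(self):
            self.assertEqual(classify_age(7), "child")

        def test_teenager_partition(self):
            self.assertEqual(classify_age(16), "teenager")

        def test_adult_partition(self):
            self.assertEqual(classify_age(35), "adult")

        def test_partition_boundaries(self):
            # Boundaries between partitions are the most error-prone inputs.
            self.assertEqual(classify_age(12), "child")
            self.assertEqual(classify_age(13), "teenager")

    if __name__ == "__main__":
        unittest.main()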
