Load Testing, Performance Testing, Volume Testing, and Stress Testing

Total pages: 16

File type: PDF, size: 1020 KB

STRESS, LOAD, VOLUME, PERFORMANCE, BENCHMARK AND BASELINE TESTING TOOL EVALUATION AND COMPARISON

Cordell Vail
Copyright 2005 by Cordell Vail - All rights reserved
www.vcaa.com

NOTE: The information contained in this document or on the handout CD at the seminars is for use only by participants who attend one of our seminars. Distribution of this information to anyone other than seminar attendees is not authorized by the authors; it is for the educational purposes of the attendees only. By gathering this information from vendor web pages and from other testers' evaluations, our intention is to give you a tool for doing your own evaluation of the different tool features, so that you can find the tools that best meet your needs. The authors are not recommending any of these tools as better than another. All the information has been taken from reviews we found on the Internet, from the vendors' own web pages, or from correspondence with the vendors, and the tools have been grouped according to our presentation outline. For some of the tools we were not able to determine the cost or type (i.e., open source) from the vendor's web page or correspondence. Users are cautioned that they install any downloaded tools at their own risk.

TABLE OF CONTENTS

Key:
  EVALUATION SECTION .... page
  VENDOR (and foreign country if known)
  Tool Name [type and price if known]

TOOLS WORTH LOOKING AT (the listing by price does not imply preference) .... 12
  MICROSOFT .... 12
    Web App Stress Tool (WAS) [free] .... 12
  APPPERFECT .... 37
    LoadTester [free] .... 37
  PUSHTOTEST .... 40
    Performance Kit [free] .... 40
  PILOT SOFTWARE LTD (Turkey) .... 43
    SiteTester [$29.00] .... 43
  SOFTLOGICA LLC (Russia) .... 47
    WAPT [$299] .... 47
  LOADTESTING.COM .... 49
    Portent Supreme [$279.95] .... 49
  PAESSLER (Germany) .... 53
    Webserver Stress Tool 7 [$624.95] .... 53
  SIS – Florida Tech .... 60
    Holodeck [$1,500] .... 60

MAJOR TOOLS (in the 93% of market share) .... 67
  MERCURY INTERACTIVE .... 67
    LoadRunner [over $5,000] 54% .... 67
  IBM RATIONAL .... 74
    Performance Tester [over $5,000] 11% .... 74
  COMPUWARE .... 75
    QALoad [over $5,000] 9% .... 75
  SEGUE .... 78
    SilkPerformer [over $5,000] 7% .... 78
  EMPIRIX .... 85
    e-TEST suite [over $5,000] 6% .... 85
  RADVIEW .... 88
    WebLoad [over $5,000] 3% .... 88

OTHER LEADING VENDORS .... 91
  MICROSOFT .... 91
    Application Center Test [$2,999] .... 91
  REDGATE SOFTWARE .... 92
    ANTS Load Professional Edition [$2,490] .... 92
  OPENDEMAND .... 94
    OpenLoad [$2,195] .... 94
  TEVRON .... 95
    CitraTest [over $5,000] .... 95
  HOSTEDLABS .... 98
    HostedToolBox [$20 per month] .... 98
  AUTOMATEDQA .... 101
    AQtime [$599]
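Before the vendor-by-vendor detail, it may help to see the core loop that every tool in this list automates: many concurrent virtual users issuing requests while response times are recorded. The sketch below is a minimal illustration of that loop, not any listed tool; the target URL, user count, and request count are placeholder assumptions.

```python
# Minimal load-test loop: N concurrent "virtual users" hitting one URL,
# reporting throughput and latency percentiles. Illustrative only; the
# target URL and user/request counts are placeholder assumptions.
import statistics
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor

URL = "http://localhost:8080/"   # hypothetical system under test
VIRTUAL_USERS = 20
REQUESTS_PER_USER = 50

def virtual_user() -> list[float]:
    latencies = []
    for _ in range(REQUESTS_PER_USER):
        start = time.perf_counter()
        try:
            urllib.request.urlopen(URL, timeout=10).read()
        except OSError:
            continue  # only successful requests count toward latency
        latencies.append(time.perf_counter() - start)
    return latencies

if __name__ == "__main__":
    wall_start = time.perf_counter()
    with ThreadPoolExecutor(max_workers=VIRTUAL_USERS) as pool:
        results = list(pool.map(lambda _: virtual_user(), range(VIRTUAL_USERS)))
    elapsed = time.perf_counter() - wall_start
    all_latencies = sorted(l for user in results for l in user)
    if all_latencies:
        print(f"requests: {len(all_latencies)} in {elapsed:.1f}s "
              f"({len(all_latencies) / elapsed:.1f} req/s)")
        print(f"median latency: {statistics.median(all_latencies) * 1000:.1f} ms")
        print(f"p95 latency: {all_latencies[int(0.95 * len(all_latencies))] * 1000:.1f} ms")
```

A commercial tool adds scripting, ramp-up schedules, and reporting on top of exactly this loop, which is a useful lens when comparing the feature lists that follow.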
Recommended publications
  • Hardware & Software Standards
    Hardware & Software Standards (last revised January 2018)

    Introduction. This document identifies the current City of Chicago standards for its hardware and software environments, and is intended primarily for City department and vendor use. These standards do not mean that other software and hardware previously listed as standard may not be used or supported, but the following items should be purchased for any new initiative or growth/replacement need. Proposals for non-standard hardware or software purchases, and any questions or comments, should be forwarded to the Department of Innovation and Technology (DoIT) Enterprise Architecture Board for review, and will need to be approved via the Technology Purchase Review and Approval (TPRA) process. Standards denoted with an asterisk (*) are currently under review.

    Platform Standards (Operating System / Hardware Platform):
    - Solaris 10 (Unix) (Oracle) / Sun Microsystems
    - RedHat Linux Enterprise Server 6.x, 7.x / Dell
    - RedHat Linux Enterprise Server 7.x (PCI Services) / Dell
    - VMWare VSphere 6.5U1 / Dell
    - Windows 2012 R2 & 2016 (Standard and Enterprise) / Dell
    - Windows 7, Windows 10 / Dell, Panasonic

    Enterprise Services:
    - Database (Windows 2008 Server and all other platforms): Oracle Enterprise 11gR2, 12cR1; Postgres 9.x or 10.x (EnterpriseDB or community)
    - Print: n/a
    - File: n/a
    - Email: Exchange 2016 / Office365

    Desktops, Laptops, & Tablets (Type / Model):
    - Standard Users: Dell OptiPlex 5050 SFF, Dell OptiPlex 7450 All-In-One
    - Mobile User: Latitude 12 Rugged Extreme; Latitude 14 Rugged 5414; Latitude 12 2-in-1 with case and dock; Latitude 5480 14" laptop (6th gen processor)
    - High-End Workstation: Dell Precision T5810
    - Laptop Accessories: docking (for the E-5470 units): Dell Business Dock WD15 with 130W adapter; monitor: Dell 23 Monitor P2317H

    Printing and Scanning. The Department of Fleet and Facility Management (2FM) oversees print services for the City of Chicago.
  • IT CLASSIFICATION TECHNOLOGY LIST – ISSUE DATE: April 18, 2017
    IT CLASSIFICATION TECHNOLOGY LIST – ISSUE DATE: April 18, 2017

    Technology definition: a set of knowledge, skills and/or abilities, taking a significant time (e.g., 6 months) to learn, and applicable to the defined classification specification assigned. The example tools are for illustration purposes only and are not meant to constitute a full or comprehensive list.

    Classification: DBA. Discipline: Relational Database.
    - DB2: the relational database management system provided by IBM that runs on Unix, Linux, Windows and z/OS platforms, including DB2 Connect and related tools. Example tools: Omegamon, IBM Admin Tools, Log Analyzer, DB2 Compare, Nsynch, TSM, Universal Command, SQL.
    - SQL Server: the relational database management system and related tools provided by Microsoft Corp. Example tools: SQL Server Mgmt. Studio, Red Gate, Vantage, Tivoli, Snap Manager.
    - ORACLE: the relational database management system and related tools provided by Oracle Corp. Example tools: Toad, Enterprise Manager, SQL.
    - SYBASE (ASE): the relational database management system and related tools provided by Sybase.
    - Supra 2.X: Cincom SUPRA SQL, Cincom's relational database management system, provides access to data in open and proprietary environments through industry-standard SQL for standalone and client/server application solutions.
    - Open Source: open source database management systems such as MySQL. Example tools: Phpadmin, mysqladmin, Vertica.

    Classification: DBA. Discipline: Hierarchical Database.
    - IMS: the hierarchical database management system provided by IBM that runs on the z/OS mainframe platform, including related tools. Example tools: BMC IMS Utilities, Strobe, Omegamon.
    - Supra 1: Cincom SUPRA PDM, Cincom's networked, hierarchical database management system, provides access to data through a Physical Data Manager (PDM) that manages the data structures of the physical files that store the data.
  • Automated Web Application Testing Using Search Based Software Engineering
    Automated Web Application Testing Using Search Based Software Engineering
    Nadia Alshahwan and Mark Harman, CREST Centre, University College London, London, UK

    Abstract—This paper introduces three related algorithms and a tool, SWAT, for automated web application testing using Search Based Software Testing (SBST). The algorithms significantly enhance the efficiency and effectiveness of traditional search based techniques by exploiting both static and dynamic analysis. The combined approach yields a 54% increase in branch coverage and a 30% reduction in test effort. Each improvement is separately evaluated in an empirical study on 6 real world web applications.

    Index Terms—SBSE; automated test data generation; web applications

    I. INTRODUCTION. The importance of automated web application testing derives from the increasing reliance on these systems for business, social, organizational and governmental functions [21]. However, of 399 research papers on SBST, only one [20] mentions web application testing issues and none applies search based test data generation to automate web application testing. Popular web development languages such as PHP and Python have characteristics that pose a challenge when applying search based techniques, such as dynamic typing and identifying the input vector. Moreover, the unique and rich nature of a web application's output can be exploited to aid the test generation process and potentially improve effectiveness and efficiency. This was the motivation for our work: we seek to develop a search based approach to automated web application testing that overcomes these challenges and takes advantage of the opportunities that web applications offer.
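    The paper's central idea, search based test data generation, is easiest to see on a toy example: treat "how close is this input to taking the target branch" as a numeric branch distance and let a local search minimize it. The sketch below is a generic illustration of that idea, not SWAT itself; the function under test, its target branch, and the search parameters are all invented for illustration.

    ```python
    # Toy search-based test data generation: hill-climb an integer input
    # until it satisfies the target branch of the function under test.
    # Real SBST tools instrument the program to measure branch distance.
    import random

    def function_under_test(x: int) -> str:
        if x * x - 12 * x + 35 == 0:   # target branch: x == 5 or x == 7
            return "target branch"
        return "other branch"

    def branch_distance(x: int) -> int:
        # Distance to making the predicate true; 0 means the branch is covered.
        return abs(x * x - 12 * x + 35)

    def hill_climb(max_restarts: int = 100) -> int | None:
        for _ in range(max_restarts):
            x = random.randint(-1000, 1000)
            improved = True
            while improved:
                improved = False
                for neighbour in (x - 1, x + 1):
                    if branch_distance(neighbour) < branch_distance(x):
                        x, improved = neighbour, True
                if branch_distance(x) == 0:
                    return x   # this input covers the target branch
        return None  # search budget exhausted

    if __name__ == "__main__":
        x = hill_climb()
        print(f"input {x} covers: {function_under_test(x)}")
    ```

    The paper's contribution is in making this kind of search efficient for web code, where dynamic typing and the input vector itself complicate the fitness computation.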
  • WebLOAD IDE User's Guide
    WebLOAD IDE User's Guide, Version 10.3

    The software supplied with this document is the property of RadView Software and is furnished under a licensing agreement. Neither the software nor this document may be copied or transferred by any means, electronic or mechanical, except as provided in the licensing agreement. The information in this document is subject to change without prior notice and does not represent a commitment by RadView Software or its representatives.

    © Copyright 2015 by RadView Software. All rights reserved. August 2015, RadView Publication Number WL-OSSPRO-0913-IUG10.

    WebLOAD, TestTalk, Authoring Tools, ADL, AppletLoad, and WebExam are trademarks or registered trademarks of RadView Software. IBM and OS/2 are trademarks of International Business Machines Corporation. Microsoft Windows, Microsoft Windows 95, Microsoft Windows NT, Microsoft Word for Windows, Microsoft Internet Explorer, Microsoft Excel for Windows, Microsoft Access for Windows and Microsoft Access Runtime are trademarks or registered trademarks of Microsoft Corporation. SPIDERSESSION is a trademark of NetDynamics. UNIX is a registered trademark of AT&T Bell Laboratories. Solaris, Java and Java-based marks are registered trademarks of Sun Microsystems, Inc. HP-UX is a registered trademark of Hewlett-Packard. SPARC is a registered trademark of SPARC International, Inc. Netscape Navigator and LiveConnect are registered trademarks of Netscape Communications Corporation. Any other trademark name appearing in this book is used for editorial purposes only and to the benefit of the trademark owner, with no intention of infringing upon that trademark.

    For product assistance or information, contact RadView Software: toll free in the US 1-888-RadView, fax +1-908-864-8099, World Wide Web www.RadView.com.
  • Web GUI Testing Checklist
    Planned testing techniques include scripted testing, exploratory testing, and user experience testing. Test all input fields for special characters; for instance, create test data with the maximum and minimum values allowed in each data field. Re-check the GUI whenever the CSS or the dynamic values behind it change, and decide deliberately how much regression testing is enough. Weak GUI testing procedures are a common cause of delay in agile projects, so automate the checklist items that repeat. Monkey testing tools can exercise an application with random input events; fully automated attack simulations and highly automated fuzzing tests are also appropriate here, and testers might additionally use domain testing to pursue intuitions.
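    As a concrete illustration of the checklist item "test all input fields for special characters", the sketch below drives a browser form with hostile and boundary payloads using Selenium. The URL, the field name `q`, and the failure heuristics are placeholder assumptions, not part of the original checklist.

    ```python
    # Checklist item automated: submit special-character and boundary
    # payloads to an input field and flag responses that look broken.
    # URL and field name are placeholder assumptions.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    PAYLOADS = ["<script>alert(1)</script>", "' OR '1'='1", "日本語テキスト",
                "x" * 10000, ""]  # XSS, SQL, Unicode, max length, empty

    driver = webdriver.Chrome()
    try:
        for payload in PAYLOADS:
            driver.get("http://localhost:8080/form")   # hypothetical page
            field = driver.find_element(By.NAME, "q")  # hypothetical field
            field.clear()
            field.send_keys(payload)
            field.submit()
            # A raw stack trace or a reflected script tag suggests a failure.
            page = driver.page_source.lower()
            assert "traceback" not in page and "<script>alert" not in page, payload
    finally:
        driver.quit()
    ```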
  • Model Driven Scheduling for Virtualized Workloads
    Model Driven Scheduling for Virtualized Workloads

    Doctoral thesis submitted on 28 June 2013 to obtain the degree of Doctor of Science: Computer Science, at the Faculty of Sciences, University of Antwerp. Supervisors: prof. dr. Jan Broeckhove and dr. Kurt Vanmechelen. Sam Verboven, Research Group Computational Modeling and Programming.

    Acknowledgments. Obtaining a doctorate is a task that cannot be brought to a good end without help. Fortunately, over the past years I have had the opportunity to work with many stimulating colleagues. Each of them has contributed to my professional and personal development, and their patient help and support were essential in overcoming the many challenges that come with a doctorate. I would therefore like to write a few words of thanks here. First of all, I would like to thank prof. dr. Jan Broeckhove and em. prof. dr. Frans Arickx for giving me the chance to start a varied and fascinating academic career. At the beginning of my doctorate, Frans not only helped me become a capable researcher; he also always stood by me with advice and assistance in my teaching. After Frans became emeritus, Jan took over this supervision and made sure that I could bring the work I had begun to a successful end. Both always gave me great freedom in my search to identify and answer interesting research questions. Next, I would like to thank dr. Peter Hellinckx and dr. Kurt Vanmechelen for their personal and often intensive guidance. Even before the start of my academic career, while I was writing my Master's thesis, Peter prepared me for a smooth start as a researcher.
  • Enterprise Development with Flex
    Enterprise Development with Flex
    Yakov Fain, Victor Rasputnis, and Anatole Tartakovsky
    Beijing • Cambridge • Farnham • Köln • Sebastopol • Taipei • Tokyo

    Enterprise Development with Flex, by Yakov Fain, Victor Rasputnis, and Anatole Tartakovsky. Copyright © 2010 Yakov Fain, Victor Rasputnis, and Anatole Tartakovsky. All rights reserved. Printed in the United States of America. Published by O'Reilly Media, Inc., 1005 Gravenstein Highway North, Sebastopol, CA 95472. O'Reilly books may be purchased for educational, business, or sales promotional use. Online editions are also available for most titles (http://my.safaribooksonline.com). For more information, contact our corporate/institutional sales department: (800) 998-9938 or [email protected].

    Editor: Mary E. Treseler. Development Editor: Linda Laflamme. Production Editor: Adam Zaremba. Copyeditor: Nancy Kotary. Proofreader: Sada Preisch. Indexer: Ellen Troutman. Cover Designer: Karen Montgomery. Interior Designer: David Futato. Illustrator: Robert Romano.

    Printing History: March 2010, First Edition. Nutshell Handbook, the Nutshell Handbook logo, and the O'Reilly logo are registered trademarks of O'Reilly Media, Inc. Enterprise Development with Flex, the image of red-crested wood-quails, and related trade dress are trademarks of O'Reilly Media, Inc. Many of the designations used by manufacturers and sellers to distinguish their products are claimed as trademarks. Where those designations appear in this book, and O'Reilly Media, Inc. was aware of a trademark claim, the designations have been printed in caps or initial caps. While every precaution has been taken in the preparation of this book, the publisher and authors assume no responsibility for errors or omissions, or for damages resulting from the use of the information contained herein.
  • Towards Better Performance Per Watt in Virtual Environments on Asymmetric Single-ISA Multi-Core Systems
    Towards Better Performance Per Watt in Virtual Environments on Asymmetric Single-ISA Multi-core Systems
    Viren Kumar and Alexandra Fedorova, Simon Fraser University, 8888 University Dr, Vancouver, Canada

    ABSTRACT. Single-ISA heterogeneous multicore architectures promise to deliver plenty of cores with varying complexity, speed and performance in the near future. Virtualization enables multiple operating systems to run concurrently as distinct, independent guest domains, thereby reducing core idle time and maximizing throughput. This paper seeks to identify a heuristic that can aid in intelligently scheduling these virtualized workloads to maximize performance while reducing power consumption. We propose that the controlling domain in a Virtual Machine Monitor or hypervisor is relatively insensitive to changes in core frequency, and thus scheduling it on a slower core saves power while only slightly affecting guest domain performance.

    ... performance per watt than homogeneous multicore processors. As power consumption in data centers becomes a growing concern [3], deploying ASISA multicore systems is an increasingly attractive opportunity. These systems perform at their best if application workloads are assigned to heterogeneous cores in consideration of their runtime properties [4][13][12][18][24][21]. Therefore, understanding how to schedule data-center workloads on ASISA systems is an important problem. This paper takes the first step towards understanding the properties of data center workloads that determine how they should be scheduled on ASISA multicore processors. Since virtual machine technology is a de facto standard for data centers, we study virtual machine (VM) workloads.
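    The heuristic the abstract proposes, putting the frequency-insensitive controlling domain on a slow core, can be sketched as a simple assignment rule. In the illustration below, the sensitivity figures, domain names, and core lists are invented placeholders for what a real scheduler would measure; this is a sketch of the idea, not the paper's implementation.

    ```python
    # Sketch of the scheduling idea: rank workloads by how much their
    # performance degrades on a slow core, and give the fast cores to the
    # most frequency-sensitive ones. Sensitivity values are made-up examples.
    from dataclasses import dataclass

    @dataclass
    class Domain:
        name: str
        sensitivity: float  # slowdown factor on a slow core (1.0 = none)

    domains = [
        Domain("control-domain", 1.05),  # nearly insensitive, per the paper's proposal
        Domain("guest-web", 1.60),
        Domain("guest-batch", 1.35),
    ]
    fast_cores, slow_cores = ["fast0"], ["slow0", "slow1"]

    # Most sensitive domains first; they claim the fast cores.
    ranked = sorted(domains, key=lambda d: d.sensitivity, reverse=True)
    assignment = {}
    for d in ranked:
        pool = fast_cores if fast_cores else slow_cores
        assignment[d.name] = pool.pop(0)

    print(assignment)
    # {'guest-web': 'fast0', 'guest-batch': 'slow0', 'control-domain': 'slow1'}
    ```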
  • A Systematic Review on Regression Testing for Web-Based Applications
    Journal of Software

    A Systematic Review on Regression Testing for Web-Based Applications
    Anis Zarrad*, Department of Computer Science and Information Systems, Prince Sultan University, Riyadh, Saudi Arabia. *Corresponding author. Tel.: +966114948531.

    Manuscript submitted December 12, 2014; accepted April 16, 2015. doi: 10.17706/jsw.10.8.971-990

    Abstract: Web-based applications and their underlying parts are growing rapidly to offer services over the Internet around the world. Web applications play critical roles in areas such as community development, business, and academia, so their correctness and reliability become more important with each succeeding release. Regression testing is an important means of ensuring these qualities for every newly released version: it is a verification process that discovers the impact of changes on other interconnected modules. With the goal of selecting an appropriate regression testing approach that responds adequately to changes in Web applications, we conducted a complete systematic study of regression testing techniques for Web applications. Out of 64 papers, we identified a total of 22 reporting experiments and case studies. We present a qualitative analysis of the tools used, an overview of test case selection techniques, and an evaluation of the empirical study in every selected work. No approach was found to be clearly superior to the others, since results depend on many varying factors and on the deployment environment. We identified the need for evidence that approaches are cost-effective, rather than technical descriptions alone.

    Key words: Regression testing, web-based application testing, empirical study, test set, software testing.

    1. Introduction. In today's scenario, as the world has become global with the advent of Internet technologies, Web-based applications have become an effective way for enterprises and academic entities to implement business strategies and policies.
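    To make the surveyed topic concrete, here is a minimal sketch of one common family the review covers, change-based test case selection: rerun only the tests whose covered modules intersect the changed files. The coverage map and changed-file set are illustrative placeholders; real techniques derive them from coverage or dependency analysis.

    ```python
    # Minimal regression test selection: keep a map from each test to the
    # modules it exercises, and select tests touching any changed module.
    # The coverage map and changed set are illustrative placeholders.
    COVERAGE = {
        "test_login": {"auth.py", "session.py"},
        "test_search": {"search.py", "index.py"},
        "test_checkout": {"cart.py", "payment.py", "session.py"},
    }

    def select_tests(changed_files: set[str]) -> list[str]:
        return sorted(test for test, modules in COVERAGE.items()
                      if modules & changed_files)

    print(select_tests({"session.py"}))  # ['test_checkout', 'test_login']
    ```

    The review's finding that no approach is clearly superior follows naturally: the payoff of a scheme like this depends entirely on how precise the coverage map is and how cheap it is to keep up to date.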
  • Big Data Benchmarking Workshop Publications
    Benchmarking Datacenter and Big Data Systems
    Wanling Gao, Zhen Jia, Lei Wang, Yuqing Zhu, Chunjie Luo, Yingjie Shi, Yongqiang He, Shiming Gong, Xiaona Li, Shujie Zhang, Bizhu Qiu, Lixin Zhang, Jianfeng Zhan
    Institute of Computing Technology
    http://prof.ict.ac.cn/ICTBench

    Acknowledgements. This work is supported by the Chinese 973 project (Grant No. 2011CB302502), the Hi-Tech Research and Development (863) Program of China (Grant No. 2011AA01A203, No. 2013AA01A213), the NSFC project (Grant No. 60933003, No. 61202075), the BNSF project (Grant No. 4133081), and Huawei funding.

    Big Data Benchmarking Workshop publications:
    - BigDataBench: a Big Data Benchmark Suite from Web Search Engines. Wanling Gao, et al. The Third Workshop on Architectures and Systems for Big Data (ASBD 2013), in conjunction with ISCA 2013.
    - Characterizing Data Analysis Workloads in Data Centers. Zhen Jia, et al. 2013 IEEE International Symposium on Workload Characterization (IISWC 2013).
    - Characterizing OS Behavior of Scale-out Data Center Workloads. Chen Zheng, et al. Seventh Annual Workshop on the Interaction amongst Virtualization, Operating Systems and Computer Architecture (WIVOSCA 2013), in conjunction with ISCA 2013.
    - Characterization of Real Workloads of Web Search Engines. Huafeng Xi, et al. 2011 IEEE International Symposium on Workload Characterization (IISWC 2011).
    - The Implications of Diverse Applications and Scalable Data Sets in Benchmarking Big Data Systems. Zhen Jia, et al. Second Workshop on Big Data Benchmarking (WBDB 2012 India) & Lecture Note in Computer
  • An Experimental Evaluation of Datacenter Workloads on Low-Power Embedded Micro Servers
    An Experimental Evaluation of Datacenter Workloads on Low-Power Embedded Micro Servers
    Yiran Zhao, Shen Li, Shaohan Hu, Hongwei Wang, Shuochao Yao, Huajie Shao, Tarek Abdelzaher
    Department of Computer Science, University of Illinois at Urbana-Champaign

    ABSTRACT. This paper presents a comprehensive evaluation of an ultra-low power cluster, built upon the Intel Edison based micro servers. The improved performance and high energy efficiency of micro servers have driven both academia and industry to explore the possibility of replacing conventional brawny servers with a larger swarm of embedded micro servers. Existing attempts mostly focus on mobile-class micro servers, whose capacities are similar to mobile phones. We, on the other hand, target sensor-class micro servers, which are originally intended for use in wearable technologies, sensor networks, and the Internet of Things. Although sensor-class micro servers have much less capacity, they are ...

    To reduce datacenter energy cost, power proportionality [47] is one major solution studied and pursued by both academia and industry. Ideally, it allows datacenter power draw to proportionally follow the fluctuating amount of workload, thus saving energy during non-peak hours. However, current high-end servers are not energy-proportional and have a narrow power spectrum between idling and full utilization [43], which is far from ideal. Therefore, researchers try to improve energy proportionality using solutions such as dynamic provisioning and CPU power scaling. The former relies on techniques such as Wake-On-LAN [36] and VM migration [24] to power servers on and off remotely and dynamically.
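    Power proportionality, as used in this excerpt, has a simple arithmetic reading: an ideal server's power draw tracks utilization linearly from zero watts, while a real server draws substantial power at idle. The sketch below computes that gap from idle and peak figures that are placeholder assumptions, not measurements from the paper.

    ```python
    # Energy-proportionality illustration: compare a real server's power
    # curve (high idle draw, narrow dynamic range) against the ideal
    # proportional curve. Idle/peak watts are placeholder numbers.
    IDLE_W, PEAK_W = 180.0, 300.0   # hypothetical high-end server

    def real_power(utilization: float) -> float:
        # Linear interpolation between idle and peak draw.
        return IDLE_W + (PEAK_W - IDLE_W) * utilization

    def ideal_power(utilization: float) -> float:
        return PEAK_W * utilization  # perfectly proportional server

    for u in (0.0, 0.25, 0.5, 1.0):
        print(f"util {u:4.0%}: real {real_power(u):5.1f} W, "
              f"ideal {ideal_power(u):5.1f} W")
    # At 25% utilization the real server already draws 210 W (70% of peak),
    # which is the narrow power spectrum the excerpt describes.
    ```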
  • Getting Started with TestComplete 7
    * Windows and the Windows logo are trademarks of the Microsoft group of companies.

    Table of Contents
    INTRODUCING AUTOMATED TESTING AND TESTCOMPLETE .... 3
      Automated Testing .... 3
      Test Types .... 3
      TestComplete Projects and Project Items .... 4
      TestComplete User Interface .... 5
      TestComplete Test Object Model .... 6
      Checkpoints and Stores .... 8
    CREATING YOUR FIRST TEST .... 9
      1. Creating a Test Project .... 10
      2. Defining Applications to Test .... 11
      3. Planning
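    The "Checkpoints and Stores" chapter heading refers to comparing run-time values against stored baselines. The sketch below shows only that bare concept in Python; the store contents and names are invented, and this is not TestComplete's own scripting API.

    ```python
    # Concept behind a checkpoint: baseline values are stored with the test
    # project; at run time a captured value is compared against the store.
    # Names and values are illustrative, not TestComplete's scripting API.
    STORE = {"login_button.caption": "Log In", "window.title": "Orders"}

    def checkpoint(name: str, actual: str) -> bool:
        expected = STORE.get(name)
        if actual == expected:
            print(f"checkpoint {name}: passed")
            return True
        print(f"checkpoint {name}: FAILED, expected {expected!r}, got {actual!r}")
        return False

    checkpoint("login_button.caption", "Log In")   # passes
    checkpoint("window.title", "Order History")    # fails, both values reported
    ```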