Load Testing with the Cloud

A Neotys Whitepaper

Table of Contents

Introduction
Benefits of Load Testing in the Cloud
Load Testing Inside vs. Outside the Firewall
Best Practices
How to Choose a Cloud Testing Solution
Getting Started with Load Testing in the Cloud
Conclusion
About Neotys

Introduction

Businesses worldwide rely on the cloud to keep pace with the speed of innovation. As organizations expand, they need to support growth with computing architecture able to scale as needed, and they look to the public cloud to address this need. In the age of the Internet of Things, most enterprise apps are delivered and consumed as a service. Application performance is now a competitive differentiator that determines whether users will adopt an app at all, based on the experience it delivers.

In October 2017, Gartner forecasted that worldwide public cloud services revenue would total $260 billion in 2017 and estimated that it would reach $411 billion by 2020. Q3 2017 data from Synergy Research Group indicated that the cloud market continues to grow by over 40% per year and that AWS's domination of the Infrastructure-as-a-Service market continues unabated. While Microsoft, Google, and Alibaba are all gaining market share and growing their revenues much more rapidly than Amazon, Amazon remains bigger than its next five largest competitors combined.

Amazon Web Services (AWS) hosts applications for every imaginable vertical business segment. These applications fulfill a plethora of needs: they allow end users to access documents and applications remotely, from anywhere and from any device, and they help IT managers rapidly adapt and scale their infrastructures on demand via virtualization and a pay-per-use model. The ability to harness scalability on demand through cloud Infrastructure-as-a-Service and Platform-as-a-Service offerings gives enterprises the business and operational agility required to meet ever-changing market demands at the speed of business. But how does testing in the cloud impact the work of load testing engineers?
To reap the benefits of the cloud, testers need a load testing solution that matches their deployment model: a hosted SaaS offering, a solution that deploys into a public cloud such as AWS, one that uses a hosting service such as Rackspace to set up a private cloud, or one that supports testing in a private cloud hosted on-premises. How can testers seize the opportunities afforded by the cloud to better test the performance of their web applications?

This white paper describes how the cloud has revolutionized load testing and the advantages it provides for ensuring that web applications perform well in production and meet user experience thresholds. It also covers the limitations of a test strategy that relies solely on cloud-based testing, highlighting the need for a complementary internal load testing solution. After describing several best practices for load testing in the cloud, it outlines key capabilities to look for in a load testing solution. Without the right tools in place, merely moving testing activities to the cloud is unlikely to deliver the results necessary to justify the move. Understanding how to apply the right tools and practices to make the most of the cloud is fundamental to cloud-based testing and vital to ultimately going live with total peace of mind.

Benefits of Load Testing in the Cloud

Load testing in the cloud is beneficial for several reasons. First, by leveraging the cloud, enterprises decrease infrastructure spend by eliminating the substantial up-front costs of purchasing and maintaining test environments that may be used only infrequently. With a pay-as-you-go model, cloud vendors give companies access to the infrastructure they need, when they need it, for as long as they need it. From a business standpoint, the cloud lowers total cost of ownership while increasing flexibility.

Second, load testing in the cloud eliminates the time-consuming setup and configuration of complex testing environments. Testers can create and configure the machine image they need once, then replicate it in the cloud as many times as required. Often, the cloud testing provider automates this process, saving even more time.

Third, the on-demand nature of the cloud supports distributed development teams and promotes the collaboration that ensures performance testing is supported and understood throughout the organization before "go live" decisions are approved.

Lastly, load testing in the cloud enables testing teams to conduct more efficient and realistic large-scale tests. Robust, comprehensive cloud-enabled testing solutions support testing of both unencrypted HTTP and encrypted HTTPS traffic, as well as applications deployed in Docker containers, so that complete and efficient testing is possible throughout the application testing process.

Perform Testing at Scale

Today's web applications can experience rapid and sporadic surges in traffic driven by a variety of factors: the launch of new products, the start of sales campaigns, periodic seasonal discounting, or even a voucher that erroneously applies a discount across multiple product lines rather than only the intended one. Whenever the application cannot handle the increased load, the business risks lost revenue, customer service spikes, and damage to its brand and reputation. Generating the load for large-scale tests that mimic these unanticipated spikes in production traffic, however, typically requires tens or even hundreds of machines.
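Because the generators are launched from the machine image described above, they can be provisioned programmatically in minutes. The following is a minimal sketch using Python and boto3 (the AWS SDK), assuming a load-generator image has already been built; the AMI ID, instance type, key pair name, and fleet size are hypothetical placeholders rather than recommendations.

# provision_generators.py - launch a fleet of load generators from a prebuilt image.
import boto3

ec2 = boto3.resource("ec2", region_name="us-east-1")

# One API call requests the whole fleet; because the cloud bills per use,
# the fleet can be terminated as soon as the test run is finished.
instances = ec2.create_instances(
    ImageId="ami-0123456789abcdef0",   # hypothetical prebuilt load-generator AMI
    InstanceType="c5.xlarge",          # hypothetical sizing for load generation
    MinCount=50,
    MaxCount=50,
    KeyName="loadtest-keypair",        # hypothetical key pair
    TagSpecifications=[{
        "ResourceType": "instance",
        "Tags": [{"Key": "role", "Value": "load-generator"}],
    }],
)

# Block until every generator is reachable before the test controller starts.
for instance in instances:
    instance.wait_until_running()
print(f"{len(instances)} load generators running")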
Purchasing and configuring that many machines outright requires a significant investment of time and money, an untenable option in today's business environment. By using the cloud, as the sketch above illustrates, enterprises can rapidly set up as many load generating machines as required, on demand.

Perform More Realistic Tests

When testing a web application using infrastructure inside the firewall, testers cannot verify the entire delivery chain. Unless all end users will also be inside the firewall, such tests are inherently limited and may fail to reveal all performance issues.

With the cloud, testers can execute load tests that access web applications as users will, from outside the firewall, and validate every component of the delivery chain, including the firewall, DNS, network equipment, and the ISP. Validating the delivery chain makes these tests more realistic. Further, they enable testers to evaluate the real-world effects of third-party components such as content delivery networks, analytics servers, and ad servers. Because users won't all access the app from the same fixed location across the same network, a realistic load test cannot be run from a single location. That is why it is important to test the application and its components from different locations and geographic regions and to assess performance as network bandwidth and latency change (a minimal multi-region sketch appears below).

Load Testing Inside vs. Outside the Firewall

Although the advantages of load testing in the cloud are clear, internal testing still has its place in the overall test plan, particularly when testing from outside the firewall is not feasible. Internal testing helps isolate effects originating in the application and its infrastructure by distinguishing them from effects that arise outside the firewall, potentially beyond the tester's control.

Testing Inside the Firewall

Some testing use cases preclude the use of the cloud. For example, an enterprise web application not designed to be accessed from the Internet will not benefit from load testing in the cloud. Likewise, a pre-production environment not set up for Internet access is best tested using an internal testing solution applied within the firewall. By using the same load testing solution for both internal testing and testing in the cloud, scripts developed for internal pre-production testing can be reused in production cloud-based testing. Using different tools for internal and cloud testing would not only require rewriting the scripts; it would also increase licensing and training costs.

Conducting Small Scale Tests

Not all load testing requires hundreds of load generators. Even applications that anticipate many thousands of concurrent
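To illustrate the geographically distributed testing described under "Perform More Realistic Tests", the provisioning sketch above can be extended to launch a handful of generators in each region. Again a minimal sketch: the region list and per-region AMI IDs are hypothetical, and because AWS machine images are region-specific, the load-generator image must first be copied into each target region.

# multi_region_generators.py - launch load generators in several geographies.
import boto3

# Hypothetical per-region copies of the load-generator image.
REGION_AMIS = {
    "us-east-1":      "ami-0123456789abcdef0",
    "eu-west-1":      "ami-0fedcba9876543210",
    "ap-southeast-2": "ami-00aa11bb22cc33dd4",
}

for region, ami_id in REGION_AMIS.items():
    ec2 = boto3.resource("ec2", region_name=region)
    # A few generators per region are enough to expose the latency and
    # bandwidth differences between geographies.
    instances = ec2.create_instances(
        ImageId=ami_id,
        InstanceType="c5.large",  # hypothetical sizing
        MinCount=5,
        MaxCount=5,
    )
    print(f"{region}: launched {len(instances)} generators")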