Security and Performance Testing for Improving Quality of Distributed Applications Working in Public-Private Network Infrastructures

Total pages: 16 · File type: PDF · Size: 1020 KB

GDANSK UNIVERSITY OF TECHNOLOGY
Faculty of Electronics, Telecommunications and Informatics

Marcin Adam Barylski

Performance and Security Testing for Improving Quality of Distributed Applications Working in Public/Private Network Environments

PhD Dissertation

Supervisor: prof. dr hab. inż. Henryk Krawczyk
Faculty of Electronics, Telecommunications and Informatics
Gdansk University of Technology

Gdansk, 2010

To my wife, Ewa

Acknowledgments

First of all, I would like to thank Professor Henryk Krawczyk, a professor at the Faculty of Electronics, Telecommunications and Informatics and President of Gdańsk University of Technology, for priceless motivation, valuable advice, and the unlimited time he devoted to me while this dissertation was being written.

This dissertation would never have been possible without the help of my family. First, I would like to thank my wife, Ewa, for her patience, her readiness to lend a helping hand, and her invaluable day-by-day support. My parents taught me how to discover the world around me; without their upbringing I would never have started my studies.

I would like to thank Mrs. Izabela Dziedzic for taking care of all formal and organizational matters.

The crucial field test of one of the most important deliverables of this dissertation, MA2QA, would not have been possible without the full support of Jerzy Proficz from TASK, who introduced me to the KASKADA test environment.
Finally, I would like to thank my employer, Intel Technology Poland, for enabling access to best-known methods.

Table of Contents

INDEX OF FIGURES ..... 7
INDEX OF TABLES ..... 10
GLOSSARY ..... 12
LIST OF ACRONYMS ..... 16

CHAPTER 1: INTRODUCTION ..... 21
1.1. Background of security and performance testing ..... 21
1.2. Introduction to distributed public-private network environments ..... 22
1.3. Goal and scope of the dissertation ..... 26
1.4. Claims of the dissertation ..... 28
1.5. Document structure ..... 28

CHAPTER 2: CHARACTERIZATION OF PRIVATE-PUBLIC IPSEC AND HTTPS APPLICATIONS ..... 30
2.1. IPSec-based distributed applications design ..... 30
2.1.1. Introduction to IPSec ..... 30
2.1.2. ESP security ..... 32
2.1.3. ESP performance ..... 42
2.1.4. IKEv2 security and performance ..... 44
2.2. HTTPS-based distributed applications design ..... 46
2.2.1. Introduction to HTTPS ..... 46
2.2.2. HTTPS security ..... 49
2.2.3. HTTPS performance ..... 52
2.3. Distributed applications working in IPSec/HTTPS environments ..... 54
2.3.1. Request/Response (R/R) solution ..... 54
2.3.2. Publish/Subscribe (P/S) solution ..... 55
2.3.3. Concept of a secure service processing continuous multimedia data ..... 56
2.3.4. Security and performance of continuous multimedia streams distribution ..... 62
2.4. Summary ..... 65

CHAPTER 3: SELECTION OF SECURITY AND PERFORMANCE TESTING PROCEDURES ..... 66
3.1. The gist of quality control ..... 66
3.2. Fundamentals of SW performance testing ..... 70
3.3. Network layer performance tests ..... 72
3.3.1. Network throughput testing ..... 73
3.3.2. Network latency testing ..... 84
3.4. Middleware layer performance tests ..... 87
3.4.1. DB performance tests ..... 90
3.4.2. WS performance tests ..... 92
3.4.3. Web performance tests ..... 93
3.5. Fundamentals of SW security ..... 97
3.6. SW security testing ..... 98
3.6.1. Scope of security testing ..... 98
3.6.2. Security attacks ..... 100
3.7. IPSec performance and security testing ..... 104
3.8. HTTPS performance and security testing ..... 107
3.9. Summary ..... 110

CHAPTER 4: PROPOSAL OF MA2QA APPROACH ..... 111
4.1. Application model ..... 111
4.1.1. Subject of analysis ..... 111
4.1.2. Design and implementation for performance and security ..... 112
4.2. Quality model ..... 114
4.2.1. Quality tree ..... 114
4.2.2. Scope of quality analysis ..... 115
4.2.3. Method for finding the correlations between the metrics ..... 116
4.3. Multidimensional Approach to Quality Analysis (MA2QA) ..... 116
4.3.1. MA2QA fundamentals ..... 116
4.3.2. MA2QA usage in iterative application development ..... 117
4.3.3. Compromising security and performance ..... 118
4.3.4. MA2QA quality vector ..... 121
4.3.5. Sample MA2QA score card ..... 126
4.3.6. Sample MA2QA evaluation ..... 128
4.4. Summary ..... 128

CHAPTER 5: EXPERIMENTS AND RESULTS ..... 130
5.1. Goal and plan of experiments ..... 130
5.2. Experiment 1 (EXP1): Endpoint authentication vs. user input confirmation latency ..... 132
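The contents above include latency-oriented performance tests (Section 3.3.2) and a latency experiment (Section 5.2). As a purely illustrative sketch, not taken from the dissertation itself, a minimal latency-measurement harness in Python might look like the following; the function name, sample count, and the stand-in workload are all assumptions for illustration:

```python
import statistics
import time


def measure_latency(operation, samples=100):
    """Time repeated calls to `operation` (hypothetical helper).

    Returns (mean, median, max) latency in seconds over `samples` runs.
    In a real test bed, `operation` would issue e.g. an HTTPS request
    or an IPSec-tunneled call; here any zero-argument callable works.
    """
    latencies = []
    for _ in range(samples):
        start = time.perf_counter()
        operation()
        latencies.append(time.perf_counter() - start)
    return (statistics.mean(latencies),
            statistics.median(latencies),
            max(latencies))


# Stand-in workload: a trivial in-process computation.
mean_s, median_s, max_s = measure_latency(lambda: sum(range(1000)), samples=20)
print(f"mean={mean_s:.6f}s median={median_s:.6f}s max={max_s:.6f}s")
```

Reporting median alongside mean and maximum matters in latency testing, since a few outliers (e.g. TLS handshakes or IKE negotiations) can dominate the mean.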