Automated Performance Assessment for Service-Oriented Middleware: a Case Study on BPEL Engines∗

Domenico Bianculli (Faculty of Informatics, University of Lugano, Lugano, Switzerland)
Walter Binder (Faculty of Informatics, University of Lugano, Lugano, Switzerland)
Mauro Luigi Drago (DEEP-SE group - DEI, Politecnico di Milano, Milano, Italy)

∗ This work was carried out while the author was an intern at Mission Critical Technologies, Inc., on site at NASA Ames Research Center.

Copyright is held by the International World Wide Web Conference Committee (IW3C2). Distribution of these papers is limited to classroom use, and personal use by others. WWW 2010, April 26–30, 2010, Raleigh, North Carolina, USA. ACM 978-1-60558-799-8/10/04.

ABSTRACT

Middleware for Web service compositions, such as BPEL engines, provides the execution environment for services as well as additional functionalities, such as monitoring and self-tuning. Given its role in service provisioning, it is very important to assess the performance of middleware in the context of a Service-oriented Architecture (SOA). This paper presents SOABench, a framework for the automatic generation and execution of testbeds for benchmarking middleware for composite Web services and for assessing the performance of existing SOA infrastructures. SOABench defines a testbed model characterized by the composite services to execute, the workload to generate, the deployment configuration to use, the performance metrics to gather, the data analyses to perform on them, and the reports to produce. We have validated SOABench by benchmarking the performance of different BPEL engines.

Categories and Subject Descriptors

D.2.5 [Software Engineering]: Testing and Debugging—testing tools; D.2.8 [Software Engineering]: Metrics—performance measures

General Terms

Experimentation, Measurements, Performance

Keywords

Middleware, performance assessment, web service compositions, testbed generation, experiment automation

1. INTRODUCTION

Service-oriented Computing has received a lot of attention both from industry, where more and more existing enterprise IT infrastructures have been migrated to SOAs [5], and from academia, which carries out several research projects in the area [17]. The implementation [7] of an SOA requires putting together several components — each one offering a specific functionality — that are aggregated under the name of service-oriented middleware. Examples of functionality provided by a service-oriented middleware are service registration and discovery, assembly and execution of composite services, service monitoring, and service management. For example, in the context of Web services-based SOAs, some of these functionalities are provided by UDDI [15] registries and BPEL [16] engines.

It is then clear that the correct and efficient realization of an SOA heavily depends on the characteristics of the middleware infrastructure, since the latter represents the environment where services are deployed and executed. All major SOA vendors propose product suites that are actually a bundle of middleware components. Moreover, academic research has targeted specific facets of a service-oriented middleware, like "smart" service discovery, automated service composition, dynamic service rebinding, computation of service reputation, monitoring of functional properties and Service Level Agreements (SLAs), self-tuning, and self-healing.

This scenario reveals a large amount of engineering activity, both industrial and academic, devoted to the development of service-oriented middleware components. From an engineering point of view, one of the key activities in the development process is the validation of the product; in the case of service-oriented middleware, one validation task consists in assessing the performance delivered by the system. Middleware performance is a critical factor for service-oriented systems, since it affects the global Quality of Service (QoS) perceived by the end-users of the system. Moreover, besides being executed at the developers' site, this kind of test can also be executed at the stakeholders' site, for example by SOA analysts, when they have to choose a specific middleware component from several alternatives.

Performance assessment of a software component, at a minimum, consists in executing a series of tests, each one with a specific workload for the component, and collecting and aggregating some performance metrics that characterize the system. In the case of distributed systems, and thus also for service-oriented systems, the components can be deployed on different machines over different networks, and may need to be stimulated by different remote clients. This task, when performed manually or with a limited amount of automation, can be cumbersome and error-prone. Furthermore, in the case of distributed systems, the heterogeneity of the platforms and of the communication channels increases the complexity of the task.
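To make this concrete, the following minimal sketch (our illustration, not part of SOABench; the endpoint URL and the workload parameters are invented) shows the kind of measurement loop that a single such test boils down to: a fixed number of concurrent clients stimulate the component under test, and the observed response times are aggregated into summary statistics.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// One performance test: 'clients' concurrent callers each send
// 'requestsPerClient' requests; response times are collected and
// aggregated into a mean and a 95th percentile.
public class SimpleLoadTest {
    public static void main(String[] args) throws Exception {
        URI endpoint = URI.create("http://localhost:8080/process"); // invented endpoint
        int clients = 10, requestsPerClient = 100;

        HttpClient http = HttpClient.newHttpClient();
        List<Long> latencies = Collections.synchronizedList(new ArrayList<>());
        ExecutorService pool = Executors.newFixedThreadPool(clients);

        Runnable caller = () -> {
            for (int i = 0; i < requestsPerClient; i++) {
                HttpRequest req = HttpRequest.newBuilder(endpoint).GET().build();
                long start = System.nanoTime();
                try {
                    http.send(req, HttpResponse.BodyHandlers.discarding());
                    latencies.add((System.nanoTime() - start) / 1_000_000); // in ms
                } catch (Exception e) {
                    // failed requests are simply not counted in this sketch
                }
            }
        };
        for (int c = 0; c < clients; c++) pool.submit(caller);
        pool.shutdown();
        pool.awaitTermination(10, TimeUnit.MINUTES);

        List<Long> sorted = new ArrayList<>(latencies);
        Collections.sort(sorted);
        double mean = sorted.stream().mapToLong(Long::longValue).average().orElse(0);
        long p95 = sorted.isEmpty() ? 0 : sorted.get((int) (sorted.size() * 0.95));
        System.out.printf("requests=%d mean=%.1f ms p95=%d ms%n", sorted.size(), mean, p95);
    }
}

SOABench automates this kind of loop together with the surrounding chores: generating the clients, deploying processes and mock services, distributing them over several machines, and collecting the measurements.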
In this paper we address this problem by presenting SOABench, a framework that helps SOA engineers and researchers evaluate the performance (in terms of metrics such as response time) of the middleware components that handle the execution of service compositions. In particular, we focus on Web service compositions described as BPEL processes, running on execution environments that consist of a BPEL engine and additional components such as service registries or enterprise service buses.

The main goal of this work is to enable further research in the service-oriented computing area, by providing a powerful, ready-to-use framework for automating experimentation with middleware components. Indeed, we foresee the application of SOABench to the evaluation and comparison of the performance of different BPEL engines, sporting different optimizations or featuring specific extensions, such as monitoring, reputation reporting, rebinding, self-tuning, or self-healing. Example research questions that SOABench helps to answer are:

• Which BPEL engine best handles the workload of given experiments on a given platform?

• What is the scalability of a certain BPEL engine, in terms of the maximum number of users and requests it can handle without showing failures?

These questions can also seek an answer outside academia, for example in enterprise ICT environments, in the context of business decision making regarding the implementation of SOAs in the enterprise. Additionally, SOABench could also be used by developers of service-oriented middleware components, such as BPEL engines, to evaluate the performance and the scalability of their products under various workloads.

The original, scientific contributions of the paper are twofold.

1) We present a flexible framework for automating performance assessment for service-oriented middleware, which automatically generates and executes testbeds for benchmarking middleware components. SOABench has been implemented using state-of-the-art technologies and is publicly available.

2) We leverage SOABench to evaluate the performance and scalability of three popular BPEL engines and discuss how the engines react to certain kinds of workload. We show that one of the engines fails even under low workload, making it unsuitable for use in production environments.

The rest of the paper is structured as follows. Section 2 gives an overview of SOABench and its components. Section 3 illustrates the testbed meta-model we use to characterize the experiments to be run on top of SOABench. Section 4 details how the internals of SOABench …

2. SOABENCH OVERVIEW

[Figure 1: SOABench at a glance. A modeling environment produces testbed models (instanceOf) that feed a tool chain made of generators, a test compiler, a test driver, and a test analyzer.]

SOABench provides a modeling environment for defining the experiments to run and the testbed to use. The performance engineer can then use it to design a testbed model for an SOA. Such a model is characterized by the composite services to execute, the workload to generate, the deployment configuration to use, the performance metrics to gather, the data analyses to perform on them, and the reports to produce.
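The fragment below renders this information as a small set of Java records. This is our own illustrative simplification, not SOABench's actual meta-model (which is presented in Section 3), and all names and values are invented.

import java.time.Duration;
import java.util.List;

// A hypothetical, simplified rendering of the information a testbed model
// carries; the record types are ours, defined only for illustration.
public class TestbedModelSketch {
    record Workload(int clients, Duration thinkTime, Duration duration) {}
    record Deployment(String engine, String engineHost, String clientHost) {}
    record TestbedModel(List<String> processes, Workload workload, Deployment deployment,
                        List<String> metrics, List<String> analyses, String reportFormat) {}

    public static void main(String[] args) {
        TestbedModel model = new TestbedModel(
            List.of("loan-approval.bpel"),                        // composite services to execute
            new Workload(50, Duration.ofMillis(200),
                         Duration.ofMinutes(5)),                  // workload to generate
            new Deployment("engine-A", "10.0.0.1", "10.0.0.2"),   // deployment configuration
            List.of("response-time", "throughput"),               // performance metrics to gather
            List.of("mean", "p95"),                               // data analyses to perform
            "html");                                              // reports to produce
        System.out.println(model);
    }
}

Whatever the concrete syntax, the point is that a single artifact captures everything an experiment needs, so the tool chain can generate, deploy, run, and analyze it without manual steps.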
After its definition, the model can be passed as input to the tool chain underlying SOABench, which includes four main components (see Figure 1). The first component is a set of generators for creating executable versions of processes, the mock-ups of the external services they interact with, and the testing clients that create (concurrent) workload by invoking processes. A compiler translates the testbed definition into a format understandable by the underlying platform we use for executing experiments. The testbed driver steers the experiment execution, and the analyzer gathers and processes measurement data, producing as output reports that include statistics computed from the measurements.

2.1 Design Goals

Our framework has been designed to meet the following goals:

Technology independence. SOABench treats the BPEL engine and the infrastructure it interacts with as a black box characterized by a generic, common interface, which acts as a wrapper for any BPEL-compliant infrastructure. Since platform dependencies — such as code specific to a certain BPEL engine for deploying test processes — constitute a major barrier to testability [18] for SOAs, we decided to keep them to a minimum, and introduced a plug-in mechanism to deal with them. Performance metrics are measured at the testbed infrastructure level, without the need to instrument the individual components of the infrastructure (e.g., BPEL engines) by inserting profiling code.
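The paper does not show this plug-in interface; a plausible shape for it is sketched below, with type and method names that are hypothetical rather than SOABench's own.

import java.net.URI;
import java.nio.file.Path;

// Hypothetical sketch of an engine plug-in contract. The framework core
// talks only to this interface; everything engine-specific, such as how a
// BPEL process is deployed, lives in one plug-in class per engine.
public interface EngineAdapter {
    void start() throws Exception;                  // bring the engine up before an experiment
    URI deploy(Path bpelProcess) throws Exception;  // engine-specific deployment; returns the process endpoint
    void undeploy(String processName) throws Exception;
    void stop() throws Exception;                   // shut the engine down between experiments
}

With such an arrangement, supporting a new BPEL engine amounts to writing one adapter class, and since metrics are measured at the testbed level, around calls to the endpoint that deployment returns, the engine itself never needs to be instrumented.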
