Jafar, Anderson, and Abdullat    Fri, Nov 7, 4:00 - 4:30, Pueblo C

Comparison of Dynamic Web Content Processing Language Performance Under a LAMP Architecture

Musa Jafar [email protected]
Russell Anderson [email protected]
Amjad Abdullat [email protected]
CIS Department, West Texas A&M University, Canyon, TX 79018

Abstract

LAMP is an open-source, web-based application solution stack. It is comprised of (1) an operating system platform running Linux, (2) an Apache web server, (3) a MySQL database management system, and (4) a Dynamic Web Content Processor (tightly coupled with Apache) that is a combination of one or more of the Perl, Python and PHP scripting languages paired with their corresponding MySQL database interface modules. In this paper, we compare various performance measures of Perl, Python and PHP, both separately without a database interface and in conjunction with their MySQL database interface modules, within a LAMP framework. We performed our tests under two separate Linux, Apache and Dynamic Web Content environments: an SE Linux environment and a Redhat Enterprise Linux environment. A single MySQL database management system residing on a separate Redhat Linux box served both environments. We used a hardware appliance framework for test configuration, test generation and data gathering. An appliance framework is repeatable and easily configurable: it allows a performance engineer to focus effort on the design, configuration and monitoring of tests, and on the analysis of test results. In all cases, whether database connectivity was involved or not, PHP outperformed Perl and Python. We also present the implementation of a mechanism for propagating database engine status-codes to the web client. This is important when automated client-based testing is performed, because the HTTP server is incapable of automatically propagating third-tier application status-codes to the HTTP client.
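The status-code propagation mechanism mentioned in the abstract is not reproduced in this excerpt. As a rough illustration only (the function name, the error value, and the mapping to HTTP 500 are our assumptions, not the authors' code), a CGI script can surface a third-tier database failure to the HTTP client by emitting a `Status:` header, which Apache translates into the HTTP response status line:

```python
# Hypothetical sketch of status-code propagation under CGI (not the paper's
# actual implementation): map a database engine error to an HTTP status so
# that an automated client can detect the failure without parsing the page.

def render_response(db_error=None):
    """Build a CGI response; surface a database failure as an HTTP status."""
    if db_error is None:
        status, body = "200 OK", "Hello World"
    else:
        # Propagate the third-tier status into the HTTP layer; 1062 below is
        # used as an example MySQL error code (duplicate key).
        status, body = "500 Internal Server Error", "DB error: %s" % db_error
    headers = ["Status: %s" % status, "Content-Type: text/plain"]
    return "\r\n".join(headers) + "\r\n\r\n" + body

ok_page = render_response()                 # normal "Hello World" response
err_page = render_response(db_error=1062)   # carries "Status: 500 ..." header
```

An automated test client can then assert on the HTTP status code directly rather than scraping error text out of the generated HTML.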
Keywords: LAMP solution stack, Web Applications Development, Perl, Python, PHP, MySQL, Apache, Linux, DBI, Dynamic Web Content Processor, HTTP 1.1 status-code.

Proc CONISAR 2008, v1 (Phoenix): §2732 (refereed) © 2008 EDSIG

1. INTRODUCTION

The term LAMP, originally formalized by Dougherty (2001) of O'Reilly Media, Inc., refers to the non-proprietary, open-source web development, deployment, and production platform that is comprised of individual open-source components. LAMP uses Linux for the operating system, Apache for the web server, MySQL for database management, and a combination of Perl, Python or PHP as the language(s) used to generate dynamic content on the server. More recently, Ruby was added to the platform, and some deployments have replaced Apache with the open-source lighttpd or with IIS from Microsoft. Depending on the components replaced, the platform is also known as WAMP when Microsoft Windows replaces Linux, or WIMP when Windows replaces Linux and IIS replaces Apache. Doyle (2008) provides more information on the evolution of the various dynamic web content processor frameworks and the various technologies for web application development.

Pedersen (2004), Walberg (2007) and Menascé (2002) indicated that measuring a web application's performance involves many factors that are not necessarily independent: fine tuning and optimization of bandwidth, processes, memory management, CPU usage, disk usage, session management, granular configurations across the board, kernel reconfiguration, accelerators, load balancers, proxies, routing, TCP/IP parameter calibration, etc. Usually, performance tests are conducted in a very controlled environment, and the research presented in this paper is no exception. Titchkosky (2003) provided a good survey of studies on web-server performance under different network settings such as wide area networks, parallel wide area networks, ATM networks, and content caching.

Other researchers performed application-solution web server performance comparisons under a standard World Wide Web environment for a chosen set of dynamic web content processors and web servers. These are closer to the research presented in this paper. Gousios (2002) compared the processing performance of servlets, FastCGI, PHP and mod_perl under Apache and PostgreSQL. Cecchet (2003) compared the performance of PHP, servlets and EJB under Apache and MySQL by completely implementing a client-browser emulator. The work of Titchkosky (2003) is complementary to that of Cecchet (2003); they used Apache, PHP, Perl, server-side Java (Tomcat, Jetty, Resin) and MySQL, and were very elaborate in their attempt to control the test environment. Ramana (2005) compared the performance of PHP and C under LAMP, WAMP and WIMP architectures, replacing Linux with Windows and Apache with IIS.

Although the LAMP architecture is prevailing as a mainstream web-based architecture, there unfortunately does not appear to be a complete study that addresses the issues related to a pure LAMP architecture. The closest are the studies by Gousios (2002), Cecchet (2003) and Titchkosky (2003).

In this paper, we perform a complete study comparing the performance of Perl, Python and PHP separately and in conjunction with their database connectors under a LAMP architecture, using two separate Linux environments (SE Linux and Redhat Enterprise Linux). The two environments were served by a third Linux environment hosting the back-end MySQL database. The infrastructure we used to emulate web clients and to configure and generate tests was a hardware appliance-based framework, fundamentally different from the infrastructure used by the authors just mentioned. The framework allowed us to focus on the tasks where a performance engineer designs, configures, runs, monitors, and analyzes tests, without having to write code, distribute code across multiple machines, or synchronize running tests. An appliance framework is also replicable: tests are repeatable and easily reconfigured.

In the following sections, we present our benchmark framework and what makes it different. In section three we present our test results. Section four elaborates on our methodology, and section five presents the summary and conclusions of the paper. All figures are grouped in an appendix at the end of the paper.

2. THE BENCHMARK TESTING FRAMEWORK

The objective of this research was to compare the performance of the three dynamic web content processors (Perl, PHP and Python) within a LAMP solution stack under six test scenarios.

The first three scenarios were concurrent-user scenarios. We measured and compared the average page response time at nine different levels: 1, 5, 10, 25, 50, 75, 100, 125 and 150 concurrent users. Scenario one is a pure CGI scenario requiring no database connectivity. Scenario two is a simple database query scenario. Scenario three is a database insert-update-delete transaction scenario. For example, for the 25-concurrent-user test of any of these three scenarios, 25 concurrent users were established at the beginning of the test; whenever a user terminated, a new user was established to maintain the 25-concurrent-user level for the duration of the test.
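The paper's hardware appliance maintains the concurrent-user level itself; purely as an illustration of the replace-on-termination policy just described, a software load driver can hold a fixed number of requests in flight with a thread pool, which starts a new request as soon as one finishes (the URL and timing here are simulated stand-ins, not the paper's setup):

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    """Stand-in for one user's page request; returns its response time.
    A real driver would perform an HTTP GET against the Apache server."""
    start = time.time()
    time.sleep(0.001)  # simulated service time
    return time.time() - start

def average_response_time(url, users, total_requests):
    """Hold `users` requests in flight at all times: the pool replaces each
    finished request with a new one, mirroring the replace-on-termination
    policy, and the mean of the observed times is the average page
    response time reported for that concurrency level."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        times = list(pool.map(fetch, [url] * total_requests))
    return sum(times) / len(times)

avg = average_response_time("http://server/cgi-bin/hello", users=25,
                            total_requests=100)
```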
Under each test scenario, we performed 54 tests as follows:

For each platform (SE Linux, Redhat)
    For each language (Perl, Python, PHP)
        For each number of concurrent users (1, 5, 10, 25, 50, 75, 100, 125, 150)
            1- configure a test
            2- perform and monitor a test
            3- gather test results
        End For
    End For
    4- Tabulate, analyze and plot the average page response time for Perl, Python and PHP under the platform
End For

Scenario One (a pure CGI scenario): For each platform, using each of the content generator languages and each of the configured numbers of concurrent users, the web clients requested a simple CGI-script execution that dynamically generated a web page returning "Hello World". No database connectivity was involved. Figure 1 is a sequence diagram representing the skeleton of the scenario. Figures 4 and 5 are the comparison plots of this scenario under SE Linux and Redhat Linux.

Scenario Two (a Simple Database Query): For each platform, using each of the content generator languages and each of the configured numbers of concurrent users, the web clients requested execution of a simple SQL query against the database; the script formatted the result and returned the content page to the web client. The SQL query was: "Select SQL_NO_CACHE count(*) from Person". Figure 2 is a sequence diagram representing the skeleton of test scenarios two and three. Figures 6 and 7 are the comparison plots of this scenario under SE Linux and Redhat Linux.

Scenario Three (a Database Insert-Update-Delete Transaction): For each platform, using each of the content generator languages and each of the configured numbers of concurrent users, the web clients requested the execution of a transaction against the database (which included an insert, an update, and a delete); the script formatted the result and returned the content page to the web client. Figures 8 and 9 are the comparison plots of this scenario under SE Linux and Redhat Linux.
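The database work in scenarios two and three can be sketched with the Python DB-API (PEP 249). The paper uses MySQL through its Python connector; sqlite3 shares the same API shape and stands in here so the sketch is self-contained. The Person table's schema is our assumption, and MySQL's SQL_NO_CACHE hint has no sqlite equivalent, so it is omitted:

```python
import sqlite3

def scenario_two(conn):
    """Simple query: count the rows in Person (the paper's Scenario Two)."""
    cur = conn.execute("SELECT count(*) FROM Person")
    return cur.fetchone()[0]

def scenario_three(conn, pid):
    """Insert-update-delete transaction (the paper's Scenario Three);
    returns True on commit, False on rollback."""
    try:
        conn.execute("INSERT INTO Person(id, name) VALUES (?, ?)", (pid, "a"))
        conn.execute("UPDATE Person SET name = ? WHERE id = ?", ("b", pid))
        conn.execute("DELETE FROM Person WHERE id = ?", (pid,))
        conn.commit()
        return True
    except sqlite3.Error:
        conn.rollback()
        return False

# Assumed two-column schema for illustration only.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Person (id INTEGER PRIMARY KEY, name TEXT)")
```

Note that a completed scenario-three transaction leaves the table unchanged (the inserted row is deleted again), which makes the workload repeatable across test runs.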
Transactions-per-second tests (Scenarios Four through Six): These were stress tests. The objective was to stress the LAMP solution stack to the level where transactions start to fail and the transaction success rate for the duration of a test falls below 80%. For example, at the 25-transactions-per-second level of Perl under SE Linux, we fed 25 new transactions per second regardless of the status of the previously generated transactions. If a failure rate of 20% or more was exhibited, we considered 25 transactions per second the cutoff point and did not need to test higher levels.
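The stopping rule just described amounts to walking the configured load levels and halting at the first one whose success rate drops below the 80% threshold. A minimal sketch (the `success_rate_at` callback and the toy degradation model are our stand-ins for a real measured run):

```python
def find_cutoff(success_rate_at, tps_levels, threshold=0.80):
    """Walk the transactions-per-second levels in increasing order; the
    first level whose measured success rate falls below `threshold` is the
    cutoff point, and higher levels need not be tested."""
    for tps in tps_levels:
        if success_rate_at(tps) < threshold:
            return tps
    return None  # the stack survived every configured level

# Toy model where success degrades linearly with offered load:
cutoff = find_cutoff(lambda tps: 1.0 - tps / 100.0, [1, 5, 10, 25, 50])
# 25 tps gives a 0.75 success rate, the first level below 0.80
```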