Mobile App Testing ISSN 1866-5705
Total Pages: 16
File Type: PDF; Size: 1020 KB
Recommended publications
-
Evaluating and Selecting Software Test Automation Tools: Synthesizing Empirical Evidence from Practitioners
Päivi Raulamo-Jurvanen: Evaluating and Selecting Software Test Automation Tools: Synthesizing Empirical Evidence from Practitioners. Acta Universitatis Ouluensis, A Scientiae Rerum Naturalium 752, University of Oulu (P.O. Box 8000, FI-90014 University of Oulu, Finland), Oulu 2020. Academic dissertation from the University of Oulu Graduate School and the Faculty of Information Technology and Electrical Engineering, to be presented with the assent of the Doctoral Training Committee of Information Technology and Electrical Engineering of the University of Oulu for public defence in the OP auditorium (L10), Linnanmaa, on 13 November 2020, at 12 noon. Supervised by Professor Mika Mäntylä, Professor Burak Turhan and Associate Professor Vahid Garousi; reviewed by Associate Professor Filippo Ricca and Associate Professor Viktoria Stray; opponent: Professor Kari Smolander. Copyright © 2020 Acta Univ. Oul. A 752, 2020. ISBN 978-952-62-2765-8 (paperback), ISBN 978-952-62-2766-5 (PDF); ISSN 0355-3191 (print), ISSN 1796-220X (online). Cover design: Raimo Ahonen; printed by PunaMusta, Tampere 2020. -
Software License Agreement
ESSENTIAL STUDIO SOFTWARE LICENSE AGREEMENT This Software License Agreement (the “Agreement”) is a legal agreement between you (“You”, “Your”, or “Customer”) and Syncfusion, Inc., a Delaware corporation with its principal place of business located at 2501 Aerial Center Parkway, Suite 200, Morrisville, NC 27560 (“Syncfusion”). This license is for Essential Studio Enterprise Edition, Essential Studio WPF Edition, Essential Studio PDF Edition, Essential Studio Xamarin Edition, and Essential Studio Win Forms Edition. Syncfusion licenses its products on a per-copy basis (referred to below as Retail Licenses) or under a project license, a corporate division license, or an enterprise license. Your right to use any given copy of a Syncfusion Essential Studio software product is generally set forth in this Agreement. In the event that your copy of this software product is licensed under a project license, a division license, or global license, additional terms and conditions shall also apply which will be set forth in a separate written and signed agreement. Carefully read all of the terms and conditions of this Agreement prior to downloading and/or installing or using the Licensed Product (as that term is defined below). This Agreement between you and Syncfusion sets forth the terms and conditions of your use of the Licensed Product. For the purposes of this Agreement, the effective date of this Agreement shall be the date upon which you click the “YES” button below. BY CLICKING THE “YES” BUTTON, YOU ARE ACCEPTING ALL OF THE TERMS OF THIS AGREEMENT AND AGREE TO BE BOUND BY THE TERMS OF THIS AGREEMENT. THIS AGREEMENT CONSTITUTES A BINDING CONTRACT. -
Comparison of GUI Testing Tools for Android Applications
Comparison of GUI testing tools for Android applications. University of Oulu, Department of Information Processing Science. Master's thesis, Tomi Lämsä, 22 May 2017. Abstract: Test automation is an intriguing area of software engineering, especially in Android development, because Android applications must run on many different permutations of operating system versions and hardware. This thesis compares tools for automated UI testing of Android applications. A literature review surveys the available tools and their popularity and examines the structure of the most popular ones; the two tools identified as most popular are Appium and Espresso. In an empirical study, these two tools, along with Robotium, UiAutomator and Tau, are compared against each other on test execution speed, maintainability of the test code, reliability of the test tools, and general issues. The empirical study was carried out by selecting three Android applications and developing an identical suite of tests for each of them with each tool. The test suites were then run, and execution speed and reliability were analysed from the results. The test code is also analysed for maintainability by counting the lines of code and the number of method calls needed to handle asynchrony related to UI updates. The issues faced by the test developer with the different tools are analysed as well. The thesis aims to help industry users of such applications in two ways. First, it can be used as a source on which tools are available overall for UI testing of Android applications.
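As an illustration of the kind of UI test code the thesis compares, here is a minimal Espresso sketch in Java (using current AndroidX APIs rather than the 2017-era ones); the activity, view IDs and expected text are hypothetical placeholders, not code from the thesis. Espresso synchronises test actions with the UI thread automatically, which is one reason the thesis counts the extra method calls other tools need to handle asynchrony around UI updates.

import static androidx.test.espresso.Espresso.onView;
import static androidx.test.espresso.action.ViewActions.click;
import static androidx.test.espresso.action.ViewActions.typeText;
import static androidx.test.espresso.assertion.ViewAssertions.matches;
import static androidx.test.espresso.matcher.ViewMatchers.withId;
import static androidx.test.espresso.matcher.ViewMatchers.withText;

import androidx.test.ext.junit.rules.ActivityScenarioRule;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;

@RunWith(AndroidJUnit4.class)
public class LoginScreenTest {

    // Hypothetical activity under test.
    @Rule
    public ActivityScenarioRule<LoginActivity> rule =
            new ActivityScenarioRule<>(LoginActivity.class);

    @Test
    public void successfulLoginShowsGreeting() {
        // Type a user name and submit; Espresso waits for the UI thread to
        // become idle between actions, so no explicit sleeps are needed.
        onView(withId(R.id.username)).perform(typeText("alice"));
        onView(withId(R.id.login_button)).perform(click());

        // Assert that the greeting view now shows the expected text.
        onView(withId(R.id.greeting)).check(matches(withText("Welcome, alice!")));
    }
}

With tools that lack this built-in synchronisation, the same test typically needs extra waiting or polling calls, which is exactly the maintainability cost the thesis measures.
-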
Get Started with Corticon.js
Get Started with Corticon.js Copyright © 2020 Progress Software Corporation and/or its subsidiaries or affiliates. All rights reserved. ® These materials and all Progress software products are copyrighted and all rights are reserved by Progress Software Corporation. The information in these materials is subject to change without notice, and Progress Software Corporation assumes no responsibility for any errors that may appear therein. The references in these materials to specific platforms supported are subject to change. Corticon, DataDirect (and design), DataDirect Cloud, DataDirect Connect, DataDirect Connect64, DataDirect XML Converters, DataDirect XQuery, DataRPM, Defrag This, Deliver More Than Expected, Icenium, Ipswitch, iMacros, Kendo UI, Kinvey, MessageWay, MOVEit, NativeChat, NativeScript, OpenEdge, Powered by Progress, Progress, Progress Software Developers Network, SequeLink, Sitefinity (and Design), Sitefinity, SpeedScript, Stylus Studio, TeamPulse, Telerik, Telerik (and Design), Test Studio, WebSpeed, WhatsConfigured, WhatsConnected, WhatsUp, and WS_FTP are registered trademarks of Progress Software Corporation or one of its affiliates or subsidiaries in the U.S. and/or other countries. Analytics360, AppServer, BusinessEdge, DataDirect Autonomous REST Connector, DataDirect Spy, SupportLink, DevCraft, Fiddler, iMail, JustAssembly, JustDecompile, JustMock, NativeScript Sidekick, OpenAccess, ProDataSet, Progress Results, Progress Software, ProVision, PSE Pro, SmartBrowser, SmartComponent, SmartDataBrowser, SmartDataObjects, -
Test Automation in Web Environment
Faculdade de Engenharia da Universidade do Porto. Test Automation in Web Environment. Jorge Miguel Guerra Santos, Mestrado Integrado em Engenharia Informática e Computação (Master in Informatics and Computing Engineering). Supervisor: Prof.ª Ana Paiva; proponent: Eng.º Joel Campos. For jury evaluation; approved in oral examination by the committee (chair, external examiner, supervisor). June 27, 2016. Abstract: In today's fast-moving world, it is a challenge for any company to continuously maintain and improve the quality and efficiency of software systems development. In many software projects, testing is neglected because of time or cost constraints. This leads to a lack of product quality, followed by customer dissatisfaction and ultimately increased overall quality costs. Additionally, as software projects grow more complex, the number of hours spent on testing increases as well, but without the support of suitable tools, test efficiency and validity tend to decline. Some software testing tasks, such as extensive low-level interface regression testing, can be laborious and time-consuming to do manually, and a manual approach might not always be effective in finding certain classes of defects. Test automation offers a possibility to perform these types of testing effectively: once automated tests have been developed, they can be run quickly and repeatedly. However, test automation systems usually lack reporting, analysis and meaningful information about project status. The end goal of this research work is to create a prototype that can create and organize test batteries by recording user interaction, reproduce the recorded actions automatically, detect failures during test execution and generate reports, while also setting up the test environment, all in an automatic fashion, and to develop techniques to create more maintainable test cases.
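The kind of recorded-and-replayed web test the prototype aims to generate can be sketched with Selenium WebDriver and JUnit in Java. This is only an illustration of replaying recorded actions, detecting failures and setting up and tearing down the test environment, not the thesis prototype itself; the URL, element locators and expected title are hypothetical.

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

import static org.junit.Assert.assertTrue;

public class RecordedSearchTest {

    private WebDriver driver;

    @Before
    public void setUpEnvironment() {
        // Test-environment setup: start a fresh browser session for each test.
        driver = new ChromeDriver();
    }

    @Test
    public void replayRecordedSearch() {
        // Replay the recorded user actions: open the page, type a query, submit.
        driver.get("https://example.org/search");
        driver.findElement(By.name("q")).sendKeys("test automation");
        driver.findElement(By.id("submit")).click();

        // Failure detection: the assertion fails (and would be reported)
        // if the expected results page does not load.
        assertTrue(driver.getTitle().contains("Results"));
    }

    @After
    public void tearDown() {
        // Tear the environment down so repeated runs stay independent.
        driver.quit();
    }
}
-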
Mobile Developer's Guide to the Galaxy
Don't Panic: Mobile Developer's Guide to the Galaxy, updated & extended 12th edition, February 2013. Published by Enough Software GmbH + Co. KG (Services and Tools for All Mobile Platforms), Sögestrasse 70, 28195 Bremen, Germany, www.enough.de. Feedback, questions or sponsorship requests: [email protected]; Twitter: @enoughsoftware. Licensed under the Creative Commons Some Rights Reserved license. Editors: Marco Tabor (Enough Software), Julian Harty and Izabella Balce; art direction and design by Andrej Balaz (Enough Software). Contents: Prologue; The Galaxy of Mobile: An Introduction (form factors and usage patterns, creating a mobile service, the universe of mobile operating systems, about time and space, lost in space); Conceptional Design for Mobile (capturing the idea, designing user experience); and platform chapters on Android, BlackBerry Java Apps, BlackBerry 10, iOS, Java ME (J2ME) and Windows Phone, each covering topics such as the ecosystem, prerequisites, implementation, testing, building, signing, distribution, monetization and where to learn more. -
Behavioral Analysis of Android Applications Using Automated Instrumentation
2013 Seventh International Conference on Software Security and Reliability Companion. Behavioral Analysis of Android Applications Using Automated Instrumentation. Mohammad Karami, Mohamed Elsabagh, Parnian Najafiborazjani, and Angelos Stavrou, Computer Science Department, George Mason University, Fairfax, VA 22030 ({mkarami, melsabag, pnajafib, astavrou}@gmu.edu). Abstract: Google's Android operating system has become one of the most popular operating systems for hand-held devices. Due to its ubiquitous use, open source nature and wide-spread popularity, it has become the target of recent mobile malware. In this paper, we present our efforts on effective security inspection mechanisms for identification of malicious applications for Android mobile applications. To achieve that, we developed a comprehensive software inspection framework. Moreover, to identify potential software reliability flaws and to trigger malware, we develop a transparent instrumentation system for automating user interactions with an Android application that does not require source code. Additionally, for run-time behavior analysis of an application, we monitor the I/O system calls generated by the application under monitoring to the underlying Linux … From the body of the paper: … application is not a straightforward task due to the variety of inputs and the heterogeneity of the technologies [12]. Two primary methods are employed for mobile application analysis: the white-box approach and the black-box approach. In black-box testing, only the inputs and outputs of the application are exercised; for the white-box approach, the source code needs to be analyzed. Since the source code of the malicious applications obtained from Google Play is not available, we cannot analyze their internal structure to figure out what exactly they do, but we can utilize black-box testing to define their functionality.
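As a rough, simplified illustration of the black-box idea described above (not the authors' instrumentation framework), an app can be exercised with pseudo-random UI events from the Android monkey tool while its I/O-related system calls are logged. The sketch below assumes a hypothetical package name and an emulator where adbd runs as root and the strace and pidof utilities are available; these are assumptions, not details from the paper.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

public class BlackBoxExerciser {

    // Hypothetical package name of the application under analysis.
    private static final String PACKAGE = "com.example.suspicious";

    public static void main(String[] args) throws IOException, InterruptedException {
        // Launch the app by sending a single launcher event through monkey.
        run("adb", "shell", "monkey", "-p", PACKAGE, "1");

        // Attach strace to the app's process in the background to log
        // I/O-related system calls (assumes a rooted emulator that ships strace).
        Process tracer = new ProcessBuilder("adb", "shell",
                "strace -f -p $(pidof " + PACKAGE + ") -e trace=open,read,write"
                        + " -o /data/local/tmp/trace.log").start();

        // Exercise the app with 500 pseudo-random UI events: pure black-box
        // input, no source code required.
        run("adb", "shell", "monkey", "-p", PACKAGE, "-v", "500");

        // Stop tracing on the device and pull the log for offline analysis.
        run("adb", "shell", "pkill", "strace");
        tracer.waitFor();
        run("adb", "pull", "/data/local/tmp/trace.log", "trace.log");
        System.out.println(Files.readAllLines(Path.of("trace.log")).size()
                + " system-call lines captured");
    }

    // Helper: run an external command, inherit its output, and wait for it.
    private static void run(String... command) throws IOException, InterruptedException {
        new ProcessBuilder(command).inheritIO().start().waitFor();
    }
}
-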
Unit Testing, Integration Testing and Continuous Builds for Android
Unit Testing, Integration Testing and Continuous Builds. Manfred Moser, simpligility technologies inc. (http://www.simpligility.com, @simpligility). The slides give an overview of testing and continuous integration for Android app development: why testing, what can be tested, and how. Apache Maven (see the previous presentation) is used to control the build and more; good library reuse and dependency management make testing easier out of the box, and tool support is strong, but everything is possible without Maven too. Why (automated) testing? Finding problems early means you can fix them quickly, save money on QA testing, avoid negative customer feedback, deal with feature requests instead of bugs, avoid production problems, and refactor (and change) without breaking old functionality. What are we testing? Plain Java code, Android-dependent code, configuration, the user interface, and look and feel. JVM versus Dalvik/Android stack: JVM-based testing is faster and has more, and more mature, tooling available; Dalvik-based testing is necessary for integration tests, reproduces actual behaviour, and allows full-stack testing (beyond the VM, down to native code). JVM testing tools include JUnit, TestNG, EasyMock, Unitils, Cobertura, Emma and many more. Android SDK test tools include the integrated JUnit support (to be used on an emulator or device), the instrumentation test tools (a rich set of classes for testing, now well documented) and MonkeyRunner (controls a device or emulator running tests, takes screenshots, scripted in Jython). Dalvik/Android …
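As a minimal illustration of the JVM-based testing the slides recommend for plain Java code, the sketch below uses JUnit 4 and EasyMock, both named on the slides; the PriceCalculator and TaxService classes are hypothetical examples, not code from the presentation. Tests like this run on a desktop JVM without an emulator, which is why they are much faster than Dalvik- or device-based tests.

import static org.easymock.EasyMock.createMock;
import static org.easymock.EasyMock.expect;
import static org.easymock.EasyMock.replay;
import static org.easymock.EasyMock.verify;
import static org.junit.Assert.assertEquals;

import org.junit.Test;

public class PriceCalculatorTest {

    // Hypothetical collaborator that would normally call a remote service.
    interface TaxService {
        double rateFor(String countryCode);
    }

    // Hypothetical plain-Java class under test: no Android dependencies.
    static class PriceCalculator {
        private final TaxService taxService;

        PriceCalculator(TaxService taxService) {
            this.taxService = taxService;
        }

        double grossPrice(double netPrice, String countryCode) {
            return netPrice * (1.0 + taxService.rateFor(countryCode));
        }
    }

    @Test
    public void grossPriceAddsTax() {
        // Mock the collaborator so the test stays fast and deterministic.
        TaxService taxService = createMock(TaxService.class);
        expect(taxService.rateFor("DE")).andReturn(0.19);
        replay(taxService);

        PriceCalculator calculator = new PriceCalculator(taxService);
        assertEquals(11.9, calculator.grossPrice(10.0, "DE"), 0.0001);

        // Verify the mocked call actually happened.
        verify(taxService);
    }
}
-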
Integrated Testing Environment for Developers and Coding Testers
Integrated Testing Environment for Developers and Coding Testers. Telerik, www.telerik.com/test-studio, www.twitter.com/teleriktesting. Test Studio's plugin for Visual Studio enables developers and testers comfortable writing code to work in the environment where they're most productive.
Write Code Where Needed: Every test automation project will require some level of coding to be successful. Test Studio's record and playback creates powerful, maintainable tests, but you'll still need to write code at some point to cover common, critical aspects such as configuration, backing APIs, or test oracles. Let's have a look at some common scenarios where a team might drop to code to handle specific situations.
Setup and Teardown/Configuration: Complex tests require clear, flexible configuration actions that keep the overall test suite maintainable over the long run. Pushing setup, teardown, and configuration to code versus the system's interface dramatically speeds up test execution by leveraging the system's own internal functionality through internal APIs, web service endpoints, or database stored procedures (to ensure those APIs are properly tested elsewhere in the system, of course!). Using these APIs ensures our tests run faster, and we're also keeping the overall test suite much more maintainable.
Configuration Actions: Part of keeping your test suite lean and focused on high-value tests is ensuring you're not testing components which don't make sense to test. Using coded steps to disable and re-enable these components during automated testing runs is a great way to keep your tests smoother and targeted to the functionality your teams are writing.
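Test Studio coded steps are written in C# or VB.NET; to keep the code examples in this listing in a single language, the setup-and-teardown-through-an-API pattern described above is sketched below in Java with JUnit and the JDK HTTP client. It illustrates the pattern only, not Test Studio's API, and the fixture endpoint URL and payload are hypothetical.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;

import static org.junit.Assert.assertEquals;

public class OrderFlowTest {

    // Hypothetical internal endpoint used only for test setup and teardown.
    private static final String FIXTURE_URL = "https://test-env.example.com/api/test-fixtures/orders";

    private final HttpClient http = HttpClient.newHttpClient();
    private String fixtureId;

    @Before
    public void seedTestData() throws Exception {
        // Setup through the system's own API instead of clicking through the UI:
        // create the order the UI test will operate on.
        HttpRequest create = HttpRequest.newBuilder(URI.create(FIXTURE_URL))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"customer\":\"test-user\",\"items\":3}"))
                .build();
        HttpResponse<String> response = http.send(create, HttpResponse.BodyHandlers.ofString());
        assertEquals(201, response.statusCode());
        fixtureId = response.body(); // assume the service returns the new order id
    }

    @Test
    public void orderAppearsInDashboard() {
        // The recorded or coded UI steps for the dashboard check would go here.
    }

    @After
    public void cleanUpTestData() throws Exception {
        // Teardown through the same API so repeated runs start from a clean state.
        HttpRequest delete = HttpRequest.newBuilder(URI.create(FIXTURE_URL + "/" + fixtureId))
                .DELETE()
                .build();
        http.send(delete, HttpResponse.BodyHandlers.discarding());
    }
}

The point of the pattern is that the UI test itself stays focused on the high-value check, while data setup and cleanup run through fast internal interfaces.
-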
MS (Computer Science) Thesis: Formal Automated Testing of an Information Management System
MS (Computer Science) thesis. Title: Formal Automated Testing of an Information Management System. Submitted by Rozina Kamil, Roll no. 11, Reg. no. IU15M2LA011; supervised by Dr. Nadeem Akhtar. Thesis submitted in partial fulfilment of the requirements for the degree of Master of Science in Computer Science, Department of Computer Science & IT, The Islamia University of Bahawalpur, Pakistan, Fall 2015-17. Declaration (excerpt): "… published source (except the references, standard mathematical or geometrical models/equations/formulae/protocols etc.). I further declare that this work has not been submitted for the award of any other diploma/degree. The university may take action if the information provided is found inaccurate at any stage. (In case of default, the scholar will be proceeded against as per HEC plagiarism policy.)" Rozina Kamil, Roll no. 11, Reg. no. IU15M2LA011. Certificate to the Controller of Examinations, The Islamia University of Bahawalpur, Pakistan: "We, the supervisory committee, certify that the contents and format of the thesis titled Formal Automated Testing of an Information Management System, submitted by Rozina Kamil, Roll no. 11, Registration no. IU15M2LA011, have been found satisfactory and recommend that it be processed for evaluation by the External Examiner(s) for the award of the degree." Supervisor: Dr. Nadeem Akhtar, Department of Computer Science & IT, The Islamia University of Bahawalpur, Pakistan. Dedication: I dedicate my dissertation work to all my family members, friends, classmates and my supervisor; it would not have been possible without their sincere support and encouragement. -
Large-Scale Android Dynamic Analysis
Andlantis: Large-scale Android Dynamic Analysis. Michael Bierma, Eric Gustafson, Jeremy Erickson, David Fritz, and Yung Ryn Choe; Sandia National Laboratories ({mbierma, jericks, djfritz, [email protected]) and University of California, Davis ({mhbierma, [email protected]). Abstract: In this paper, we present Andlantis: a highly scalable dynamic analysis framework for analyzing applications on the Android operating system. Andlantis runs the Android operating system in a virtualized environment and is able to provide the virtual device with artificial network data in order to provide an environment which closely replicates that of a physical device. Andlantis is able to schedule and run thousands of Android instances in parallel, enabling us to investigate the behavior of mobile malware at scale. Andlantis employs a scalable high-performance emulytics framework, minimega, to parallelize this expensive task as much as possible and achieve a level of throughput unprecedented in Android dynamic analysis. From the body of the paper: Analyzing Android applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the platform. While techniques exist to perform static analysis on a large number of applications, dynamic analysis techniques are relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. We present Andlantis, a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. During this processing, the system is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples … -
Test Center of Excellence How Can It Be Set Up? ISSN 1866-5705
ISSN 1866-5705, www.testingexperience.com. Free digital version; print version 8.00 €; printed in Germany. Issue 18, June 2012. The Magazine for Professional Testers: Test Center of Excellence: how can it be set up?
Pragmatic, Soft Skills Focused, Industry Supported. CAT is no ordinary certification, but a professional journey into the world of Agile. As with any voyage you have to take the first step. You may have some experience with Agile from your current or previous employment or you may be venturing out into the unknown. Either way, CAT has been specifically designed to partner and guide you through all aspects of your tour. The focus of the course is to look at how you, the tester, can make a valuable contribution to these activities even if they are not currently your core abilities. This course assumes that you already know how to be a tester, understand the fundamental testing techniques and testing practices, leading you to transition into an Agile team. The certification does not simply promote absorption of the theory through academic mediums but encourages you to experiment, in the safe environment of the classroom, through the extensive discussion forums and daily practicals. Over 50% of the initial course is based around practical application of the techniques and methods that you learn, focused on building the skills you already have as a tester. This then prepares you, on returning to your employer, to be Agile. The transition into a Professional Agile Tester team member culminates with on-the-job assessments and demonstrated abilities in Agile expertise through such forums as presentations at conferences or Special Interest …