Mobile App Testing ISSN 1866-5705

Total Pages: 16

File Type: PDF, Size: 1020 KB

Mobile App Testing – ISSN 1866-5705 – Issue 19, September 2012 – The Magazine for Professional Testers – print version 8.00 € – free digital version – printed in Germany – www.testingexperience.com

From the editor

We have all experienced mobile apps that did not work according to our expectations. It is not only that the app just crashes and you have to start it again. It is more than that. I had some funny situations when I uploaded pictures via an app, and they were displayed twice, or suddenly seemed to belong to my friends. More interesting is when you do online banking via the app; that's where the fun stops. Mobile app testing has a lot of different facets. We have a lot of very good articles discussing many different aspects of it. I want to thank all the authors for the great articles, and of course all the sponsors that make the magazine possible.

I think that "mobile" app development and testing will become more and more important. Microsoft is now bringing "Surface" to the tablet market, and after seeing the promotion videos – looking very promising – I guess that this will provide the impetus for the whole world to move in a new direction. I wish this for Microsoft, but I also wish it for us, the users. We are moving forward, and this is always positive. Take a second to think back 10 years. How was your mobile phone and what could you do with it? Stop now for a second and try to imagine what could be possible in 10 years. Knowledge and imagination will take the teams to a new level.

Most of the companies that I know develop mobile apps using agile techniques. The Agile Testing Days in November will cover interesting aspects that can be used in your daily work. Don't miss it. For more information please go to www.agiletestingdays.com.

Based on the great success of the Agile Testing Days, we decided to set up Agile Dev Practices as the conference that also allows agile developers, and those who want to pursue a career in agile development, to meet and discuss. We have set up a great line-up of keynote speakers, tutorials, workshops, etc. Don't miss the conference. Have a look at www.agiledevpractices.com.

Last but not least, I want to thank Katrin Schülke for the great job she did over the past years to develop Testing Experience. It was a pleasure to work with her, and I wish her the best for her new career path. Our door is always open for her.

Enjoy reading and have a pleasant time.

José

Contents 19/2012

Context-Dependent Testing of Apps
Applications propel the versatility of smartphones and tablet PCs. However, testing apps is a cumbersome task. It is particularly complicated since apps have to cope with frequent changes of the context they run in. Mobile devices move, which leads to changes in the available network connection and varying environmental parameters, such as coordinates derived from a geolocation service. Moreover, patterns of usage change. We propose a novel concept to deal with context changes when testing apps. It is based on basic blocks of code between which context changes are possible and helps testers to manage complexity. Moreover, it supports the maintenance of a suite of test cases that would become much larger if test cases were simply duplicated to deal with different contexts. In addition to the introduction of our concept, we highlight its application and benefits.
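To make the idea of basic blocks with context changes in between a little more concrete, here is a minimal sketch – not the authors' framework, just one way to picture the concept. The TestBlock and ContextChange interfaces, the block descriptions and the network-degradation step are all invented for illustration.

```java
import java.util.List;

// Hypothetical sketch: a test written as a sequence of basic blocks with
// context changes injected between them (not the authors' actual API).
public class ContextDependentTest {

    /** A step of the test during which the context is assumed to stay stable. */
    interface TestBlock { void run(); }

    /** A change of context applied between two blocks (network, location, usage pattern, ...). */
    interface ContextChange { void apply(); }

    private final List<TestBlock> blocks;
    private final List<ContextChange> changes; // changes.get(i) runs after blocks.get(i)

    ContextDependentTest(List<TestBlock> blocks, List<ContextChange> changes) {
        this.blocks = blocks;
        this.changes = changes;
    }

    /** Runs all blocks, applying the configured context change after each one. */
    void execute() {
        for (int i = 0; i < blocks.size(); i++) {
            blocks.get(i).run();
            if (i < changes.size()) {
                changes.get(i).apply();
            }
        }
    }

    public static void main(String[] args) {
        TestBlock openApp = () -> System.out.println("block 1: open the app and load the photo list");
        TestBlock uploadPhoto = () -> System.out.println("block 2: upload a photo and verify the result");
        ContextChange degradeNetwork =
                () -> System.out.println("context change: switch from Wi-Fi to a flaky 3G connection");

        ContextDependentTest test =
                new ContextDependentTest(List.of(openApp, uploadPhoto), List.of(degradeNetwork));
        test.execute();
    }
}
```

The same sequence of blocks can then be re-run with different lists of context changes instead of duplicating the whole test case for each context, which is the maintenance problem the concept addresses.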
What Makes a Good Tester?
This article discusses the common personality traits displayed by experienced, successful professional testers to help you understand what makes a good tester, to help you structure your testing team, and to evaluate potential new recruits.

Mobile Business Application Testing Automation: A Novel Approach
In this document, we will present the approach of automating mobile business application testing using emulators and real time devices.

From the editor ... 1
What Makes a Good Tester? ... 2
Context-Dependent Testing of Apps ... 2
Mobile App Performance – How to Ensure High-Quality Experiences ... 6
Roadblocks and their workaround while testing Mobile Applications ... 8
Main issues in mobile app testing ... 18
Beyond the Smartphone – Testing for the Developing World ... 22
Best Practices in Mobile App Testing ... 26
Mobile Test Strategy ... 30
Driven Development for the Android platform ... 34
There's a test for that! ... 40
Testing on a Real Handset vs. Testing on a Simulator – the big battle ... 42
What Makes a Good Tester? ... 52
4E Framework for Mobile Network Variability Testing ... 56
Enterprise Mobile Testing for Success ... 60
Mobile App Testing ... 64
Context-Dependent Testing of Apps ... 66
Column: Testing Maturity – Where Are We Today? ... 72
Mobile Business Application Testing Automation: A Novel Approach ... 76
Masthead ... 80

Graham Bath
Graham's experience in testing spans more than 30 years and has covered a wide range of domains and technologies. As a test manager he has been responsible for the testing of mission-critical systems in spaceflight, telecommunications and police incident control. Graham has designed tests to the highest levels of rigour within real-time aerospace systems.
As a principal consultant for T-Systems, the systems division of Germany's leading telecommunications company Deutsche Telekom, he has mastered the Quality Improvement Programs of major companies, primarily in the financial and government sectors. In his current position Graham is responsible for the test consulting group. He has built up a training program covering all levels of the ISTQB certification scheme, including the new Expert Level, for the company's large staff of testing professionals. Graham is the chairman of the ISTQB Expert Level Working Party and co-author of both the Advanced Level and Expert Level syllabi.

Gualtiero Bazzana
Gualtiero has been working in the IT and testing domain for the last 20 years and is the author of more than 50 publications on software quality and testing. He currently holds the following positions: Managing Director of Alten Italia; Chairman of ITA-STQB, the Italian board of ISTQB®; member of the ISTQB® Marketing Working Group; and Chairman of STF – Software Testing Forum, the most important event on testing held annually in Italy.

Andreas Ebbert-Karroum
Andreas leads the Agile Software Factory (ASF) at codecentric. He has been a certified Scrum Master for more than four years, and since then has contributed his competencies to small and large (more than 300 persons), internal and external, local and global projects as developer, Scrum Master or product owner. Meanwhile, he is also a Professional Scrum Developer trainer, conducting a hands-on training for Scrum team members which codecentric has developed together with scrum.org and in which he shares with joy the knowledge gained in the ASF. His interest in Java SE/EE awakened more than 15 years ago and has never vanished since. His focus at codecentric is the continuous improvement of the development process in the Agile Software Factory, where technical, organizational and social possibilities are the challenging, external determining factors.

Paul Gerrard
Paul is a consultant, teacher, author, webmaster, programmer, tester, conference speaker, rowing coach and, most recently, a publisher. He has conducted consulting assignments in all aspects of software testing and quality assurance, specialising in test assurance. He has presented keynote talks and tutorials at testing conferences across Europe, the USA, Australia and South Africa, and has occasionally won awards for them.

Yaron Tsubery
Yaron has been working in software since 1990 and has more than 20 years in the testing field as a Test Engineer, Testing TL, and Testing Manager. His original profession was Systems Analyst. Yaron is the current CEO and founder of Smartest Technologies Ltd and was Director of QA & Testing Manager at Comverse from 2000 until 2010. Yaron was in charge
Recommended publications
  • Evaluating and Selecting Software Test Automation Tools: Synthesizing Empirical Evidence from Practitioners
    Acta Universitatis Ouluensis, A Scientiae Rerum Naturalium 752 (A 752, Oulu 2020). Päivi Raulamo-Jurvanen: Evaluating and Selecting Software Test Automation Tools – Synthesizing Empirical Evidence from Practitioners. University of Oulu Graduate School; University of Oulu, Faculty of Information Technology and Electrical Engineering. Academic dissertation to be presented, with the assent of the Doctoral Training Committee of Information Technology and Electrical Engineering of the University of Oulu, for public defence in the OP auditorium (L10), Linnanmaa, on 13 November 2020, at 12 noon. Supervised by Professor Mika Mäntylä, Professor Burak Turhan and Associate Professor Vahid Garousi. Reviewed by Associate Professor Filippo Ricca and Associate Professor Viktoria Stray. Opponent: Professor Kari Smolander. Copyright © 2020 Acta Univ. Oul. A 752, 2020. ISBN 978-952-62-2765-8 (paperback), ISBN 978-952-62-2766-5 (PDF), ISSN 0355-3191 (print), ISSN 1796-220X (online).
  • Software License Agreement
    ESSENTIAL STUDIO SOFTWARE LICENSE AGREEMENT This Software License Agreement (the “Agreement”) is a legal agreement between you (“You”, “Your”, or “Customer”) and Syncfusion, Inc., a Delaware corporation with its principal place of business located at 2501 Aerial Center Parkway, Suite 200, Morrisville, NC 27560 (“Syncfusion”). This license is for Essential Studio Enterprise Edition, Essential Studio WPF Edition, Essential Studio PDF Edition, Essential Studio Xamarin Edition, and Essential Studio Win Forms Edition. Syncfusion licenses its products on a per-copy basis (referred to below as Retail Licenses) or under a project license, a corporate division license, or an enterprise license. Your right to use any given copy of a Syncfusion Essential Studio software product is generally set forth in this Agreement. In the event that your copy of this software product is licensed under a project license, a division license, or global license, additional terms and conditions shall also apply which will be set forth in a separate written and signed agreement. Carefully read all of the terms and conditions of this Agreement prior to downloading and/or installing or using the Licensed Product (as that term is defined below). This Agreement between you and Syncfusion sets forth the terms and conditions of your use of the Licensed Product. For the purposes of this Agreement, the effective date of this Agreement shall be the date upon which you click the “YES” button below. BY CLICKING THE “YES” BUTTON, YOU ARE ACCEPTING ALL OF THE TERMS OF THIS AGREEMENT AND AGREE TO BE BOUND BY THE TERMS OF THIS AGREEMENT. THIS AGREEMENT CONSTITUTES A BINDING CONTRACT.
  • Comparison of GUI Testing Tools for Android Applications
    Comparison of GUI testing tools for Android applications. University of Oulu, Department of Information Processing Science, Master's Thesis, Tomi Lämsä, 22.5.2017.

    Abstract: Test automation is an intriguing area of software engineering, especially in Android development, because Android applications must be able to run in many different permutations of operating system versions and hardware choices. This thesis compares different tools for automated UI testing of Android applications. A literature review examines which tools are available, how popular they are, and how the most popular tools are structured; the two tools identified as most popular are Appium and Espresso. In an empirical study, these two tools, along with Robotium, UiAutomator and Tau, are compared against each other in test execution speed, maintainability of the test code, reliability of the test tools, and general issues. The empirical study was carried out by selecting three Android applications for which an identical suite of tests was developed with each tool. The test suites were then run, and execution speed and reliability were analysed from the results. The test code is also analysed for maintainability by counting the lines of code and the number of method calls needed to handle asynchrony related to UI updates. The issues faced by the test developer with the different tools are analysed as well. This thesis aims to help industry users of these kinds of applications in two ways. First, it could be used as a source on what tools are overall available for UI testing of Android applications.
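    As a rough illustration of the kind of test code such a comparison measures, below is a minimal Espresso UI test in Java. The LoginActivity, the view IDs and the expected text are hypothetical, and Espresso's built-in synchronization with the UI thread is precisely the asynchrony handling whose method calls the thesis counts.

```java
import static androidx.test.espresso.Espresso.onView;
import static androidx.test.espresso.action.ViewActions.click;
import static androidx.test.espresso.action.ViewActions.closeSoftKeyboard;
import static androidx.test.espresso.action.ViewActions.typeText;
import static androidx.test.espresso.assertion.ViewAssertions.matches;
import static androidx.test.espresso.matcher.ViewMatchers.withId;
import static androidx.test.espresso.matcher.ViewMatchers.withText;

import androidx.test.ext.junit.rules.ActivityScenarioRule;
import androidx.test.ext.junit.runners.AndroidJUnit4;
import org.junit.Rule;
import org.junit.Test;
import org.junit.runner.RunWith;

// Hypothetical Espresso UI test; LoginActivity and the R.id view IDs are made up.
@RunWith(AndroidJUnit4.class)
public class LoginScreenTest {

    @Rule
    public ActivityScenarioRule<LoginActivity> rule =
            new ActivityScenarioRule<>(LoginActivity.class);

    @Test
    public void successfulLoginShowsGreeting() {
        // Espresso waits for the main looper to become idle between steps,
        // so no explicit sleeps are needed to handle UI asynchrony here.
        onView(withId(R.id.username)).perform(typeText("tester"), closeSoftKeyboard());
        onView(withId(R.id.login_button)).perform(click());
        onView(withId(R.id.greeting)).check(matches(withText("Welcome, tester!")));
    }
}
```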
  • Get Started with Corticon.js
    Get Started with Corticon.js Copyright © 2020 Progress Software Corporation and/or its subsidiaries or affiliates. All rights reserved. ® These materials and all Progress software products are copyrighted and all rights are reserved by Progress Software Corporation. The information in these materials is subject to change without notice, and Progress Software Corporation assumes no responsibility for any errors that may appear therein. The references in these materials to specific platforms supported are subject to change. Corticon, DataDirect (and design), DataDirect Cloud, DataDirect Connect, DataDirect Connect64, DataDirect XML Converters, DataDirect XQuery, DataRPM, Defrag This, Deliver More Than Expected, Icenium, Ipswitch, iMacros, Kendo UI, Kinvey, MessageWay, MOVEit, NativeChat, NativeScript, OpenEdge, Powered by Progress, Progress, Progress Software Developers Network, SequeLink, Sitefinity (and Design), Sitefinity, SpeedScript, Stylus Studio, TeamPulse, Telerik, Telerik (and Design), Test Studio, WebSpeed, WhatsConfigured, WhatsConnected, WhatsUp, and WS_FTP are registered trademarks of Progress Software Corporation or one of its affiliates or subsidiaries in the U.S. and/or other countries. Analytics360, AppServer, BusinessEdge, DataDirect Autonomous REST Connector, DataDirect Spy, SupportLink, DevCraft, Fiddler, iMail, JustAssembly, JustDecompile, JustMock, NativeScript Sidekick, OpenAccess, ProDataSet, Progress Results, Progress Software, ProVision, PSE Pro, SmartBrowser, SmartComponent, SmartDataBrowser, SmartDataObjects,
  • Test Automation in Web Environment
    FACULDADE DE ENGENHARIA DA UNIVERSIDADE DO PORTO. Test Automation in Web Environment. Jorge Miguel Guerra Santos. Mestrado Integrado em Engenharia Informática e Computação. For Jury Evaluation. Supervisor: Prof.ª Ana Paiva. Proponent: Eng.º Joel Campos. June 27, 2016. Approved in oral examination by the committee: Chair: External Examiner: Supervisor: June 27, 2016.

    Abstract: In today's fast-moving world, it is a challenge for any company to continuously maintain and improve the quality and efficiency of software systems development. In many software projects, testing is neglected because of time or cost constraints. This leads to a lack of product quality, followed by customer dissatisfaction and ultimately to increased overall quality costs. Additionally, with increasingly complex software projects, the number of hours spent on testing increases as well, but without the support of suitable tools, test efficiency and validity tend to decline. Some software testing tasks, such as extensive low-level interface regression testing, can be laborious and time-consuming to do manually. In addition, a manual approach might not always be effective in finding certain classes of defects. Test automation offers a possibility to perform these types of testing effectively. Once automated tests have been developed, they can be run quickly and repeatedly. However, test automation systems usually lack reporting, analysis and meaningful information about project status. The end goal of this research work is to create a prototype that can create and organize test batteries by recording user interaction, reproduce the recorded actions automatically, detect failures during test execution and generate reports, while also setting up the test environment, all in an automatic fashion, and to develop techniques to create more maintainable test cases.
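    The kind of automated replay the prototype aims for can be pictured with a plain Selenium WebDriver script in Java. This is only an illustrative sketch, not the thesis' implementation; the URL, locators and expected heading are invented, and it assumes the Selenium Java bindings plus a ChromeDriver binary on the PATH.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

// Illustrative replay of a recorded interaction: open a page, search, verify.
// The site, locators and expected text are hypothetical.
public class RecordedSearchReplay {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com/search");
            driver.findElement(By.name("q")).sendKeys("test automation");
            driver.findElement(By.id("submit")).click();

            // A failed check is what such a tool would record as a test failure
            // and feed into its generated report.
            String heading = driver.findElement(By.tagName("h1")).getText();
            if (!heading.contains("Results")) {
                throw new AssertionError("Unexpected heading: " + heading);
            }
            System.out.println("Replay passed");
        } finally {
            driver.quit();
        }
    }
}
```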
  • Mobile Developer's Guide to the Galaxy
    Don't Panic – Mobile Developer's Guide to the Galaxy, Updated & Extended 12th Edition, February 2013. Published by Enough Software GmbH + Co. KG (Services and Tools for All Mobile Platforms), Sögestrasse 70, 28195 Bremen, Germany, www.enough.de. Please send your feedback, questions or sponsorship requests to: [email protected]. Follow us on Twitter: @enoughsoftware. This Developer Guide is licensed under the Creative Commons Some Rights Reserved License. Editors: Marco Tabor (Enough Software), Julian Harty, Izabella Balce. Art Direction and Design by Andrej Balaz (Enough Software).

    Mobile Developer's Guide Contents: Prologue 1 The Galaxy of Mobile: An Introduction 1 Topology: Form Factors and Usage Patterns 2 Star Formation: Creating a Mobile Service 6 The Universe of Mobile Operating Systems 12 About Time and Space 12 Lost in Space 14 Conceptional Design For Mobile 14 Capturing The Idea 16 Designing User Experience 22 Android 22 The Ecosystem 24 Prerequisites 25 Implementation 28 Testing 30 Building 30 Signing 31 Distribution 32 Monetization 34 BlackBerry Java Apps 34 The Ecosystem 35 Prerequisites 36 Implementation 38 Testing 39 Signing 39 Distribution 40 Learn More 42 BlackBerry 10 42 The Ecosystem 43 Development 51 Testing 51 Signing 52 Distribution 54 iOS 54 The Ecosystem 55 Technology Overview 57 Testing & Debugging 59 Learn More 62 Java ME (J2ME) 62 The Ecosystem 63 Prerequisites 64 Implementation 67 Testing 68 Porting 70 Signing 71 Distribution 72 Learn More 75 Windows Phone 75 The Ecosystem 76 Implementation 82 Testing
  • Behavioral Analysis of Android Applications Using Automated Instrumentation
    2013 Seventh International Conference on Software Security and Reliability Companion. Behavioral Analysis of Android Applications Using Automated Instrumentation. Mohammad Karami, Mohamed Elsabagh, Parnian Najafiborazjani, and Angelos Stavrou, Computer Science Department, George Mason University, Fairfax, VA 22030.

    Abstract: Google's Android operating system has become one of the most popular operating systems for hand-held devices. Due to its ubiquitous use, open source nature and wide-spread popularity, it has become the target of recent mobile malware. In this paper, we present our efforts on effective security inspection mechanisms for identification of malicious applications for Android mobile applications. To achieve that, we developed a comprehensive software inspection framework. Moreover, to identify potential software reliability flaws and to trigger malware, we develop a transparent instrumentation system for automating user interactions with an Android application that does not require source code. Additionally, for run-time behavior analysis of an application, we monitor the I/O system calls generated by the application under monitoring to the underlying Linux

    …application is not a straightforward task due to the variety of inputs and the heterogeneity of the technologies [12]. Two primary methods are being employed for mobile application analysis: the white-box approach and the black-box approach. In black-box testing, only the inputs and outputs of the application are exercised. On the other hand, the white-box approach requires the source code to be analyzed. Since the source code of the malicious applications that we get from Google Play is not available, we cannot analyze the internal structure of the malicious applications to figure out what they exactly do, but we can utilize black-box testing to define their functionality.
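    A very small black-box driver in the same spirit (emphatically not the paper's instrumentation system) can be put together from standard Android tooling: the Monkey tool injects pseudo-random user events into an installed app without needing its source code, and logcat output is captured for offline analysis. The package name is hypothetical and adb is assumed to be on the PATH.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;

// Sketch of black-box exercising of an installed app with adb's Monkey tool,
// followed by dumping the device log for offline behavioural analysis.
public class BlackBoxDriver {

    private static void run(String... cmd) throws IOException, InterruptedException {
        Process p = new ProcessBuilder(cmd).inheritIO().start();
        int code = p.waitFor();
        if (code != 0) {
            throw new IllegalStateException("Command failed (" + code + "): " + String.join(" ", cmd));
        }
    }

    public static void main(String[] args) throws Exception {
        String pkg = "com.example.app"; // hypothetical package under analysis

        // Inject 500 pseudo-random user events into the target package.
        run("adb", "shell", "monkey", "-p", pkg, "-v", "500");

        // Dump the accumulated log for later inspection, then clear it.
        Process logcat = new ProcessBuilder("adb", "logcat", "-d").start();
        Files.copy(logcat.getInputStream(), Path.of("logcat-" + pkg + ".txt"));
        logcat.waitFor();
        run("adb", "logcat", "-c");
    }
}
```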
  • Unit Testing, Integration Testing and Continuous Builds for Android
    Unit Testing, Integration Testing and Continuous Builds. Manfred Moser, simpligility technologies inc., http://www.simpligility.com, @simpligility.

    Agenda: get an overview of testing and continuous integration for Android app development. Why testing? What can we test? How can we do it?

    Apache Maven: see previous presentation. Maven is used to control the build and more. Good library reuse and dependency use – makes testing easier out of the box. Strong tool support. But it's all possible without Maven too...

    Why (automated) testing? Find problems early and you ● can fix them quickly ● save money on QA testing ● do not get negative customer feedback ● deal with feature requests instead of bugs ● avoid production problems ● can refactor (and change) without breaking old stuff.

    What are we testing? Plain Java code, Android-dependent code, configuration, user interface, look and feel.

    JVM vs Dalvik/Android stack. JVM based: ● faster ● more tools available ● more mature tooling. Dalvik based: ● necessary for integration tests ● reproduces actual behaviour ● full-stack testing (beyond the VM, to native...).

    JVM testing tools: ● JUnit ● TestNG ● EasyMock ● Unitils ● Cobertura ● Emma ● and many more.

    Android SDK test tools: ● integrated JUnit (use on emulator/device though) ● instrumentation test tools (rich set of classes for testing, now well documented) ● MonkeyRunner (control device/emulator running tests, take screenshots, Jython). Dalvik/Android
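    For the "plain Java code" slice of that list, a test needs nothing but JUnit and runs on the JVM, which is exactly why it is so much faster than a Dalvik-based instrumentation test. The PriceCalculator class below is a made-up stand-in for Android-independent logic inside an app.

```java
import static org.junit.Assert.assertEquals;
import org.junit.Test;

// JVM-only unit test: no emulator or device is needed because the code under
// test (a hypothetical PriceCalculator) has no Android dependencies.
public class PriceCalculatorTest {

    /** Made-up class standing in for plain business logic inside an app. */
    static class PriceCalculator {
        long totalCents(long unitPriceCents, int quantity, int discountPercent) {
            long gross = unitPriceCents * quantity;
            return gross - (gross * discountPercent / 100);
        }
    }

    @Test
    public void appliesPercentageDiscount() {
        PriceCalculator calc = new PriceCalculator();
        // 2 items at 10.00 each with a 10% discount should cost 18.00.
        assertEquals(1800, calc.totalCents(1000, 2, 10));
    }
}
```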
  • Integrated Testing Environment for Developers and Coding Testers
    INTEGRATED TESTING ENVIRONMENT for Developers and Coding Testers. Telerik, www.telerik.com/test-studio, www.twitter.com/teleriktesting. Test Studio's plugin for Visual Studio enables developers and testers comfortable writing code to work in the environment where they're most productive.

    Write Code Where Needed. Every test automation project will require some level of coding to be successful. Test Studio's record and playback creates powerful, maintainable tests, but you'll still need to write code at some point to cover common, critical aspects such as configuration, backing APIs, or test oracles.

    Setup and Teardown/Configuration. Complex tests require clear, flexible configuration actions that keep the overall test suite maintainable over the long run. Pushing setup, teardown, and configuration to code versus the system's interface dramatically speeds up test execution by leveraging the system's own internal functionality through internal APIs, web service endpoints, or database stored procedures. (…to ensure those APIs are properly tested elsewhere in the system, of course!) Using these APIs ensures our tests run faster, and we're also keeping the overall test suite much more maintainable.

    Configuration Actions. Part of keeping your test suite lean and focused on high-value tests is ensuring you're not testing components which don't make sense to test. Using coded steps to disable and re-enable these components during automated testing runs is a great way to keep your tests smoother and targeted to functionality your teams are writing.

    Let's have a look at some common scenarios where a team might drop to code to handle specific situations.
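    A generic illustration of that idea, written here as plain Java with JUnit rather than with Test Studio's own coded-step API (which is not shown): test data is seeded through a hypothetical web service endpoint in a setup method instead of being clicked together through the UI.

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

import org.junit.Before;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Generic illustration of pushing setup to code: seed test data through a
// backing API before the UI test runs. The endpoint and payload are made up.
public class CheckoutUiTest {

    private final HttpClient http = HttpClient.newHttpClient();

    @Before
    public void seedTestData() throws Exception {
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create("https://test-env.example.com/api/fixtures/orders"))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString("{\"customer\":\"demo\",\"items\":3}"))
                .build();
        HttpResponse<String> response = http.send(request, HttpResponse.BodyHandlers.ofString());
        assertEquals(201, response.statusCode());
    }

    @Test
    public void checkoutShowsSeededOrder() {
        // The recorded UI steps would run here against the seeded data.
    }
}
```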
  • MS (Computer Science) Thesis: Formal Automated Testing of an Information Management System
    MS (Computer Science) thesis title: Formal Automated Testing of an Information Management System. Submitted by: Rozina Kamil, Roll no. 11, Reg. no. IU15M2LA011. Supervised by: Dr. Nadeem Akhtar. Thesis submitted in partial fulfilment of the requirements for the degree of Master of Science in Computer Science, Department of Computer Science & IT, The Islamia University of Bahawalpur, Pakistan, Fall 2015-17.

    DECLARATION: Formal Automated Testing of an Information Management System ... published source (except the references, standard mathematical or geometrical models/equations/formulae/protocols etc.). I further declare that this work has not been submitted for the award of any other diploma/degree. The university may take action if the information provided is found inaccurate at any stage. (In case of default, the scholar will be proceeded against as per HEC plagiarism policy.) Rozina Kamil, Roll no. 11, Reg. no. IU15M2LA011.

    To: The Controller of Examinations, The Islamia University of Bahawalpur, Pakistan. We, the supervisory committee, certify that the contents and format of the thesis titled Formal Automated Testing of an Information Management System, submitted by Rozina Kamil, Roll no. 11, Registration no. IU15M2LA011, have been found satisfactory and recommend that it be processed for evaluation by the External Examiner(s) for the award of the degree. Supervisor: Dr. Nadeem Akhtar, Department of Computer Science & IT, The Islamia University of Bahawalpur, Pakistan.

    Dedication: I dedicate my dissertation work to all my family members, friends, class mates and my supervisor. It would not have been possible without their sincere support and encouragement.
  • Large-Scale Android Dynamic Analysis
    Andlantis: Large-scale Android Dynamic Analysis. Michael Bierma, Eric Gustafson, Jeremy Erickson, David Fritz, Yung Ryn Choe – Sandia National Laboratories and University of California, Davis.

    Abstract: In this paper, we present Andlantis: a highly scalable dynamic analysis framework for analyzing applications on the Android operating system. Andlantis runs the Android operating system in a virtualized environment and is able to provide the virtual device with artificial network data in order to provide an environment which closely replicates that of a physical device. Andlantis is able to schedule and run thousands of Android instances in parallel, enabling us to investigate the behavior of mobile malware at scale. Andlantis employs a scalable high-performance emulytics framework, minimega, to parallelize this expensive task as much as possible and achieve a level of throughput unprecedented in Android dynamic analysis.

    Analyzing Android applications for malicious behavior is an important area of research, and is made difficult, in part, by the increasingly large number of applications available for the platform. While techniques exist to perform static analysis on a large number of applications, dynamic analysis techniques are relatively limited in scale due to the computational resources required to emulate the full Android system to achieve accurate execution. We present Andlantis, a scalable dynamic analysis system capable of processing over 3000 Android applications per hour. During this processing, the system is able to collect valuable forensic data, which helps reverse-engineers and malware researchers identify and understand anomalous application behavior. We discuss the results of running 1261 malware samples
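    Stripped of everything minimega-specific, the scheduling idea can be pictured as a plain worker pool that farms analysis jobs out to a fixed number of emulator slots. This is only an illustrative sketch, not Andlantis code; analyzeOnEmulator is a placeholder for the real boot/install/exercise/collect cycle.

```java
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Illustrative worker pool: run one analysis job per emulator slot in parallel.
// Not Andlantis/minimega code; analyzeOnEmulator is a placeholder.
public class ParallelAnalysis {

    static void analyzeOnEmulator(int slot, String apk) {
        // Placeholder for: boot an emulator image, install the APK, replay inputs,
        // collect forensic data, then tear the instance down.
        System.out.printf("slot %d: analyzed %s%n", slot, apk);
    }

    public static void main(String[] args) throws InterruptedException {
        List<String> samples = List.of("sample-001.apk", "sample-002.apk", "sample-003.apk");
        int slots = 2; // number of emulator instances available in parallel

        ExecutorService pool = Executors.newFixedThreadPool(slots);
        for (int i = 0; i < samples.size(); i++) {
            final int slot = i % slots;
            final String apk = samples.get(i);
            pool.submit(() -> analyzeOnEmulator(slot, apk));
        }
        pool.shutdown();
        pool.awaitTermination(1, TimeUnit.HOURS);
    }
}
```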
  • Test Center of Excellence – How Can It Be Set Up? (ISSN 1866-5705)
    ISSN 1866-5705, www.testingexperience.com, free digital version, print version 8.00 €, printed in Germany. Testing Experience 18, June 2012 – The Magazine for Professional Testers. Test Center of Excellence – How can it be set up?

    Pragmatic, Soft Skills Focused, Industry Supported. CAT is no ordinary certification, but a professional journey into the world of Agile. As with any voyage you have to take the first step. You may have some experience with Agile from your current or previous employment or you may be venturing out into the unknown. Either way, CAT has been specifically designed to partner and guide you through all aspects of your tour. The focus of the course is to look at how you, the tester, can make a valuable contribution to these activities even if they are not currently your core abilities. This course assumes that you already know how to be a tester and understand the fundamental testing techniques and testing practices, leading you to transition into an Agile team.

    The certification does not simply promote absorption of the theory through academic mediums but encourages you to experiment, in the safe environment of the classroom, through the extensive discussion forums and daily practicals. Over 50% of the initial course is based around practical application of the techniques and methods that you learn, focused on building the skills you already have as a tester. This then prepares you, on returning to your employer, to be Agile. The transition into a Professional Agile Tester team member culminates with on-the-job assessments and demonstrated abilities in Agile expertise through such forums as presentations at conferences or Special Interest