EVALUATING THE MOBILE DEVELOPMENT FRAMEWORKS

APACHE CORDOVA AND FLUTTER AND THEIR IMPACT

ON THE DEVELOPMENT PROCESS AND

APPLICATION CHARACTERISTICS

A Thesis

Presented

to the Faculty of

California State University, Chico

In Partial Fulfillment

of the Requirements for the Degree

Master of Science

in

Computer Science

by

© Michael Gonsalves 2018

Fall 2018

EVALUATING THE MOBILE DEVELOPMENT FRAMEWORKS

APACHE CORDOVA AND FLUTTER AND THEIR IMPACT

ON THE DEVELOPMENT PROCESS AND

APPLICATION CHARACTERISTICS

A Thesis

by

Michael Gonsalves

Fall 2018

APPROVED BY THE INTERIM DEAN OF GRADUATE STUDIES:

Sharon Barrios, Ph.D.

APPROVED BY THE GRADUATE ADVISORY COMMITTEE:

Bryan Dixon, Ph.D., Chair

Tyson Henry, Ph.D., Graduate Coordinator

Kevin Buffardi, Ph.D.

PUBLICATION RIGHTS

No portion of this thesis may be reprinted or reproduced in any manner unacceptable to the usual copyright restrictions without the written permission of the author.


TABLE OF CONTENTS

Publication Rights

List of Tables

List of Figures

Abstract

CHAPTER

I. Introduction
      Background
      Statement of the Problem
      Purpose of the Study
      Organization of the Study

II. Review of Related Works
      Background
      Cross-Platform Development Approaches
      Evaluating Cross-Platform Development Frameworks

III. Methodology
      Overview
      Task Station
      Development Hardware Devices
      Development Framework Environments
      Source Code Evaluation
      Application Characteristics and Performance Profiling
      Development Process Qualitative Differences
      Summary

IV. Results and Discussion
      Overview
      Source Code Evaluation
      Application Characteristics and Performance Profiling
      Development Process Qualitative Differences
      Summary

V. Summary and Recommendations
      Summary
      Limitations and Recommendations for Future Research

References

LIST OF TABLES

1. Source Code Lines of Code, Number of Files and Dependencies

2. Source Code Comparison of Native vs. Cross-Platform

3. Android Application Package Size and Installed Size

4. iOS Application Package Size and Installed Size

5. Android Application Menu RAM Usage

6. Android Application View Tasks RAM Usage

7. iOS Application Menu RAM Usage

8. iOS Application View Tasks RAM Usage

9. Android Application Startup Times

10. Android Application View Transition Times

11. iOS Application Startup Times

12. iOS Application View Transition Times

13. Languages and Testing Tools

LIST OF FIGURES

1. Transition Diagram of Task Station

2. Menu Page of the iOS Flutter Application

3. View Tasks Page of the iOS Flutter Application

4. Task Details Page of the iOS Flutter Application

5. Add/Edit Task Page of the iOS Flutter Application

6. Android Native Screenshots

7. iOS Native Screenshots

8. Android Flutter Screenshots

9. iOS Flutter Screenshots

10. Android Cordova/Ionic Screenshots

11. iOS Cordova/Ionic Screenshots

12. Flutter Widget Tree Example

13. Ionic Framework HTML Example

ABSTRACT

EVALUATING THE MOBILE DEVELOPMENT FRAMEWORKS

APACHE CORDOVA AND FLUTTER AND THEIR IMPACT

ON THE DEVELOPMENT PROCESS AND

APPLICATION CHARACTERISTICS

by

© Michael Gonsalves 2018

Master of Science in Computer Science

California State University, Chico

Fall 2018

Mobile application development is an area of software engineering with its own unique development approaches and challenges. The market for mobile applications is highly fragmented, not only across different mobile platforms but also across devices and operating system versions within the same platform. The goal of any application developer is to reach the largest possible audience while minimizing development time, cost, and effort. The current market consists of two major platforms, Android and iOS. Without the use of cross-platform tools, it would be necessary to produce the same application twice to reach both marketplaces.

A number of cross-platform development frameworks are available to developers, and each one has its own benefits and drawbacks. This paper details a comparison between two cross-platform development frameworks, Google’s Flutter and Apache Cordova with the Ionic framework. In order to compare the two frameworks against each other and against the experience of developing native applications for each platform, an application prototype was developed using four different frameworks. A task management application was developed as a native Android application, a native iOS application, a Flutter cross-platform application, and an Apache Cordova cross-platform application using the Ionic user interface framework. A series of source code and application performance profiling evaluations were performed in order to determine the impact of choosing a cross-platform development framework on both the development experience and the performance and perceived quality of the deployed applications. Several areas of qualitative differences between the development experiences of Flutter and Cordova are also detailed.


CHAPTER I

INTRODUCTION

Background

Mobile application development is a relatively new field in software engineering with a potential target audience of billions of users. Statista estimates that around 1.5 billion Android or iOS smartphones were sold globally in 2016 [1]. Such a large possible market makes the development of mobile applications a desirable focus for software developers. A total of 197 billion applications are estimated to have been downloaded in 2017 alone [2].

Unfortunately, the worldwide market for mobile applications is extremely fragmented. Over the past ten years, the market has moved from several competing mobile operating systems (BlackBerry, Windows Mobile, Symbian, Android and iOS) down to predominantly two platforms, Android and iOS [3]. Even if a developer chooses to focus on just Android and iOS, fragmentation is still a major issue. Joorabchi, Mesbah and Kruchten [4] describe two different types of fragmentation in the mobile development field: fragmentation across platforms and fragmentation within the same platform. Android and iOS each have their own development environments, software development kits (SDKs), programming languages, and user experience standards. Even within just one platform, a large array of possible device types can be found, including phones, tablets, watches, TV devices and more. Each device has its own hardware configuration, screen size and operating system support. This level of fragmentation makes developing mobile software especially challenging if the goal is to reach the entire mobile market.

For well over a decade, researchers have been trying to determine how mobile application developers can reach the largest possible audience with the least amount of development effort. In 2017, El-Kassas, Abdullah, Yousef and Wahba [5] described six broad categories of cross-platform development approaches and listed around 20 different cross-platform tools and frameworks. Some of those tools are available for developers now (MoSync, Apache Cordova, Xamarin, etc.) and some are research or experimental prototypes (MobDSL, MD2 and ICPMD). With such a large number of options, it can be difficult for a developer to know where to start when choosing an approach to support multiple mobile operating systems [6]. While this problem is not exclusive to the field of mobile application development, it remains an important open research area due to the unique nature of mobile development and the continuous evolution of the market and development frameworks.

Statement of the Problem

The framework chosen for the development of a mobile application for multiple target platforms can have an impact on both the development effort required to create the application and on the perceived quality and performance of the generated application. The goal when choosing a framework is to maximize the reusability of the project’s code base and to minimize the total effort required to support multiple platforms while still producing a final product that is responsive, attractive, easy to use, and consistent across platforms.

Dhillon and Mahmoud [7] describe the ultimate goal of write once, run anywhere (WORA) software development and how, while development tools have come a long way, each approach still has limitations that prevent it from fully meeting that goal. As the world of mobile application development continues to evolve, there is a need to examine the available cross-platform development approaches and frameworks and determine if the goal of WORA is achievable. This study evaluates two cross-platform development frameworks in order to determine if they allow developers to produce WORA software while maintaining application quality similar to native platform applications.

In order to narrow the focus of this thesis, two cross-platform frameworks were chosen for evaluation. The first, Apache Cordova [8] (often referred to as PhoneGap), is one of the oldest and most mature frameworks available, and it has been featured in numerous research articles [9], [10], [11], [12]. Cordova allows developers to use web technologies such as HTML, CSS and JavaScript to build mobile applications that can run on multiple target mobile operating systems. The web applications are packaged into WebViews that are embedded into native platform applications. Cordova by itself does not provide any custom user interface elements for mobile applications. The Ionic framework [13] was used in conjunction with Cordova in this study in order to develop web applications that look and feel like mobile applications.

The other framework, Google’s Flutter [14], is very young and has not yet been widely included in mobile application development research. Flutter is especially intriguing because of Google’s support, considering that Apple’s iOS is Google’s largest competitor in the mobile operating system marketplace. Developers using Flutter write their applications in the Dart language, and the Flutter engine compiles the application to native ARM code packaged with the Flutter runtime engine, which renders the user interface and allows the application to run on both Android and iOS. In-depth details of the frameworks evaluated in this thesis will be discussed in Chapter IV.

Purpose of the Study

This thesis explores two research questions through the process of evaluating Flutter and Apache Cordova with the Ionic framework.

1. What impact does the choice of a development framework have on the application development process and the long-term maintainability of the application?

2. What impact does the choice of a development framework have on the performance and perceived quality of the generated applications?

The main focus of this paper involves the development of four application prototypes: a native Android application, a native iOS application, a Flutter application and an Apache Cordova (using the Ionic framework) application. The development of these applications and the evaluation of the generated applications will be detailed and the results will provide insight into the research questions of this investigation.

Organization of the Study

This thesis is organized into five chapters. Chapter II consists of reviewing related works in the areas of mobile application development and cross-platform application development approaches and frameworks. In Chapter III, the methodology of this investigation will be detailed, including the development of the four different application prototypes and the profiling and evaluation of the prototypes. Chapter IV will explore the characteristics and evaluated metrics of each prototype as well as a review of the key differences between writing applications using the cross-platform frameworks Flutter and Apache Cordova. Finally, Chapter V will include a summary of the investigation, the limitations of the research, and recommendations for future areas of research.


CHAPTER II

REVIEW OF RELATED WORKS

Background

Mobile application development differs from traditional software development in a number of ways. Application developers have to consider a variety of factors, including the battery life of the host device, the unique lifecycle of mobile applications with frequent interruptions, and the relatively limited hardware capability of mobile devices [15]. Mobile applications also tend to be much smaller than traditional desktop applications and more reliant on third-party libraries and application programming interfaces (APIs) [16]. The combination of these unique aspects of mobile development and the large potential audience of mobile applications makes the field a rich environment for research.

One of the largest challenges of mobile application development is overcoming the fragmentation of the global market. The marketplace has largely narrowed down to two competing mobile platforms, Android and iOS. Both platforms have their own platform-specific characteristics, such as different user interface styles and guidelines, different development environments and SDKs, and different app store policies regarding minimum expectations [4], [15], [17]. This makes development of native applications that run on both major platforms effort intensive and potentially unfeasible for small development teams. In order to reach the largest possible audience, and therefore achieve the highest return on investment for the development effort, this challenge has to be overcome.

There are two approaches to this challenge: developing separate native applications from scratch for each platform or utilizing a cross-platform development framework. Francese, Gravino, Risi, Scanniello and Tortora [12] surveyed mobile development engineers and found that a majority of teams developed separate native applications rather than trying to utilize a cross-platform solution. This approach is potentially the most expensive in terms of effort, as there is little opportunity to reuse business logic or user interface code between the two platforms. As Joorabchi, Ali and Mesbah [18] found, creating separate applications can also result in a variety of inconsistencies appearing in the deployed applications. Smaller development teams lacking large budgets would want to minimize their efforts and costs, and ideally a cross-platform application development framework would give them the ability to meet those goals.

Cross-Platform Development Approaches

A wide variety of cross-platform development approaches and tools are currently available for use. An extensive amount of research has been done to try to evaluate the available frameworks and provide best practices for developing applications for multiple mobile operating systems. Xanthopoulos and Xinogalos [19] categorized cross-platform development approaches into four broad categories: web applications, hybrid applications, interpreted applications and generated applications. These categories will be examined in more detail later in this section. Each type of approach has its own benefits and drawbacks, and choosing the best approach for a particular application depends on a number of factors. Those factors include the ease of deploying applications to multiple platforms, the hardware and data access requirements of the target application, the preferred programming language and environment of the development team [17], and the importance of a native “look and feel” for the application [19]. Ciman, Gaggi and Gonzo [20] add that the licensing of the different tools and the available community support are also important factors.

Web Applications

Creating a mobile-friendly web application is probably the most easily portable approach to cross-platform development. Most mobile devices now have web browsers installed that support newer web standards such as HTML5 and modern CSS and JavaScript. This approach allows developers to quickly support different mobile platforms and even potentially desktop browsers as well. There are a number of drawbacks, however, including the fact that mobile web sites cannot be installed on a device or sold through app stores, and they lack access to many of the hardware features of the device [10].

A newer version of the web application approach has been developing over the past few years. Majchrzak, Biorn-Hansen and Gronli [21] described Progressive Web Applications (PWAs) and their advantages over traditional mobile websites. PWAs have the potential to be cached on a device for offline use, and they can access hardware features through the use of bridges. However, this newer approach still lacks consistent browser and platform support, and it remains to be seen if the platforms will embrace PWAs given that the applications are not sold through their marketplaces.

Hybrid Applications

The hybrid cross-platform development approach is a combination of the web application approach and developing an integrated application. The best-known hybrid cross-platform tool is Apache Cordova, which allows application engineers to use web technologies to create applications and host them in a native WebView widget. The hybrid approach allows a similar level of portability across platforms as web applications, with the added ability for an application to be sold through app stores and installed on devices. Hybrid applications also have greater access to the features of a device through JavaScript bridges. The reduction of development effort potentially comes with a performance penalty and a lack of native user interface look and feel. Malavolta, Ruberto, Soru and Terragni [22] examined the user reviews of applications created through the hybrid approach and found that Android users perceived their user experience and performance to be poorer compared to natively developed applications. A similar study done by Mercado, Munaiah and Meneely [23] found that hybrid applications were more prone to user complaints regarding usability, reliability and performance. Two big reasons for the perceived difference in overall quality are the potential added overhead from the use of JavaScript bridges and the fact that the user interfaces of the applications do not use native user interface elements. If the performance needs and complexity of an application are not large factors, and meeting a native look-and-feel user interface standard is not a requirement, this approach can still be a good option even with its limitations [11].

Interpreted Applications

The interpreted and generated cross-platform development approaches both generate applications that perform and behave closer to natively developed applications than either mobile web applications or hybrid applications. The interpreted approach involves translating source code to platform-specific instructions in real time [5]. A runtime or virtual machine is packaged with the source code to handle the translation and generate native user interfaces. Titanium, Xamarin and Flutter fall into this category. Interpreted applications tend to have larger memory footprints and overhead due to the included interpretation frameworks but perform and look similar to native applications [6].

Generated Applications

The generated approach involves compiling source code to native byte code that can be run without a virtual machine or interpreter. Examples of this approach include XMLVM [24] and ICPMD [25]. The advantage of this approach is that the generated applications are fully native applications and therefore do not suffer from the user interface and performance issues that the other cross-platform approaches introduce. This approach is still largely experimental, however, as cross-compiling code from one platform to another is incredibly complex, and APIs and features need to be mapped precisely in order to be supported.

Evaluating Cross-Platform Development Frameworks

A number of cross-platform evaluation frameworks have been developed in order to help choose the approach and tool best suited for a particular application. Ahti, Hyrynsalmi and Nevalanien [9] describe a framework that looks at both subjective measures (user experience, application appearance and easiness of development) as well as quantitative measures such as application starting time, memory usage and disk space requirements. Dhillon and Mahmoud [7] break their framework into three phases: the capabilities of a cross-platform development tool, the performance metrics of the tool, and the development experience the tool provides (including available support and the learning curve of the tool). Heitkötter, Hanschke, and Majchrzak [10] created two sets of criteria, infrastructure and development. The infrastructure criteria include the licensing and cost of the tools, the platforms that are supported, the long-term support of the tool, and the ability of the tool to provide the look and feel users expect. Among the development criteria are the development environment provided, the maintainability of the generated applications and the ability to move to another framework in the future with minimal effort.

The type of application being developed may be the biggest factor in choosing the appropriate framework. Applications requiring a lot of graphical support and animations will likely require a framework that provides better performance and lower overhead [20]. Graphics-intensive games should realistically be developed using the native SDKs instead of a cross-platform framework [21]. Lightweight data-driven applications may be a good fit for the hybrid approach. The hybrid approach may also be ideal for applications that are intended solely for intra-enterprise use and need to be developed quickly [9]. Ohrt and Turau [26] argue that the most important considerations are whether a framework satisfies the developer’s needs and whether the deployed application meets user expectations.

There are a number of limitations when it comes to evaluating the available cross-platform development frameworks and approaches. According to Majchrzak and Gronli [27], any research done can only be a snapshot of the current field, as the number of available tools keeps expanding and the existing tools continue to evolve. The overwhelming number of development approaches makes it an impossible task to evaluate all of the tools at the same time. These factors make it likely that this area of software engineering will continue to be a rich source for future research.


CHAPTER III

METHODOLOGY

Overview

The goals of this investigation include determining the impact a cross-platform development framework has on both the development effort required to create an application and on the performance and perceived quality of the generated applications. In order to explore the research questions of this thesis, four prototype applications were developed and analysis was conducted in several different areas. The first area included analyzing the source code and files needed to develop the applications, and the second area involved the profiling and evaluation of the generated applications to gauge their relative performance and user experience. In addition to the quantitative analysis performed, the experience of producing the different applications also provided insight and information about a number of qualitative differences between the development experiences of the different approaches.

As stated in the introduction of this thesis, there are many different cross-platform development approaches available to mobile developers. In order to make the research manageable, two frameworks were chosen for comparison, not only against each other but against the use of the native platform SDKs as well. The two frameworks chosen were Flutter and Apache Cordova (using the Ionic UI framework). This chapter will detail the application prototype developed for this investigation and the development process for each framework’s prototype. Native Android and iOS application prototypes were also developed to serve as points of comparison for the cross-platform prototypes.

Task Station

The application prototype developed for this research is a simple task management application called Task Station. The application consists of several pages, including a menu page, a page listing all existing tasks in the database, a page for creating new tasks or editing existing tasks, and a page to see the full details of a single task. A transition diagram of the application is shown in Figure 1. The application also includes the following features: the ability to access the camera of the target device, the ability to access the networking features of the device in order to retrieve data from a remote source, the ability to write to the local storage on the device, and the ability to use Google’s Firebase suite to store and retrieve data and images from the cloud. These feature requirements provided the ability to investigate a variety of areas to get a comprehensive look at the performance characteristics of the application and any limitations of each development approach. The development of an application more complex than a standard “hello world” application also provided a realistic codebase to examine.

Task Station uses Google’s Cloud Firestore to store the task information for users. Cloud Firestore is a NoSQL online database that also features offline device data persistence and automatic cloud syncing. Google’s Firebase Storage is used as a storage bucket to allow users to upload and download photos attached to tasks. The Firebase suite is available for a variety of systems, including Android, iOS, web (JavaScript) and Flutter (through first-party plugins). The use of Firebase made it possible to have a consistent, simple data and file storage system across the four development platforms.

Fig. 1. Transition Diagram of Task Station

Menu Page

The menu page of Task Station is the first screen that users see upon opening the application. Figure 2 shows the user interface of the menu page. The page consists of the application’s logo, a button to navigate to the view all tasks page, a button to navigate to the add task page, and a button to sign in to the application or sign out if a user is already signed in. A small text field is located under the authentication button to display the current user’s name if a user is currently authenticated through Firebase. If there is no currently authenticated user, the navigation buttons are disabled and the username text field is hidden.


Fig. 2. Menu Page of The iOS Flutter Application

View All Tasks Page

The view all tasks page of the application allows the user to view a list of the tasks they have saved in the Firestore cloud database. The user interface for the view tasks page is shown in Figure 3. Tasks that have due dates in the future are displayed with a sky blue background. Overdue tasks, with due dates before or equal to the current date, are displayed with a red background. Tasks that have been marked complete and not deleted are given a grey background. By default, tasks that have been marked as completed are not displayed in the list.


Fig. 3. View Tasks Page of the iOS Flutter Application

There are a few different ways that a user can filter the tasks that are displayed in the list. The first filtering widget on the page is a select widget that contains a list of task categories (work, school, home, pets, etc.). The default value of this widget is to display tasks from every category. By setting a category filter, users can narrow down the list of displayed tasks. The second method of filtering tasks is setting a date range. The categories for the date range select widget are “all”, “today”, “this week”, and “this month”. The default value is set to view tasks from any date range. Selecting a category other than “all” will only display tasks satisfying the criteria of the selected range. The third and last filtering widget is a checkbox or a switch that allows the user to show or hide tasks that have been marked completed. The default behavior is to hide completed tasks. When a user changes the value of any of the three filtering widgets, the application updates the visible list of tasks on the page. The filtering options can all be used together. For example, a user can choose to view tasks that are marked with the “school” category that are due on the current date while hiding completed tasks.
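The combined filtering behavior described above can be summarized as a single predicate applied to each task. The sketch below is illustrative only: the `Task` and `FilterState` shapes, the function name, and the exact boundaries of the "this week" range are assumptions, not the prototypes' actual code (each framework implements this differently).

```typescript
// Hypothetical shapes standing in for the prototypes' task model.
interface Task {
  name: string;
  category: string;
  dueDate: Date;
  completed: boolean;
}

interface FilterState {
  category: string; // "all" or a specific category such as "school"
  dateRange: "all" | "today" | "this week" | "this month";
  showCompleted: boolean;
}

// Returns true only if the task passes all three filters at once.
function isVisible(task: Task, filter: FilterState, now: Date): boolean {
  if (!filter.showCompleted && task.completed) return false;
  if (filter.category !== "all" && task.category !== filter.category) return false;

  const start = new Date(now.getFullYear(), now.getMonth(), now.getDate());
  let end: Date;
  switch (filter.dateRange) {
    case "today": // due at any time today
      end = new Date(start.getTime());
      end.setDate(end.getDate() + 1);
      break;
    case "this week": // assumed here to mean the next seven days
      end = new Date(start.getTime());
      end.setDate(end.getDate() + 7);
      break;
    case "this month": // through the end of the current calendar month
      end = new Date(now.getFullYear(), now.getMonth() + 1, 1);
      break;
    default:
      return true; // "all": no date restriction
  }
  return task.dueDate.getTime() >= start.getTime() &&
         task.dueDate.getTime() < end.getTime();
}
```

Because each filter is an independent boolean test, the three widgets compose naturally, matching the "school tasks due today, completed hidden" example above.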

The tasks in the list are sorted first by due date and then by priority. Tasks with a higher priority value are displayed before tasks with lower priority values that are due on the same day. Each row in the list displays the name of the task, the task’s assigned category and the task’s due date. In order to view more details of a task, the user can tap a row on the list to navigate to the task details page for the task in that row.
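The sort order just described, by due date first and then by descending priority for tasks due on the same date, can be expressed as a comparator. The field names below are illustrative assumptions; due dates are compared here as stored timestamps, on the assumption that tasks due the same day share the same date value.

```typescript
// Hypothetical row shape for the view all tasks list.
interface TaskRow {
  name: string;
  dueDate: Date;
  priority: number; // 1 (lowest) to 10 (highest)
}

// Earlier due dates first; for equal due dates, higher priority first.
function compareTasks(a: TaskRow, b: TaskRow): number {
  const byDate = a.dueDate.getTime() - b.dueDate.getTime();
  if (byDate !== 0) return byDate;
  return b.priority - a.priority; // descending priority
}
```

A list would then be ordered with something like `rows.sort(compareTasks)` before rendering.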

View Task Details Page

Users can view all of the information associated with a task on the task details page. The selected task’s name, category, due date, priority value and recurrence information are displayed on the page. If a photo has been taken and saved with the task, the photo is displayed to the user. The user interface for this page can be found in Figure 4.

Task Station has a very simple photo syncing system for the case in which a user is logged into the same account on multiple devices. If a task has a saved photo associated with it but the photo is not located in the local storage of the device, the application attempts to download the photo from Firebase Storage. If a task has a photo that is located on the device, the application checks to see if the photo is saved to the cloud. If the photo is not located in the cloud storage, the application attempts to upload the photo. One limitation of this simple system is that if a photo is deleted from the cloud, it is not deleted from the local storage of any other devices that have already downloaded the photo.
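The two-way check described above amounts to a small reconciliation rule. The sketch below uses a hypothetical `PhotoStore` interface standing in for local device storage plus Firebase Storage; all names are assumptions for illustration, not the prototypes' actual API.

```typescript
// Hypothetical abstraction over local files and the cloud storage bucket.
interface PhotoStore {
  hasLocal(photoId: string): boolean;
  hasRemote(photoId: string): boolean;
  download(photoId: string): void; // cloud -> device
  upload(photoId: string): void;   // device -> cloud
}

type SyncAction = "download" | "upload" | "none";

// Mirrors the rule in the text: pull a missing local copy, push a missing
// remote copy, and do nothing when both sides (or neither side) have it.
function syncPhoto(store: PhotoStore, photoId: string): SyncAction {
  const local = store.hasLocal(photoId);
  const remote = store.hasRemote(photoId);
  if (!local && remote) { store.download(photoId); return "download"; }
  if (local && !remote) { store.upload(photoId); return "upload"; }
  // Note: a photo deleted from the cloud but still present locally is
  // treated as a missing remote copy and re-uploaded, which is one way
  // the simple scheme avoids propagating deletes.
  return "none";
}
```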

Fig. 4. Task Details Page of The iOS Flutter Application

There are two buttons located towards the bottom of the task details page. The edit task button will navigate the application to the add/edit task page in edit mode to allow the user to edit the selected task. The second button is displayed with the title “complete task” or “uncomplete task”. If a task is not marked as complete, the user can tap the complete task button to initiate the appropriate completion behavior for the task. If the user selects a task that is marked as complete, the “uncomplete task” button will set the completed value of the task to false. After taking the appropriate action, the application will navigate back to the view tasks page.

Tasks can be set to recur when the complete button is tapped on the task details page. A task can have its recurrence status set to “none”, or it can be given a time frame (days, weeks, months or years) and a number. For example, a task can be set to recur in 3 days or 2 weeks. This feature allows a user to track tasks that need to be done repeatedly without having to re-enter them. If a task has the recurrence category of “none”, it will simply be set as complete; it will remain in the database, but by default it will not be displayed in the task list. If a task is set to recur, its due date will be adjusted according to its recurrence category and number.
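Advancing a recurring task's due date by its time frame and number can be sketched as follows. The function and unit names are illustrative assumptions; month and year arithmetic is delegated to the date library's calendar-aware setters rather than fixed day counts.

```typescript
type RecurrenceUnit = "days" | "weeks" | "months" | "years";

// Returns a new due date advanced by `count` units from the old one,
// e.g. "recur in 3 days" or "recur in 2 weeks".
function nextDueDate(due: Date, unit: RecurrenceUnit, count: number): Date {
  const next = new Date(due.getTime()); // copy; don't mutate the input
  switch (unit) {
    case "days":   next.setDate(next.getDate() + count); break;
    case "weeks":  next.setDate(next.getDate() + 7 * count); break;
    case "months": next.setMonth(next.getMonth() + count); break;
    case "years":  next.setFullYear(next.getFullYear() + count); break;
  }
  return next;
}
```

On completion of a recurring task, the application would replace the task's due date with this computed value instead of marking the task complete.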

Add/Edit Task Page

The fourth and final page of Task Station is the add/edit task page. This page

is used for two different situations. Users can either navigate directly to this page from

the menu page to add a new task to the database or they can navigate from the task details

page and edit an existing task. The user interface of the page will be slightly different

depending on whether the page is viewed in add mode or edit mode. Figure 5 shows the user

interface for the add/edit task page in add mode.

In add mode, the page will be displayed with its widgets set to default values.

The first widget is the task name text entry field. This field is used to provide a title or

name for the task. The next widget is a select widget used to assign a category for the

task. A date selection widget is provided to prompt the user to select a task due date.

Next, a text area allows the user to provide a description of arbitrary length. The take

photo button will launch the device’s camera screen to capture and save a photo that is

paired with the task. Once a user takes a photo, the take photo button changes to a delete

photo button. If the button is tapped after a photo has been taken, the application will delete the photo from memory and the take photo button will be displayed again.

Fig. 5. Add/Edit Task Page of the iOS Flutter Application

Under the photo button, a slider widget controls the priority value of the task.

Each task has a priority value between one and ten. Tasks with higher priority values are

displayed first in the list on the view all tasks page. The last data field on the page

controls the recurrence behavior of the task. By default, the recurrence value is set to

none. If the checkbox is checked, a new section appears with a text entry field for a

number and a select widget for the recurrence category. At the bottom of the page is the

add task button. Tapping this button will cause the application to validate the information

provided for the task and save the task to the database. The user will then be navigated back to the view all tasks page.
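The priority ordering described above (values one through ten, higher values shown first on the view all tasks page) reduces to a simple descending sort. The `Task` shape here is a hypothetical sketch, not the data model used in the prototypes.

```typescript
interface Task {
  name: string;
  priority: number; // 1 (lowest) through 10 (highest)
}

// Return a copy of the task list ordered so that higher-priority
// tasks appear first, as on the view all tasks page. (Sketch only.)
function byPriorityDescending(tasks: Task[]): Task[] {
  return [...tasks].sort((a, b) => b.priority - a.priority);
}
```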

If this page is viewed in edit mode, the values of each of the data entry widgets will be set according to the data of the selected task. In addition to the difference in default values, the user interface of the page also has some behavioral differences. If a photo has been taken previously for the selected task, the photo will be displayed and the photo button will be set to “delete photo”. Deleting a photo in edit mode will remove the photo from local storage and cloud storage if necessary. The user is first prompted by an alert dialog that the photo will be deleted before any action is taken.

At the bottom of the page, two buttons will be visible: the edit task button and the delete task button. When the user is finished making changes to the task, they can tap the edit task button to save the changes to the database and navigate back to the view tasks page. If a user taps the delete task button, an alert dialog will warn them that they are attempting to delete data. If the user confirms their request, the task’s data will be deleted from the database and the application will navigate to the view all tasks page.

Development Hardware Devices

Xcode is required to develop applications for iOS devices, and Xcode is only available for Apple computers running macOS. To meet this requirement, a Mac was chosen as the development machine: a 2012 series MacBook Pro running macOS High Sierra 10.13.6. To get a realistic sense of the performance and behavior of the completed applications, the decision was made to use real devices

instead of relying on software emulators. A 2016 Samsung Galaxy S7 running Android 8.0.0 was used for the Android device and a 2013 iPhone 5s running iOS 11.4.1 was used for the iOS device.

Development Framework Environments

Native Android Development

The native Android version of Task Station was developed with the Android

Studio integrated development environment (IDE). Android Studio version 3.1.4 was

used for both the writing and the compilation/building of the application. Version 26.1.1

of the Android SDK, version 28.0.0 of the Android SDK platform tools and version 1.8

of the Java Development Kit were used in the build process. The primary programming

language for native Android development is Java; XML is used to define the user interface layouts and configuration information.

Native iOS Development

Xcode 9.4.1 was used as the IDE for the development of the native iOS

application. Swift 4.1 was used as the primary programming language. User interface

development was managed through the use of Xcode’s visual layout editing tools called

storyboards.

Flutter

Android Studio 3.1.4 was used as the IDE for the development of the Flutter applications. The primary programming language was Dart 2.0. Version 0.5.1 of Flutter was used along with version 173.4700 of the Dart Android Studio plugin and version

27.1 of the Flutter Android Studio plugin. The same Android platform tools that were used for the native Android application were used for the compilation of the Android

Flutter Runner application. Xcode 9.4.1 was used for the compilation of the iOS Flutter

Runner application.

Apache Cordova and the Ionic Framework

The IDE used for developing the Apache Cordova with the Ionic framework application was Visual Studio Code 1.25.1. Cordova 8.0 was used for compiling the

Cordova applications along with version 7.0.0 of the Cordova Android Platform tools and version 4.5.5 of the Cordova iOS Platform tools. Version 3.9.2 of the Ionic framework was used with version 4.1.0 of the Ionic command line interface tools. The programming and markup languages used included TypeScript 5.2.11, HTML and CSS. The same Android Studio, JDK, Android SDK and platform tools that were used for the native Android application were used to compile the Android Cordova application. The same Xcode tools that were used for the native iOS application were used for the iOS

Cordova application.

Source Code Evaluation

Analyzing the source code of a mobile application can provide a lot of

information about the amount of development effort it takes to write the application and

how much effort it will take to maintain the application over time. Syer, Adams, Zou and

Hassan [28] examined the source code of Android and BlackBerry native applications and

focused on two key dimensions, the characteristics of the source code and the number and

type of dependencies that applications for each platform required. When examining the

source code of the applications, they used the metrics of the number of files containing source code, the number of classes in the application and the total lines of code in the application. In theory, applications that have larger codebases and more files will require more effort to develop and maintain. The number of library dependencies that an application has can make it less portable to other frameworks and it could also add performance overhead and instability over time as the libraries change.

This investigation’s analysis of the source code of the different generated

application prototypes includes an examination of the number of lines of code, the number

of user managed files and the number of dependencies of the source code. The platform

portability, how much of the source code can be reused in order to develop applications for

both Android and iOS, was also examined. The source code metrics tool Tokei was used to

count the lines of code in the user managed source files and main configuration files for

each prototype. In order to determine the number of dependencies each prototype required,

the configuration files (build.gradle files for Android, podfiles for iOS, pubspec.yaml files

for Flutter and package.json files for Apache Cordova and Ionic) were examined and the

required libraries were counted in each file.
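For the package.json case, the count corresponds to the entries declared under the file's dependency field. A minimal sketch of that counting step is shown below; whether development-only dependencies were also counted in the thesis is not stated, so this sketch looks only at the standard "dependencies" key.

```typescript
// Count the runtime libraries declared in a parsed package.json object.
// (Illustrative sketch of the dependency-counting step described above;
// devDependencies are deliberately excluded here as an assumption.)
function countDeclaredDependencies(pkg: { dependencies?: Record<string, string> }): number {
  return Object.keys(pkg.dependencies ?? {}).length;
}
```

An analogous manual count was performed for the build.gradle, podfile and pubspec.yaml files of the other prototypes.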

Application Characteristics and Performance Profiling

Substantially more research has been done on comparing the benchmarking

results of applications developed using cross-platform frameworks. Ahti, Hyrynsalmi and

Nevalainen [9] presented an evaluation framework that used the quantitative measures of

application starting time, device random-access memory (RAM) usage and installed

application size. Another study by Willocx, Vossaert and Naessens [6] examined the

response time of an application as well as its CPU usage, RAM usage and required disk space

(both application package size and installed application size). Majchrzak and Gronli [27] took their analysis a little deeper and studied the comparative performance of applications when they performed device-level operations such as database manipulation or using the device’s camera.

After the completion of the four application prototypes, a series of application profiling tests were run on all of the applications on both an Android device and an iOS device. The goal of these tests was to collect information on the following metrics: the size of the application package before installation, the memory footprint of the application when installed, the application’s use of the device’s RAM while running, the startup time for each application from the user’s point of view and the response time for view transitions. This data provides answers to the question of whether a performance penalty exists when a cross-platform development approach is used when compared with a natively developed application.

Application Package Size and Installed Footprint

Determining the size of the Android application packages was done by examining the output folders of each project after the latest build was generated. After a project is built, Android Studio produces an .apk file that is used to install the application onto a target device. For the purposes of this experiment, all three of the application packages examined were from debug builds. This means that they are not necessarily as compressed as the final application release builds and they contain debugging information that allows a user to debug or profile the application running on a device or emulator. After locating each .apk file, the file size was gathered by examining the file’s properties.

The memory footprint of the application after it was installed on an Android device was measured by reviewing the storage settings of the target device. Only the actual size of the installed application was measured; any document or cache data associated with the application was ignored.

The method of collecting the package file size for the iOS applications is not as precise as the one used for the Android applications. Apple only allows developers to upload applications if they have a paid developer’s account, and the final packaging of an application happens during the upload process. Because it was not possible to access a final .ipa package file for the iOS applications, an estimation was used instead. This estimation was determined by archiving each project in Xcode using a profiling build type. After each project was archived, the contents of the archive were examined and the payload application file inside the archive was moved to a separate folder and compressed. The compressed file is not necessarily the same size as a final App Store package but it provides an estimation of the formally packaged file.

The method for determining the final installed size of an application on an iOS device was similar to the method used for the Android applications. The storage settings of the iPhone were examined and only the size of the application was measured.

All associated data and cache information was ignored.

Memory Usage Profiling

Due to the fact that this research involved producing applications for two

different operating systems, a strategy for profiling the RAM usage of each application

had to be developed separately for each platform. The profiling of the Android

applications was completed using the Android Profiler tool packaged with Android

Studio. Apple’s Xcode Instruments profiling tool was used for the testing of the iOS

applications.

For each application, two measurements were taken: the RAM usage after the application was launched and the menu page was active, and the RAM usage after the application navigated to the view tasks page.

Both profiling tools displayed the RAM usage of the application as a chart that ran for the length of the application’s active lifetime. The measurements were taken at the point that the RAM usage leveled off after initial shifts during either the launch of the application or the transition to a new page. For the purposes of this experiment, all of the profiled applications were signed into the same account and used the same dataset of tasks. The profiling process was repeated a total of three times for each application. The purpose of the series of tests was to determine the comparative memory requirements of each application and to determine if there was a performance penalty due to any overhead involved in the non-native applications.
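Since the measurements were read off once usage stabilized, the “leveled off” point can be characterized as the first sample after which all subsequent readings stay within a small tolerance of one another. The sketch below formalizes that criterion; the sample values and tolerance are illustrative, not profiling data from this experiment.

```typescript
// Return the index of the first RAM sample after which every later
// reading stays within `toleranceMb` of the others, i.e. the point
// where usage has leveled off. Returns -1 only for an empty series
// (the final sample alone always has a range of zero).
function levelOffIndex(samplesMb: number[], toleranceMb: number): number {
  for (let i = 0; i < samplesMb.length; i++) {
    const tail = samplesMb.slice(i);
    const range = Math.max(...tail) - Math.min(...tail);
    if (range <= toleranceMb) return i;
  }
  return -1;
}
```

In practice the reading in this experiment was taken by eye from the profiler charts; a criterion like this simply makes the informal “leveled off” judgment explicit.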

Application Start-up Time and View Transition Time

There are a variety of different strategies that could be used to measure the time it takes for an application to launch and be responsive and the time it takes for an application to navigate between two different pages or views. One strategy is to include a

series of console logging statements that print out a time value that is compared to a previous measurement taken before launching the application or changing views. This strategy could potentially give researchers an accurate measurement of time but it can be difficult to standardize the implementation of the strategy across multiple different development frameworks. Including the console logging statements is also potentially risky because their inclusion changes the application itself and the test is not necessarily measuring the original application’s code. This value would still be useful for comparison but it would not be an absolute measurement of time.
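The logging strategy described above would look roughly like the following, shown here as a TypeScript sketch with illustrative names; each framework would need its own equivalent, which is precisely the standardization difficulty the text notes.

```typescript
// Named timestamps recorded at points of interest in the application.
const marks = new Map<string, number>();

// Record a named timestamp, e.g. mark('launch-start') before launch
// and mark('menu-visible') once the menu page is rendered.
function mark(label: string): void {
  marks.set(label, Date.now());
}

// Log and return the elapsed milliseconds between two recorded marks.
function elapsedMs(startLabel: string, endLabel: string): number {
  const start = marks.get(startLabel);
  const end = marks.get(endLabel);
  if (start === undefined || end === undefined) {
    throw new Error('both marks must be recorded first');
  }
  const ms = end - start;
  console.log(`${startLabel} -> ${endLabel}: ${ms} ms`);
  return ms;
}
```

The caveat from the text applies directly: adding these statements changes the very application being measured, so the numbers are comparative rather than absolute.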

Another potential strategy is to measure startup or transition time according to

the user’s perspective by capturing video of the process of launching the application and

navigating between views. For the purposes of this experiment, a video was made for

each of the six applications (native Android, Flutter Android, Cordova Android, native

iOS, Flutter iOS and Cordova iOS). The videos were produced by filming the screen of

the test devices with an external camera. Each video begins with footage of a user’s

finger tapping the appropriate application’s icon on the device’s screen. After the

application icon is tapped, the application starts to launch and first displays a splash

screen. After the splash screen fades away, the initial menu page is shown followed by

the user’s finger tapping the add task button and navigating to the add/edit task page in

add mode. After the add/edit task page loads, the user navigates back to the menu page by

either tapping the hardware back button on the device or tapping the backward navigation

button in the application’s navigation bar.

After all of the videos were filmed, the footage was edited into clips using

Apple’s iMovie application. Each video was edited into three clips as follows:

1. Starting with the point of contact of the user’s finger with the application’s icon

up until the application’s splash screen fades and the menu page is fully loaded

and visible.

2. Starting with the point of contact of the user’s finger with the add task button on

the menu page up until the add/edit task page is fully loaded and visible.

3. Starting with the point of contact of the user’s finger with the appropriate

backward navigation button up until the menu page is fully visible again.

After the editing process was completed, the length of each clip was examined

in iMovie and recorded. The complete process was repeated so that each application was

filmed and the length of the edited clips was recorded a total of three times. In order to

make the device’s environment as consistent as possible, no other applications were

running in the foreground while each application was launched and each application was

removed from active memory after each filming period was complete.

There are a couple of limitations to this strategy. The first limitation is that it is difficult to pinpoint the exact moment in the footage when an event occurs, so the editing of the clips can be subjective. The second limitation is that iMovie does not provide precise time measurements for the length of a clip; it only displays the length in terms of seconds (e.g., 1.2 seconds, 1.5 seconds). These limitations prevent the strategy used in this experiment from producing precise time values for each activity measured, but it does provide a useful comparison point from the perspective of the user’s experience with the application.

Development Process Qualitative Differences

There are a number of differences in the development experiences of the approaches and frameworks explored in this paper. In addition to the source code analysis and application profiling described in this chapter, the following aspects of mobile software development will be described thoroughly in Chapter IV:

• User interface elements and styling

• User interface design

• Learning curve

• Debugging strategies and tools

• Live reload behavior

• Device filesystem and hardware access

• Google Firebase support

• Licensing and cost

• Application splash screens and icons

• Application compilation and deployment

The process of developing the four prototype applications provided details for

the different qualitative categories listed. These categories are important areas to explore

in order to come to conclusions about the research questions of this thesis.

Summary

The combination of these areas of analysis provided useful information and

helped to shed light on the research questions of this thesis. The analysis of the source code

and development process helped to determine the impact a development approach has on the effort required to develop and maintain an application for multiple platforms. The application performance profiling portion of this experiment helped to answer the question of whether the generated applications have any noticeable performance or quality characteristics based on the approach used to develop them. The qualitative comparisons of the different development approaches round out the larger picture of the development experience for each approach. The results of these analyses will be reviewed in the next chapter.


CHAPTER IV

RESULTS AND DISCUSSION

Overview

With so many cross-platform development framework options available, it is important to thoroughly research development tools before choosing the best option for a project. Do the available tools provide the device hardware access that the application requires? Do the available tools save development time and effort in producing applications for multiple platforms versus producing native applications for each platform? Are any measurable performance or memory footprint penalties introduced into applications developed using cross-platform frameworks? What other impact does the choice of development framework have on both the application development process and the characteristics of the deployed application? This experiment was intended to find answers to these questions through the process of evaluating two of the cross-platform frameworks currently available, Flutter and Apache Cordova with the Ionic framework.

After completing the development of the four Task Station prototypes (native

Android, native iOS, Flutter and Apache Cordova with the Ionic framework), a series of source code, memory footprint and performance profiling evaluations were performed.

The source code evaluations helped to determine the impact that each strategy had on the development experience in terms of initial effort and long term maintainability. The application characteristics and performance profiling evaluations were used to examine

the impact that each strategy had on the performance and perceived quality of the deployed applications. The ideal goal of a developer is to minimize the effort it takes to create and deploy software without sacrificing the quality of their products.

There are also a number of key differences between the development experiences of using Flutter or Apache Cordova and developing native applications for

Android and iOS devices that cannot be easily measured through tests. Reviewing the available documentation of each framework and using the framework tools to develop the

Task Station prototypes provided insight into these differences and the impact they have on the developer. Some of these qualities may be of a subjective nature due to the preferred programming language and development environment of the developer. Other areas such as the documentation available for the frameworks and the integration of existing software libraries such as Google Firebase are important things to consider to determine if the tools are sufficient for the project being designed. In this chapter the results of the quantitative evaluations will be detailed along with the qualitative features and characteristics of each development strategy.

Source Code Evaluation

The static source code evaluation portion of this thesis was focused on the number of lines of code written for each prototype, the number of user managed files for each prototype and the number of external dependencies for each prototype. By examining these measurements it is possible to draw some conclusions about how the choice of a development framework or strategy impacted the effort required to deploy applications for both Android and iOS devices. Table 1 displays the lines of code,

number of user managed files and external dependencies for each of the four development frameworks. Table 2 combines the data for the native Android and iOS prototypes to give a better look at the differences in strategies between using cross-platform development frameworks and writing native applications for both platforms.

Table 1 – Source Code Lines of Code, Number of Files and Dependencies

Application       LOC     Files   Dependencies
Native Android    1,870   24      4
Native iOS        1,534   12      6
Flutter           1,267   16      7
Cordova/Ionic     1,123   27      36

Table 2 – Source Code Comparison of Native vs. Cross-Platform

Application       LOC     Files   Dependencies
Native Combined   3,404   36      10
Flutter           1,267   16      7
Cordova/Ionic     1,123   27      36

Lines of Code

The number of lines of code written for an application provides a metric to

evaluate the comparable effort it takes to write a piece of software. The data can also be

used to determine how easy it would be to maintain the software throughout its lifespan.

Lines of code is not a perfect measurement as some of the differences in count between

programs could be due to the code style of the framework. Generally speaking, if one codebase has more lines of code than another, it could be said that it took more developer effort to write; because there is more code to maintain, it could also require more effort to revise or update in the long term [29].

According to the data in Table 1, the native Android application required the most lines of code written with 1,870 lines. In part, this could be attributed to Android requiring more lines of code to connect elements from the user interface to the application code and the fact that Android separates the layout functionality into separate

XML files for each page of the application. Both the Flutter and Cordova prototypes required fewer lines of code than their native platform counterparts. This is especially impressive when the numbers for the native prototypes are combined. According to Table

2, the strategy of developing two native applications required a total of 3,404 lines of code while the cross-platform framework strategies required close to 1,200. This means that by using Flutter or Cordova the number of lines of code to write or maintain is only a little more than a third of the lines required for producing separate native applications for both platforms. The Flutter application codebase was a little more verbose than the

Cordova codebase but not by a significant margin. Both cross-platform frameworks had codebases that were shared 100% between the Android and iOS applications, with the exception of the platform-specific configuration/manifest files. This seems to suggest that the ideal of “write once, run anywhere” code is possible when using Cordova or Flutter as the development framework.
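The “a little more than a third” claim follows directly from the Table 2 figures, as this small calculation shows:

```typescript
// Share of the combined native line count (Table 2) that each
// cross-platform codebase represents.
const nativeCombinedLoc = 3404;
const flutterShare = 1267 / nativeCombinedLoc; // ≈ 0.37
const cordovaShare = 1123 / nativeCombinedLoc; // ≈ 0.33

console.log(`Flutter: ${(flutterShare * 100).toFixed(1)}% of the native total`);
console.log(`Cordova/Ionic: ${(cordovaShare * 100).toFixed(1)}% of the native total`);
```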

User Managed Files

Each development framework has its own project structure. Applications

developed using the native Android SDK are broken down into Java class files, and XML

layout and resource files. Native iOS applications are made up of Swift class files and

Storyboard layout files. Flutter applications combine the functionality and layout of the

program into Dart class files. Ionic/Cordova programs are broken down into TypeScript

class files, HTML files for layouts and CSS files for styling details. If a framework requires developers to maintain a larger number of files, it may lead to a cleaner separation of concerns but it could also mean that the project is more complex to manage.

For the purposes of this thesis, a user managed file is a file that a typical developer would either create or edit in a project. All of the development frameworks examined had a number of files that were generated or modified automatically during the build process.

Table 1 includes a count of user managed files for each of the four application prototypes. The Cordova project contained a total of 27 files; this increase is due to the fact that the code for each page of the application is broken down into separate TypeScript, HTML and CSS files. By comparison, the Flutter project required 16 files.

The Cordova/Ionic project structure has the advantage of better separating the concerns of the layout and behavior of an application page but it can also result in a larger number of files to manage as projects grow.

Table 2 provides a comparison of the number of user managed files for

Flutter or Cordova against developing separate native applications for Android and iOS.

Both of the cross-platform frameworks provided a significant reduction in the number of

project files that a developer would have to manage. After examining the lines of code

and the number of files for each project, it is clear that a cross-platform approach can

reduce the overall effort required to deploy software to both platforms.

Dependencies

Applications that rely on a large number of external dependencies can potentially be more difficult to maintain in the long term [30]. As software libraries

continue to evolve, changes can be introduced that result in bugs, errors or incompatibilities in applications that use them. There is also the potential that external libraries could introduce security vulnerabilities that the developer may not be aware of [31]. This experiment included an evaluation of the number of required dependencies of the codebases of each Task Station prototype.

The number of dependencies for each version of Task Station can be found in

Table 1. The native Android and iOS applications required little beyond the Google

Firebase libraries to be integrated into their codebases. Flutter required the Firebase

libraries and a few plug-ins including the “image_picker” plug-in for device camera

access, the “path_provider” plug-in for device filesystem access and the “intl” plug-in for

working with dates. The Cordova/Ionic project required the inclusion of 36 different

libraries. One reason for the large number of dependencies is that Apache Cordova does

not provide a user interface framework by itself and all device hardware interactions

require separate plug-ins. The Ionic framework was included in order to provide a user

interface suited for mobile applications. The Cordova project also required the inclusion

of a variety of Angular libraries, Cordova device libraries, Ionic libraries to wrap the

Cordova libraries, the Firebase libraries and the AngularFire2 library to provide Firebase

support using Angular. The Cordova/Ionic codebase was by far the most dependent on

external libraries than any of the other codebases. This could mean that projects built

with Cordova and Ionic could be difficult to maintain over time as changes to the

required libraries could result in “application breaking” behavior.

The source code evaluations provided information to help answer the question

of what impact the choice of development framework has on the development and

maintenance experience of a mobile application. Both the Flutter and Cordova/Ionic applications required fewer lines of code than writing separate native applications. The cross-platform applications also required the management of fewer files. The Flutter application required several more dependencies than the native applications, but the

Cordova/Ionic application required an extensive list of dependencies. It appears that, with the exception of the Cordova framework’s large number of external dependencies, the cross-platform frameworks simplified the development experience by requiring only a single, smaller codebase when compared to developing multiple native platform applications. This research question will be discussed further later in this chapter in the qualitative evaluation portion.

Application Characteristics and Performance Profiling

The purpose of the application performance profiling evaluations was to help

answer the question of what impact the choice of a development framework has on the

performance and perceived quality of the generated application. The source code

evaluation results suggest that the use of a cross-platform development framework can

significantly reduce the effort required to create and maintain software for multiple

platforms. Does that reduced effort come with a tradeoff in the quality of the deployed

application? There are a number of benchmarking tests that can be performed to judge the

performance of software. The tests performed in this experiment include a look at the

compiled package size of an application, the memory footprint of an installed application,

the RAM usage at two different points in the life of a running application and the time it

took for an application to launch and to transition between views. The results of these

tests will help to determine if there is a trade-off between development effort and application quality.

Application Package Size and Installed Footprint

Flutter and Cordova/Ionic reduced the amount of code it took to write

applications for both Android and iOS devices. The user managed codebases may be

smaller, but how much additional overhead is included in the deployed applications?

Flutter applications include the Flutter engine in order to render the user interfaces of the

applications. Cordova applications package the TypeScript, HTML and CSS application

code into a WebView that is embedded into a native application. Both frameworks

require the addition of libraries or plugins in order to manage interaction with the

filesystem or hardware of devices. All of this additional overhead could potentially

impact the size of the compiled applications. In general, users prefer applications that

have smaller packages as they take less time to download and require a smaller footprint

of device memory when installed [26].

Table 3 and Table 4 display package size and installed size of the deployed

Android and iOS versions of Task Station. A review of the data suggests that the

Cordova/Ionic applications had the smallest package sizes and the smallest installed

memory footprints. The web-based applications were significantly smaller than even the native platform applications. The Flutter applications had larger packages and

required more memory to install than both the Cordova/Ionic and native platform

applications. Cordova's overhead is not apparent in the results of this particular evaluation, but the Flutter SDK and the Flutter engine do have a noticeable

effect. The size of the iOS Native package appears to be an outlier and this may be due to

the large size of the project's included Firebase pod files. The default build settings for

Xcode do not appear to streamline the size of project archives for the type of build used in this experiment.

Table 3 – Android Application Package Size and Installed Size

Application             Package (MB)    Installed (MB)
Android Native          5.70            5.69
Android Flutter         16.9            41.29
Android Cordova/Ionic   1.40            2.79

Table 4 – iOS Application Package Size and Installed Size

Application             Package (MB)    Installed (MB)
iOS Native              78.90           66.20
iOS Flutter             22.00           74.00
iOS Cordova/Ionic       3.80            7.90
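To put these numbers on a common scale, the installed footprints can be expressed relative to the native baselines. The snippet below is an illustrative calculation over the values transcribed from Table 3 and Table 4; it is not part of the evaluation tooling used in this experiment.

```typescript
// Installed application sizes in MB, transcribed from Tables 3 and 4.
const installedMb = {
  androidNative: 5.69,
  androidFlutter: 41.29,
  androidCordova: 2.79,
  iosNative: 66.2,
  iosFlutter: 74.0,
  iosCordova: 7.9,
};

// Footprint relative to the native baseline, rounded to one decimal place.
function relativeFootprint(size: number, nativeSize: number): number {
  return Math.round((size / nativeSize) * 10) / 10;
}

const androidFlutterRatio = relativeFootprint(installedMb.androidFlutter, installedMb.androidNative); // 7.3x native
const androidCordovaRatio = relativeFootprint(installedMb.androidCordova, installedMb.androidNative); // 0.5x native
const iosFlutterRatio = relativeFootprint(installedMb.iosFlutter, installedMb.iosNative);             // 1.1x native

console.log(androidFlutterRatio, androidCordovaRatio, iosFlutterRatio);
```

On Android the Flutter footprint is roughly seven times the native baseline, while the Cordova/Ionic footprint is about half of it; on iOS the relative Flutter overhead is much smaller because the native baseline itself is large.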

Memory Usage Profiling

The package size and installation memory footprint evaluations provided

some information about the impact of a development framework on the characteristics of

deployed applications. Some measurement of the memory overhead of a cross-platform

framework could potentially be made through those evaluations but it could also be

measured in the amount of memory required during the life of a running application. In

order to attempt to measure the RAM usage of each type of application, measurements

were taken using platform specific profiling tools at two points during the life of the

application. The first point was after the application finished launching and the menu

page was available for user interaction. The second point was after the application

transitioned to the view tasks page and the list of tasks was loaded. The process of taking measurements was repeated a total of three times. These measurements provided more detail regarding the performance overhead of the Cordova/Ionic and Flutter applications.

The RAM usage results for the Android applications can be found in Table 5 and Table 6. The most obvious observation is that the Flutter and Cordova/Ionic applications had higher RAM usage than the native Android application. It is difficult to draw conclusions about a comparison between the RAM usage of the Flutter and

Cordova/Ionic applications themselves as the values varied each time the measurements were taken and no clear trend was found. The results for the menu page showed

Cordova/Ionic requiring slightly more memory while the view tasks page results showed the opposite.

Table 5 – Android Application Menu RAM Usage

Application             Trial #1 (MB)   Trial #2 (MB)   Trial #3 (MB)
Android Native          97              99              88
Android Flutter         150             143             139
Android Cordova/Ionic   165             147             147

Table 6 – Android Application View Tasks RAM Usage

Application             Trial #1 (MB)   Trial #2 (MB)   Trial #3 (MB)
Android Native          115             101             103
Android Flutter         176             174             180
Android Cordova/Ionic   164             175             155

The results of the iOS application RAM usage measurements are contained in

Table 7 and Table 8. The iOS results look very different from the Android results. The

Flutter prototype had the highest RAM usage for both the menu page and view tasks page measurements. During the time the menu page was active, the native iOS prototype had a lower RAM usage than the Cordova/Ionic prototype but the results were flipped when the view tasks page was active. It may be that the Cordova/Ionic prototype requires a higher baseline RAM level but then stays fairly consistent throughout the application’s lifetime while the memory usage of the other prototypes varied according to the workload.

Table 7 – iOS Application Menu RAM Usage

Application             Trial #1 (MB)   Trial #2 (MB)   Trial #3 (MB)
iOS Native              64              64              64
iOS Flutter             101             102             102
iOS Cordova/Ionic       78              89              89

Table 8 – iOS Application View Tasks RAM Usage

Application             Trial #1 (MB)   Trial #2 (MB)   Trial #3 (MB)
iOS Native              93              94              93
iOS Flutter             125             128             128
iOS Cordova/Ionic       71              77              75
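Since no single trial tells the whole story, averaging the three trials is a useful first step when comparing the prototypes. The sketch below is illustrative only, using the menu-page values transcribed from Table 7; it is not the profiling tooling used in this experiment.

```typescript
// RAM measurements in MB for the iOS menu page, transcribed from Table 7.
const apps = ["iOS Native", "iOS Flutter", "iOS Cordova/Ionic"];
const menuRamTrials = [
  [64, 64, 64],
  [101, 102, 102],
  [78, 89, 89],
];

// Mean of the trials, rounded to the nearest MB.
function meanMb(trials: number[]): number {
  const sum = trials.reduce((acc, t) => acc + t, 0);
  return Math.round(sum / trials.length);
}

for (let i = 0; i < apps.length; i++) {
  console.log(apps[i] + ": " + meanMb(menuRamTrials[i]) + " MB average");
}
```

By these averages the Flutter prototype used roughly 1.6 times the native iOS baseline on the menu page, with the Cordova/Ionic prototype in between.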

After reviewing the RAM usage data for the Android and iOS applications it

appears that the overhead of the cross-platform frameworks has a noticeable impact on

memory requirements. The results of this evaluation were consistent with the findings of

other previous studies in that the cross-platform applications had higher memory usage

while active [11], [6], [26]. The Flutter prototypes for both Android and iOS required the

highest amount of memory, and the Cordova/Ionic prototypes required more memory than the native applications with the exception of the view tasks page measurement for the iOS device.

Overall, it appears that the memory requirements for the Cordova/Ionic application vary slightly but are fairly consistent, while the Flutter application's requirements increase with higher workloads (such as more widgets to render or data to process). The Task Station prototypes are not data-intensive applications, so other types of applications may behave differently if the same evaluations were performed.

Application Start-up Time and View Transition Time

One of the most important aspects of an application in terms of perceived

quality from a user’s perspective is the responsiveness of the application. Does the

application load quickly? When the user interacts with the application's interface, does the

application respond right away? Mobile application users expect that an application will

respond to interaction immediately and perceive applications that do not meet that

expectation as poorer in quality [6].

Three different evaluations were performed in order to attempt to measure the

responsiveness of each version of Task Station. The first evaluation was a measurement

of the amount of time that elapsed between the point at which the user tapped the icon of

the application on their device and when the menu page of the application was fully

rendered. The second evaluation measured the time that elapsed between the user tapping

the add task button on the menu page and the add task page rendering fully. The last test

measured the time between the point where the user requested that the application

navigate back to the menu page and the point where the menu page was fully visible

again.

The application startup times for the Android applications can be found in

Table 9 and the transition times for Android can be found in Table 10. The Cordova/Ionic application had the longest startup time by a large margin. The Flutter application consistently launched slightly faster than the native Android application. In terms of page transition times, the Flutter version displayed the add task page faster than the native version and the Cordova/Ionic version, but only by a small margin. The Cordova/Ionic version had response times that were nearly identical to the native application. The backward navigation times are not displayed in a table because, for all of the Android applications, the response time was consistently around half a second. Overall, the Flutter Android version started up the quickest and navigated to the add task page in the shortest amount of time. The Cordova/Ionic Android version performed similarly to the native Android version during the navigation tests but took significantly longer to launch fully.

Table 9 – Android Application Startup Times

Application             Trial #1 (s)    Trial #2 (s)    Trial #3 (s)
Android Native          1.3             1.9             1.5
Android Flutter         1.2             1.0             1.1
Android Cordova/Ionic   3.3             3.9             3.3

Table 10 – Android Application View Transition Times

Application             Trial #1 (s)    Trial #2 (s)    Trial #3 (s)
Android Native          0.7             0.8             0.7
Android Flutter         0.5             0.4             0.5
Android Cordova/Ionic   0.6             0.6             0.7
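Averaging the three trials makes the startup gap plainer. The sketch below is an illustrative calculation over the values transcribed from Table 9; it is not part of the measurement tooling used in the experiment.

```typescript
// Startup times in seconds, transcribed from Table 9.
const apps = ["Android Native", "Android Flutter", "Android Cordova/Ionic"];
const startupTrials = [
  [1.3, 1.9, 1.5],
  [1.2, 1.0, 1.1],
  [3.3, 3.9, 3.3],
];

// Mean of the trials, rounded to one decimal place.
function meanSeconds(trials: number[]): number {
  const sum = trials.reduce((acc, t) => acc + t, 0);
  return Math.round((sum / trials.length) * 10) / 10;
}

for (let i = 0; i < apps.length; i++) {
  console.log(apps[i] + ": " + meanSeconds(startupTrials[i]) + " s average");
}
```

On these numbers the Cordova/Ionic average (3.5 s) is roughly three times the Flutter average (1.1 s), consistent with the discussion above.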

The iOS startup and transition times can be found in Table 11 and Table 12.

Again, the results showed that the Flutter version was the quickest to launch and to transition to the add task page. The Cordova/Ionic version was the slowest in terms of startup time, but it performed similarly to the native iOS version after startup. The backward navigation response times were nearly identical to those of the Android versions, at around half a second.

Table 11 – iOS Application Startup Times

Application             Trial #1 (s)    Trial #2 (s)    Trial #3 (s)
iOS Native              2.1             2.4             1.1
iOS Flutter             2.0             1.2             1.2
iOS Cordova/Ionic       3.4             2.9             3.1

Table 12 – iOS Application View Transition Times

Application             Trial #1 (s)    Trial #2 (s)    Trial #3 (s)
iOS Native              1.3             1.1             1.2
iOS Flutter             0.4             0.4             0.5
iOS Cordova/Ionic       1.3             1.1             1.0

The results of the startup, transition, and backward navigation tests were consistent between the Android and iOS device tests. The Flutter versions were the fastest and most responsive versions while the Cordova/Ionic versions were slow to start but responded comparably with the native versions after launching. The startup and response time behavior of the hybrid applications were consistent with prior research [6].

When the data for all of the application characteristics and performance profiling evaluations are taken into consideration it does not appear that there is

necessarily a response time penalty that corresponds with higher device memory or RAM requirements. The Flutter applications had the largest installed sizes and the highest memory requirements but also launched and transitioned between views faster than the native or Ionic/Cordova versions. The responsiveness of an application is a major factor in how a user perceives its quality, and the Flutter versions consistently performed better in this category. Performance is not the only factor that determines the quality of a mobile application; several qualitative areas, such as user interface elements and styling, will be explored in detail in the next section of this chapter.

Development Process Qualitative Differences

Both Flutter and Apache Cordova allow for the development of mobile applications for Android and iOS with a significant reduction in time and effort when compared to the strategy of developing separate native applications. A series of quantitative measurements of application characteristics and performance metrics were described earlier in this chapter. Those measurements provide part of the picture of the impact the choice of framework has on the development experience and the characteristics and quality of the deployed applications. There are a number of other qualities and differences between the two cross-platform frameworks that can have important impacts as well. These qualitative features of the frameworks will be discussed in this section.

User Interface Elements and Styling

The Android and iOS platforms each have their own user interface guidelines and design principles. Android's design philosophy is known as Material Design, while Apple's interface conventions are often referred to as the Cupertino style. The users of a particular platform may not know the particular details of the platform's design principles, but they will likely notice that most applications they use have a similar look and feel [19], [10],

[27].

One of the challenges of cross-platform development is being able to deploy software for multiple platforms that look and behave similarly to native platform applications. Another hurdle is that the widgets that make up the user interface of an application may not be available for all platforms and if there are similar widgets, they may function very differently. A key feature of a cross-platform development framework is the ability to create interfaces that are familiar to platform users while meeting the needs of the application. Ideally, the framework would also automatically tailor the interfaces to each platform without additional effort on the part of the developer.

The user interfaces for the native Android and iOS versions of Task Station can be found in Figure 6 and Figure 7. Some of the key design differences between the native platform applications include the layout of the navigation bar, the styling of buttons, the look and behavior of the “picker” or “select” widgets on the view all tasks page and the add/edit task page, the use of switches in iOS instead of checkboxes in

Android, and the functionality of the date picker widgets. Each platform also has its own default font family and text characteristics. The interfaces of the Android and iOS applications look very different but they both have the same overall functionality.


Fig. 6. Android Native Screenshots


Fig. 7. iOS Native Screenshots

The Flutter framework makes it easy to quickly design user interfaces and deploy applications for both Android and iOS but it does not fully attempt to match the look and feel of each platform. Figure 8 shows the visual layouts of the Android Flutter prototype and Figure 9 displays the layouts of the iOS Flutter prototype. For the most part, the Android and iOS versions both look very similar. One of the biggest differences is the layout of the navigation bar at the top of the application. The iOS version centers

the title of each page while the Android title is left-aligned. The appearance of the back arrow on the navigation bar is also different for each platform. Other minor differences include text characteristics, image sizes and padding amounts.

Fig. 8. Android Flutter Screenshots


Fig. 9. iOS Flutter Screenshots

Flutter primarily features Material Design widgets but it does have a growing catalog of Cupertino widgets. The widgets used in the layout tree can be from either design standard but they need to be specified in the source code. At the time this research was performed, the framework did not support specifying a generic widget type that would be replaced with the appropriate platform widget during the build process. Using the Material Design widgets results in applications that look and feel similar to Android applications with some minor exceptions like the difference between the native Android

embedded date picker and the Flutter date picker that is launched as an overlaying dialog.

Only Material Design and Flutter widgets were used in the development of the Flutter

Task Station prototypes.

The Ionic framework is used in conjunction with Apache Cordova in order to

allow for the design of web applications that have interface elements similar to native

mobile applications. The framework features a large number of widgets, each with

platform specific style versions. One nice feature of Ionic is that it is more flexible than

the Flutter framework when it comes to automatically tailoring platform widgets. During

the build process, Ionic assigns platform based CSS classes to the HTML interface

elements to provide an adaptable user interface. The visual layouts of the Android and

iOS Cordova/Ionic applications are shown in Figure 10 and Figure 11.

Similar to Flutter, the Ionic framework automatically adapts the layout of the

application’s navigation bar depending on the platform. Other platform differences include

text font families and styling, button styling, checkbox appearance, and margin and

padding values. Even though Ionic does more than Flutter by default to tailor layouts to

their deployed platforms, the Ionic widgets still appear to match the style of Android

applications more than iOS applications. The Ionic select widgets look like Android

select widgets but their selection dialogs behave differently. The Ionic date picker widget

launches a separate dialog similar to Flutter instead of using a widget that resembles the

iOS or Android native date pickers.


Fig. 10. Android Cordova/Ionic Screenshots


Fig. 11. iOS Cordova/Ionic Screenshots

Both frameworks provide some ability to tailor the interfaces of their applications to a target platform but neither fully delivers native look and feel for both

Android and iOS. In an ideal world, a cross-platform development framework would automatically switch out generic interface elements with the native equivalents. In reality, this would be a nearly impossible requirement as each platform has its own needs and layouts can be sensitive to minor changes. For the purposes of the development of the

Task Station prototypes, both Flutter and Ionic provided a selection of quality user interface elements and allowed for the ability to deploy the applications to both platforms using only one layout design.

User Interface Design

One of the most time-consuming aspects of mobile application development is designing and maintaining the user interface of the application. There are many important details that need to be managed, and a development framework must provide a user-friendly method for doing so. Google's Flutter and Apache Cordova with the Ionic framework handle user interface design very differently.

Flutter organizes both the layout of a page and its behaviors into the same

Dart class file. The interface is defined as a nested tree of widgets where every element of the interface, including the page itself, is a widget. Widgets are classified as either stateful or stateless. A stateful widget is re-rendered whenever the state of the page changes, while a stateless widget is rendered when the active page is initially drawn and does not change through the life of the page. An example of a Flutter widget tree is displayed in Figure 12.

The layout of a page in Flutter is defined in the build function of the page. The build function returns the nested tree that contains all of the widgets that make up the layout of the page. The example in Figure 12 shows the layout of the menu page of the Task

Station Flutter prototype. The parent widget of the tree is a Scaffold widget that contains a single Column. The Column hosts several Container widgets that wrap Button, Text or

Image widgets. Each type of widget has its own properties but they can also be wrapped

in Container or Padding widgets to add additional properties such as margin and padding information. Function calls can be used as elements of the tree to manage which widgets are displayed according to the current state of the page. Whenever data related to the page changes, the state of the page must be updated so that any widgets that are bound to the data can be re-rendered.

Fig. 12. Flutter widget tree example
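To make the tree structure concrete, a minimal build function can be sketched in Dart as follows. This is an illustrative reconstruction, not code taken from the Task Station prototype: the route names, asset path, and button labels are assumptions, and RaisedButton reflects the Material widget set available at the time of this research.

```dart
import 'package:flutter/material.dart';

// Schematic menu page: the entire layout is one nested widget tree
// returned from build(). All names here are hypothetical.
class MenuPage extends StatelessWidget {
  @override
  Widget build(BuildContext context) {
    return Scaffold(
      appBar: AppBar(title: Text('Task Station')),
      body: Column(
        children: <Widget>[
          // A Container wraps a child widget to add padding or margins.
          Container(
            padding: EdgeInsets.all(16.0),
            child: Image.asset('assets/logo.png'),
          ),
          RaisedButton(
            child: Text('Add Task'),
            onPressed: () => Navigator.pushNamed(context, '/addTask'),
          ),
          RaisedButton(
            child: Text('View Tasks'),
            onPressed: () => Navigator.pushNamed(context, '/viewTasks'),
          ),
        ],
      ),
    );
  }
}
```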

Flutter’s approach to interface design may not be intuitive for newer developers at first, but when paired with Flutter’s hot reload capabilities it is easy to experiment with the tree and instantly see an updated layout. As layouts get larger and more complex, it may become difficult to manage all of the elements of a page in an organized manner, but the use of custom widgets can help to provide structure.

Flutter widgets were designed with a focus on the concept of composition over inheritance; the use of custom widgets that are composed of other groups of widgets allows a developer to take a modular approach to interface design.

The Ionic framework takes a very different approach to user interface design.

The behavior of a page is written in a separate TypeScript file, the general layout of the page is defined in an HTML file, and the fine detail of the layout is described in a CSS file. The HTML layout of the menu page of the Cordova/Ionic prototype is displayed in

Figure 13.

Fig. 13. Ionic Framework HTML example

This user interface design process should look familiar to anyone with web

experience because designing an application interface using the Ionic framework is

exactly like designing the layout of a web application. The HTML file for the menu page is broken down into two sections. First, the layout for the page's navigation bar is defined in the header section of the page, and then the body content of the page is defined. All of the elements of the layout are defined using HTML tags, including some custom Ionic tags that allow developers to use widgets that were designed for use in mobile applications. Event handlers are assigned to HTML elements, and properties of the elements are bound to the data in the page through the use of Angular directives.
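As a hedged illustration of this split, the sketch below pairs a template string modeled on Figure 13 with a separate behavior class. All tags, handler names, and page names are hypothetical reconstructions, not code from the prototype; in a real Ionic 3-era project the template lives in its own .html file and Angular wires the (click) bindings to the class methods.

```typescript
// Illustrative Ionic-style template, held as a string for demonstration.
// The header section defines the navigation bar; the content section
// defines the body of the page.
const menuPageTemplate = `
<ion-header>
  <ion-navbar>
    <ion-title>Task Station</ion-title>
  </ion-navbar>
</ion-header>
<ion-content padding>
  <button ion-button block (click)="openAddTask()">Add Task</button>
  <button ion-button block (click)="openViewTasks()">View Tasks</button>
</ion-content>`;

// The behavior lives in a separate TypeScript class. The return values
// stand in for the pages that would be pushed onto the navigation stack.
class MenuPage {
  openAddTask(): string {
    return "AddTaskPage";
  }
  openViewTasks(): string {
    return "ViewTasksPage";
  }
}

console.log(new MenuPage().openAddTask());
```

This mirrors the separation described above: changing the layout means editing the template, while changing the behavior means editing the class, and neither edit touches the other file.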

The Cordova/Ionic approach to user interface design is cleaner than the Flutter approach in that it separates out the different aspects of the page. This can make it easier to change the behavior or look of a page without accidentally impacting another element of the page. One downside of this approach is that it results in more files to manage, especially in larger projects.

The choice of which framework provides a better user interface design

experience comes down to the technical background and programming language

preference of the developer. The Flutter approach may seem similar to the XML layout

system used to design interfaces for Android applications while the Cordova/Ionic

approach will appeal to developers that have experience in web design and web

programming. Both frameworks provide developers with the tools they need to easily

design attractive layouts for their mobile applications.

Learning Curve

One of the most important aspects of a development framework is the ability to quickly learn how to use the framework and start creating mobile software. The longer

it takes to learn a framework, the more it costs a development team in terms of human effort, salaries, and potential lost revenue. If a framework's documentation is difficult to understand or if the framework requires the use of multiple supporting technologies, it can be less attractive to developers. A framework with a higher learning curve can also make it difficult to find solutions to problems that arise during the design of applications. The development of the Task Station prototypes provided a lot of insight into the learning curves of the Cordova/Ionic framework and Flutter. An overview of the programming languages and common testing tools for each framework can be found in Table 13.

Table 13 – Languages and Testing Tools

Framework        Front-End Languages   Back-End Languages    Common Testing Tools
Native Android   XML                   Java / Kotlin         JUnit, Espresso
Native iOS       XML                   Objective-C / Swift   XCTest, XCUITest
Flutter          Dart                  Dart                  Dart Test Package, Flutter_Test Package, Flutter WidgetTest
Cordova/Ionic    HTML, CSS             TypeScript, Angular   Jasmine and Karma

The Ionic framework requires the use and knowledge of a variety of

technologies. A developer is required to know multiple programming and markup languages and frameworks, including TypeScript, Angular, HTML, and CSS. In addition to these

languages, it is also important to have an understanding of the user interface elements that Ionic provides and the use of Ionic and Cordova plugins. In order to learn everything required to use the Ionic framework, a developer needs to consult documentation maintained by numerous different organizations. The supporting technologies that Ionic relies on are developed and maintained separately from Ionic and changes to these technologies can have impacts on Ionic applications. This can make it difficult to track down the underlying cause of bugs or errors.

The learning curve of the Flutter SDK is much smaller than that of the Ionic framework. The only programming language required for Flutter is Google's Dart. The

plugins for Flutter can be found in a single repository and do not require the use of

wrapper plugins such as the ones that are found in the Ionic framework. The

documentation for both the user interface elements of Flutter and the development

features of the SDK are found on the same website. The fact that Flutter relies on a single

programming language, requires fewer external technologies, and is developed alongside

other Google products means that it is far simpler to get started and to find solutions to

problems. Flutter was still a very young framework at the time this paper was written but

during the development of the Task Station prototypes it was not difficult to quickly find

answers to issues that arose. Unlike Ionic applications, which are embedded in WebViews, Flutter applications have access to platform device features comparable to that of native applications. This means that developers with a

background in Android and/or iOS programming will likely find Flutter easier to learn

and use.

Debugging Strategies and Tools

Every software project will inevitably contain bugs and programming errors.

Because of this fact, debugging is an extremely important part of the software design and

maintenance processes. The Ionic framework with Cordova and the Flutter SDK provide mobile application developers with different strategies for debugging their projects.

Applications developed with Apache Cordova are essentially web sites that are embedded in native platform software. The use of native debugging tools does not provide the ability to fully examine the data and behavior of an application. The native mobile application itself is only responsible for launching the embedded web application and providing a bridge to the device’s operating system features. In order to examine and debug the embedded application, the use of the platform’s web browser tools is required.

For Android applications, developers can use Google Chrome’s developer tools to connect to the device’s browser and inspect the elements and source code of the web application. Similarly, iOS applications require the use of Safari’s developer tools. The added layer of embedding an application within an application makes it more difficult to quickly find the cause of bugs and programming issues.

Software developed with the use of the Flutter SDK can be debugged using the Dart Observatory tool included with the framework. The Flutter debugging tools provide features comparable to those provided by the Android Studio and Xcode debugging tools. The Dart Observatory allows the inspection and statement-level debugging of the full lifespan of the application, in contrast to the browser-based debugging tools used for Ionic/Cordova applications. This not only reduces the number of tools that a developer is required to learn but also means that the same tool can

be used to track down any issues that arise while developing a project, and the tool was specifically designed for debugging Dart-based applications. Overall, the debugging tools of Flutter make it easier to create and maintain mobile applications over their lifespan than using the Ionic framework with Apache Cordova.

Live Reload Behavior

During the process of debugging or testing an application, it is often necessary

to make changes to layouts or source code and then determine if the changes had the

desired effect. This process can be tedious if the application has to be stopped, recompiled, and started again, as build times can vary and the state of the application is lost and has to be recreated. One of the newer features of mobile development tools is the

ability to live reload changes to the appearance or behavior of an application without

losing the application’s state or having to recompile the project. This feature is available

in both the Flutter framework tools and the Ionic framework with Cordova, but the

behavior of this feature is not fully consistent between the two.

One of the most impressive aspects of Flutter is the framework’s hot reload

feature. Hot reload allows the developer to make changes to the Dart source code of an

application and instantly see the effects in the running program. This feature makes

building layouts much easier as every change in the widget tree is instantly visible on the

device. Hot reload is possible because debug builds of Flutter applications are run on

devices through the use of a Dart Virtual Machine. After a hot reload is requested, the

source code changes are analyzed and injected into the virtual machine and the changes

are implemented in most cases without requiring the application to be restarted. Flutter’s

hot reload capability can drastically shorten the development and testing processes of mobile software.

The Ionic framework also has a live reload feature but it has a number of

limitations. When an Ionic application is built and deployed, the web source files are

embedded into a native application and packaged for the different platforms. The Ionic

command line interface allows for an application to be run with the use of a live reload

flag. When the application is run in live reload mode, the web source files are hosted on a

local web server on the development machine and the deployed application uses the

development server as a host for source code instead of an embedded web application.

The use of a separate development server introduces some complications into the live

reload process. Many of the Ionic native plugins can be used in live reload mode but

some device features are not available and that can introduce bugs and errors that are

difficult to track down. The external server can also introduce cross-site scripting and file

permission issues. These complications mean that applications that rely on certain

platform device features such as filesystem use or Google Authentication are not fully

supported in live reload mode. In addition to these issues, only changes to the CSS

properties of a page can be implemented without restarting the application. Changes to

the TypeScript or HTML files will cause the development server to restart the program

and destroy the state of the application.

Flutter has live reload capabilities that are much more user-friendly than the

Ionic framework. Flutter’s tools even surpass the live reload tools available as part of the

native Android and iOS development frameworks. The ability to quickly make changes

while testing mobile software can have a huge impact on the overall development

experience, and the Flutter framework provides this ability in a way that makes it stand out among other development frameworks.

Device Filesystem and Hardware Access

Many mobile applications, unless they are very simple, require access to

device features such as the filesystem, the camera, network connectivity, GPS or the

accelerometer. The native platform SDKs provide full access to these features. If a

developer wants to target multiple platforms through the use of a cross-platform

development framework, the framework must provide a simple application programming

interface (API) to use those features on all target devices.

Task Station required the use of the device camera, the device’s network

connection and access to the local filesystem. Both Cordova/Ionic and Flutter provided

access to these features through the use of plugins. The plugins allow for a developer to

write code related to device hardware once, knowing that the appropriate platform code

will be included during the compilation process. For the most part, the behavior of the

device feature plugins was comparable for both frameworks but the Ionic framework did

have a few additional complications.

Ionic provides a user interface framework for hybrid applications and uses

Apache Cordova to compile and package the applications. Cordova handles the communication between the embedded web application and the device's features. It is possible to include only the direct Cordova plugins, but they are not very user-friendly and expose a lot of low-level detail. Ionic provides a series of plugin wrappers that simplify

the use of device features. The use of wrappers makes access to hardware features much easier, but they increase the overhead of the project.

In addition to the increased number of dependencies, access to the local filesystem is inconsistent across the different elements of the web application. Files in the application’s private local storage can be referenced in the TypeScript code, but directly referencing those files in the HTML layouts (for instance, as an image source) can result in file permission errors.

Both Flutter and Ionic/Cordova provided APIs for all of the device features

that were required for this experiment, but Flutter provided a simpler development

experience. Flutter did not require as many additional plugins as the Ionic framework and

it was only necessary to read one set of documentation to use the plugins. Because the

Flutter applications are compiled into fully native applications, there were fewer

complications in contrast to the use of embedded web applications. Overall, Flutter’s

access to device features was smoother and featured fewer stumbling blocks.

Google Firebase Support

The Google Firebase suite [32] provides a number of useful tools for mobile

application developers. As of May 2018, around 1.2 million applications were using the

Firebase SDK [33]. Task Station relies on Cloud Firestore to store the task information

and automatically sync between local data and cloud data. The project also uses Cloud

Storage to store the images associated with tasks. The Firebase Authentication module

provides a drop-in user authentication interface to allow users to sign into the application

on a variety of devices. The Firebase modules fully support the native Android and iOS

SDKs, but the same level of support is not available for the Flutter and Cordova/Ionic frameworks.

Flutter is developed by Google, which also maintains the Firebase suite. Native device support is available through first-party plugins

found in the Flutter Packages repository. All of the plugins were in early stages of development at the time this experiment was completed, and they did not provide complete support for all of the available Firebase features. Even though they were still in

the early beta stage, the plugins provided similar support for all of the features that the

native platform applications used.

The Ionic framework uses Cordova to package web applications into

WebViews embedded in native platform mobile applications. Firebase provides a

JavaScript based API for use in web applications that can be used in Ionic applications.

This API provides support for many of the features included in the native Android and

iOS APIs but lacks some key features. The AngularFire2 library was used in the development of the Cordova prototype to provide fuller Firebase support. The

Firebase web API supports Firebase Authentication, but it relies on redirecting the web application to login pages. This behavior is not possible in a Cordova application, so a separate Google authentication plugin was required to provide authentication tokens to the Firebase web API. In addition to these issues, the web API also lacks support for the offline data persistence feature of Cloud Firestore.
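The token-based sign-in workaround described above can be sketched as follows. The plugin and Firebase objects here are simplified stand-ins invented for this sketch, not the real SDK interfaces; only the shape of the flow, a native plugin supplying a token that is then exchanged through the web API, is intended to match.

```typescript
// Sketch of the sign-in workaround: a native Google authentication
// plugin supplies an ID token, which is then exchanged with the Firebase
// web API for a signed-in user, avoiding the redirect-based flow that a
// Cordova WebView cannot perform. Both objects are hypothetical stand-ins.
interface GoogleAuthPlugin {
  signIn(): Promise<{ idToken: string }>;
}

interface FirebaseWebAuth {
  signInWithCredential(credential: {
    providerId: string;
    idToken: string;
  }): Promise<{ uid: string }>;
}

const googleAuthPlugin: GoogleAuthPlugin = {
  signIn: async () => ({ idToken: "stand-in-id-token" }),
};

const firebaseAuth: FirebaseWebAuth = {
  signInWithCredential: async ({ idToken }) => ({ uid: `user-for-${idToken}` }),
};

async function signInThroughPlugin(): Promise<string> {
  // 1. The native plugin obtains a Google ID token without any redirect.
  const { idToken } = await googleAuthPlugin.signIn();
  // 2. The token is handed to the Firebase web API as a credential.
  const user = await firebaseAuth.signInWithCredential({
    providerId: "google.com",
    idToken,
  });
  return user.uid;
}

signInThroughPlugin().then((uid) => console.log(uid));
```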

Firebase is not a strictly necessary tool for mobile development, but it does give developers a quick way to set up data storage models, user authentication and analytics. Overall, Flutter provided the ability to use the same features that were available in the native platform APIs. Because Firebase and Flutter are both products from the same company, it is likely that Flutter’s Firebase support will continue to improve over time. The Ionic framework required additional plugins and workarounds to implement behavior consistent with the other Task Station prototypes. Ionic was also not able to provide full support for offline data persistence, so the Cordova/Ionic applications cannot be used fully without a reliable network connection.

Licensing and Cost

The marketplace of mobile development frameworks offers dozens of options. Some frameworks are available at no cost and have their source code available in public repositories. Other frameworks require paid licenses and are built using proprietary technologies. The ideal development framework needs to fit the

requirements and resources of a development team. The two frameworks that were

evaluated as part of this experiment are both open source projects that are available to use

for free.

Ionic has an additional set of tools, called Ionic Pro, for which developers can purchase monthly access. Ionic Pro features tools that allow for visual user interface design, live application deployment and cloud-based project builds. In a similar fashion, Apache Cordova has a paid product known as PhoneGap Build that allows developers to upload their web application source files and build native platform packages in the cloud. Only the free versions of the Ionic framework and Apache

Cordova were used in the development of the Task Station prototypes.

Application Splash Screens and Icon Support

Two aspects of any mobile application that are key to first impressions are its icons and splash screen. There are multiple types of

mobile devices and a variety of sizes and configurations to support. This can make the

process of managing these elements a challenge. Flutter and Cordova/Ionic handle the

design and management of the splash screen and icons in different ways.

Flutter provides little direct support for managing the icons or

splash screens of a project. When a platform is added to a project (such as Android or

iOS), the Flutter SDK adds a folder of platform specific files to the project. The

management of the splash screen and icons of a Flutter project is handled the same way

as using the native platform SDKs. The developer is responsible for separately

configuring them for each platform in the project’s platform folders.

By contrast, the Ionic framework provides a command line tool that makes handling icons and splash screens easy. If the project contains an icon image file and a

splash screen logo image file in the project’s resources folder, the “ionic resources”

command will automatically generate all of the required image files for the platforms

defined in the project. This saves the developer from manually exporting icons and logos at various sizes. The Ionic Native Splash Screen plugin handles

displaying and hiding the application’s splash screen by default.
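The repetitive work that the “ionic resources” command automates can be sketched as expanding one source image into a manifest of per-platform targets. The density buckets below are illustrative values chosen for this sketch, not the exact size sets the Ionic CLI generates.

```typescript
// Sketch of the work "ionic resources" automates: expanding one source
// icon into every required platform size. The size lists are
// illustrative examples only, not the CLI's actual output sets.
const iconSizes: Record<string, number[]> = {
  android: [36, 48, 72, 96, 144, 192], // ldpi through xxxhdpi (illustrative)
  ios: [29, 40, 58, 76, 120, 180],     // a sample of iOS icon sizes (illustrative)
};

interface ResourceTarget {
  platform: string;
  size: number;
}

// Builds the list of icon files a build would need for the given platforms.
function resourceManifest(platforms: string[]): ResourceTarget[] {
  return platforms.flatMap((platform) =>
    (iconSizes[platform] ?? []).map((size) => ({ platform, size }))
  );
}

console.log(resourceManifest(["android", "ios"]).length); // 12 targets
```

Generating each of these files by hand for every platform is exactly the chore the command removes.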

Application Compilation and Deployment

The ability to write, debug and run mobile software is a crucial part of any

development framework but the end goal of any project is to deploy the application to

platform devices and marketplaces. The ideal framework must make it easy to produce platform application packages. Luckily, both Flutter and the Ionic framework with Apache Cordova make this process fairly painless.

Both frameworks rely on the native Android and iOS SDKs to package and sign applications for deployment. For cross-platform projects it is necessary to configure the platform-specific project files and to set up application signing profiles or keystores. This process is identical for both frameworks. After the appropriate signing provisions are in place and the native platform manifests are properly configured, the

Flutter and Ionic command line interfaces or IDE plugins can manage application package deployment with simple commands.

Summary

The wish list of any mobile application developer includes the ability to write software for multiple platforms with a single codebase and the ability to easily package and deploy that software. Apache Cordova and Flutter are both able to meet these wishes and dramatically reduce the effort required to support both the Android and iOS platforms. The use of either cross-platform framework required less code to be written, fewer files to manage and less development time than producing separate native platform applications. Even so, each framework provided a different development experience and the generated applications had a number of different characteristics.

While both frameworks simplified the process of developing cross-platform applications, Flutter provided a better overall development experience. The learning curve of Flutter was shallower than that of the Ionic framework with Cordova. There

are fewer technologies to learn and the documentation can be found in one place. Fewer required external dependencies result in projects that are simpler to manage and maintain. Flutter’s hot reload feature allows for rapid user interface design, and the availability of first-party plugins makes integrating Firebase into a project fairly painless. All of these features made the overall development experience of the Flutter prototypes easier and more enjoyable compared to the Ionic development experience.

Flutter produced applications that were larger in size and required more

memory during their active lifetime when compared to native applications but they were

quicker to respond to user interaction than the Cordova/Ionic and native platform

applications. The Ionic prototypes had smaller memory footprints and took longer to

launch than the other Task Station prototypes but had similar response times to the native

platform applications after startup.

Both frameworks featured a comparable set of interface elements and they

both at least partially tailor deployed applications to their target platforms. Neither

framework produced applications that were indistinguishable from native platform applications, but the quality of the applications was impressive when the significant

reduction in development effort is taken into consideration.


CHAPTER V

SUMMARY AND RECOMMENDATIONS

Summary

This paper details an evaluation of the mobile development frameworks

Flutter and Apache Cordova with the Ionic framework and their impacts on both the development process and the characteristics of the applications deployed from the frameworks. The design of Task Station, a task management application prototype, was described along with the development of deployed prototypes using the native Android

SDK, the native iOS SDK and the Flutter and Cordova/Ionic frameworks. A series of evaluations were performed on the source code of each prototype and the compiled applications. Qualitative features and differences in the development experiences of the prototypes were also described.

Overall, Google’s Flutter provided a better development experience than the production of the Cordova prototype and the combined production of the native platform prototypes. Both the Flutter and Cordova prototypes required significantly fewer lines of code than the combined native prototypes and fewer user-managed files to maintain. The Flutter prototype required more device memory to install and used more memory while running, but the application started more quickly than the native or Cordova prototypes and navigated between pages faster. The development of the Flutter prototype was significantly faster than that of the Cordova prototype due to the better available documentation, fewer technologies to learn and manage, and better hardware and Firebase integration through the use of first-party plugins.

Flutter allowed the production of Android and iOS applications with a single codebase and minimal additional configuration, similar to Cordova. The deployed Flutter applications were highly responsive to user interaction and featured high quality widgets for the user interfaces of the pages of the applications. The inclusion of the Flutter engine had a measurable impact on the size of the installed application and the RAM requirements of the application but did not seem to have a negative impact on the responsiveness of the application. It is possible that as Flutter matures it will become more efficient in terms of memory requirements. Flutter was still a young technology at the time of this research but it appears that its future is highly promising.

Through the use of cross-platform development frameworks it is possible to develop and deploy mobile software to the Android and iOS platforms. While both frameworks evaluated in this thesis reduced the total development effort required, neither framework is a perfect solution. It is necessary for developers to evaluate the options and tools available to them and determine which tools best meet the needs of their particular projects and allow them to leverage their programming backgrounds to quickly produce quality software.

Limitations and Recommendations for Future Research

As with any research project, there are a number of limitations to the findings in this thesis. More research could be done to expand upon the findings in order to answer the research questions in more detail and with more certainty. In this section, several of

these limitations will be detailed along with recommendations for areas of future research.

Only one type of application, a data-driven task management application, was

developed and evaluated in this experiment. There are a number of different types of

applications and the requirements for each type may make them better suited for different

development strategies and tools [6]. One area for future research would be to expand the

experiment to include a variety of application types. A game or other application that features animations, heavy graphics use, or intensive processor or network workloads, for example, would likely benefit from a different development strategy.

The source code evaluations in this paper consisted only of counting lines of code, the number of files, and external dependencies. There are additional source code metrics that could be included in future evaluations. For example, the cyclomatic complexity of the methods in an application can help to predict the difficulty of testing the application. The number of linearly independent paths in a section of source code helps to determine the number of unit tests required to reach satisfactory code coverage. The lines-of-code metric is fairly superficial, but it has been found to be correlated with complexity [29]. Further evaluation of the source code of each prototype could help to provide additional information about the impact each framework has on development and maintenance effort.
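The complexity metric mentioned above can be illustrated with a rough sketch: McCabe cyclomatic complexity is one plus the number of decision points in a piece of code. Production tools compute it from a parsed abstract syntax tree; the keyword-counting function below is only an approximation for illustration.

```typescript
// Rough sketch of McCabe cyclomatic complexity: one plus the number of
// decision points in a piece of code. Real tools work on a parsed AST;
// keyword counting over raw source is only an approximation.
function estimateCyclomaticComplexity(source: string): number {
  // Each branch keyword or short-circuit operator adds one decision point.
  const decisions = source.match(/\b(if|for|while|case|catch)\b|&&|\|\||\?/g);
  return 1 + (decisions ? decisions.length : 0);
}

const sample = `
  if (a && b) { doX(); }
  for (const task of tasks) { if (task.done) count++; }
`;
console.log(estimateCyclomaticComplexity(sample)); // 5 independent paths
```

A method with complexity five would need at least five unit tests to cover every linearly independent path, which is why the metric predicts testing effort.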

As Dhillon and Mahmoud [7] point out in their work, benchmarking and evaluating applications written for multiple platforms using multiple frameworks is difficult due to the fact that a consistent evaluation strategy may not be available. The

research in this paper is limited by the fact that each prototype had varying access to the application lifecycle and the native logging systems. Android and iOS also run on different hardware, so it is impossible to evaluate the different prototypes consistently across the platforms. Due to this, it is important not to compare the results of the quantitative evaluations across the Android and iOS platforms.

The four Task Station prototypes were each tested on only one device per platform, so the findings of this research may apply only to applications deployed to the particular devices used in the experiment. In the future, similar research could be done with evaluations performed on applications deployed not only to multiple devices of the same type but also to multiple different types of devices such as tablets, smart watches, or TV-connected devices. The variety of devices would provide further insight into the performance characteristics of the deployed applications as well as the ability of each cross-platform development framework to overcome the internal fragmentation of the individual platforms.

Any time developers have to duplicate their efforts to produce multiple versions of the same application, inconsistencies between the applications will likely be introduced [18]. This was true during the development of the four Task Station

prototypes. Each prototype has minor variations in the user interfaces of the pages of the

application and in the implementation of the different features of the application. Some of

these variations are due to differences in the strategies used to produce the prototypes but

some are also due to the complexity of managing four prototypes at once and the

difficulty of keeping every detail perfectly consistent. It is possible that these differences

could have had an impact on the findings in this paper.

The process of developing multiple application prototypes is effort- and time-intensive. The scope of this thesis was limited to evaluating only Flutter and Apache Cordova with the Ionic framework in order to keep the project manageable for a single developer. Dozens of different cross-platform development frameworks are available on the market, and they continue to evolve over time. More frameworks could be evaluated in a similar fashion to broaden the findings of this paper, or similar research could be performed again using the same frameworks in the future to see how the frameworks have changed and whether the limitations of a framework have been addressed or resolved.

Google’s Flutter in particular was still in beta during the time this research was undertaken and it is possible that the eventual release version of the framework could differ significantly from the version evaluated here.

Unit testing is a very important part of developing maintainable software.

Without automated unit testing, manual testing must be performed, which can be effort-intensive and can lead to overlooked bugs and errors. The unit testing capabilities of each development strategy were not evaluated as part of this thesis. Future research could explore the unit testing capabilities and limitations of each framework.

The fields of mobile application development and cross-platform development frameworks are rich with opportunities for more research due to the complexity of the fields and the current and future market potential for mobile device software. The ability to deploy an application to multiple types of devices across multiple platforms from a single codebase that can be easily maintained remains the ideal for any mobile developer.

The search for a development framework that allows for a reduction in development

effort and time, provides a smooth development experience, and deploys high quality applications will continue as the mobile software market continues to expand and evolve.

REFERENCES

[1] “Number of Smartphone Users Worldwide 2014-2020,” Statista. [Online]. Available: https://www.statista.com/statistics/330695/number-of-smartphone-users-worldwide/. [Accessed: 10-Mar-2018].

[2] “Annual Number of Downloads Worldwide 2021 | Statistic,” Statista. [Online]. Available: https://www.statista.com/statistics/271644/worldwide-free-and-paid-mobile-app-store-downloads/. [Accessed: 10-Mar-2018].

[3] “Mobile OS Market Share 2017,” Statista. [Online]. Available: https://www.statista.com/statistics/266136/global-market-share-held-by-smartphone-operating-systems/. [Accessed: 28-Feb-2018].

[4] M. E. Joorabchi, A. Mesbah, and P. Kruchten, “Real Challenges in Mobile App Development,” in 2013 ACM/IEEE International Symposium on Empirical Software Engineering and Measurement, 2013, pp. 15–24.

[5] W. S. El-Kassas, B. A. Abdullah, A. H. Yousef, and A. M. Wahba, “Taxonomy of Cross-Platform Mobile Applications Development Approaches,” Ain Shams Eng. J., vol. 8, no. 2, pp. 163–190, Jun. 2017.

[6] M. Willocx, J. Vossaert, and V. Naessens, “Comparing Performance Parameters of Mobile App Development Strategies,” in Proceedings of the International Conference on Mobile Software Engineering and Systems, New York, NY, USA, 2016, pp. 38–47.

[7] S. Dhillon and Q. H. Mahmoud, “An Evaluation Framework for Cross-Platform Mobile Application Development Tools,” Softw. Pract. Exp., vol. 45, no. 10, pp. 1331–1357, Oct. 2015.

[8] “Apache Cordova Documentation.” [Online]. Available: https://cordova.apache.org/docs/en/latest/. [Accessed: 16-Sep-2018].

[9] V. Ahti, S. Hyrynsalmi, and O. Nevalainen, “An Evaluation Framework for Cross-Platform Mobile App Development Tools: A Case Analysis of Adobe PhoneGap Framework,” in Proceedings of the 17th International Conference on Computer Systems and Technologies 2016, New York, NY, USA, 2016, pp. 41–48.

[10] H. Heitkötter, S. Hanschke, and T. A. Majchrzak, “Evaluating Cross-Platform Development Approaches for Mobile Applications,” in International Conference on Web Information Systems and Technologies, 2012, pp. 120–138.

[11] H. J. Kim, S. Karunaratne, H. Regenbrecht, I. Warren, and B. C. Wunsche, “Evaluation of Cross-Platform Development Tools for Patient Self-Reporting on Mobile Devices,” in Proc. 8th Australasian Workshop Health Inform. Knowl. Manag., 2015, pp. 55–61.

[12] R. Francese, C. Gravino, M. Risi, G. Scanniello, and G. Tortora, “Mobile App Development and Management: Results from a Qualitative Investigation,” in Proceedings of the 4th International Conference on Mobile Software Engineering and Systems, Piscataway, NJ, USA, 2017, pp. 133–143.

[13] “Ionic Framework Documentation.” [Online]. Available: https://ionicframework.com/docs/. [Accessed: 16-Sep-2018].

[14] “Flutter Documentation.” [Online]. Available: https://flutter.io/docs/. [Accessed: 16-Sep-2018].

[15] A. Aldayel and K. Alnafjan, “Challenges and Best Practices for Mobile Application Development: Review Paper,” in Proceedings of the International Conference on Compute and Data Analysis, New York, NY, USA, 2017, pp. 41–48.

[16] M. Nagappan and E. Shihab, “Future Trends in Software Engineering Research for Mobile Apps,” in 2016 IEEE 23rd International Conference on Software Analysis, Evolution, and Reengineering (SANER), 2016, vol. 5, pp. 21–32.

[17] M. Palmieri, I. Singh, and A. Cicchetti, “Comparison of Cross-Platform Mobile Development Tools,” in 2012 16th International Conference on Intelligence in Next Generation Networks, 2012, pp. 179–186.

[18] M. E. Joorabchi, M. Ali, and A. Mesbah, “Detecting Inconsistencies in Multi-Platform Mobile Apps,” in 2015 IEEE 26th International Symposium on Software Reliability Engineering (ISSRE), 2015, pp. 450–460.

[19] S. Xanthopoulos and S. Xinogalos, “A Comparative Analysis of Cross-Platform Development Approaches for Mobile Applications,” in Proceedings of the 6th Balkan Conference in Informatics, New York, NY, USA, 2013, pp. 213–220.

[20] M. Ciman, O. Gaggi, and N. Gonzo, “Cross-Platform Mobile Development: A Study on Apps with Animations,” in Proceedings of the 29th Annual ACM Symposium on Applied Computing, New York, NY, USA, 2014, pp. 757–759.

[21] T. A. Majchrzak, A. Biørn-Hansen, and T.-M. Grønli, “Progressive Web Apps: the Definite Approach to Cross-Platform Development?,” 2018.

[22] I. Malavolta, S. Ruberto, T. Soru, and V. Terragni, “End Users’ Perception of Hybrid Mobile Apps in the Google Play Store,” in 2015 IEEE International Conference on Mobile Services, 2015, pp. 25–32.

[23] I. T. Mercado, N. Munaiah, and A. Meneely, “The Impact of Cross-Platform Development Approaches for Mobile Applications from the User’s Perspective,” in Proceedings of the International Workshop on App Market Analytics, New York, NY, USA, 2016, pp. 43–49.

[24] A. Puder and O. Antebi, “Cross-Compiling Android Applications to iOS and Windows Phone 7,” Mob. Netw. Appl., vol. 18, no. 1, pp. 3–21, Feb. 2013.

[25] W. S. El-Kassas, B. A. Abdullah, A. H. Yousef, and A. M. Wahba, “Enhanced Code Conversion Approach for the Integrated Cross-Platform Mobile Development (ICPMD),” IEEE Trans. Softw. Eng., vol. 42, no. 11, pp. 1036–1053, Nov. 2016.

[26] J. Ohrt and V. Turau, “Cross-Platform Development Tools for Smartphone Applications,” Computer, vol. 45, no. 9, pp. 72–79, Sep. 2012.

[27] T. Majchrzak and T.-M. Grønli, “Comprehensive Analysis of Innovative Cross-Platform App Development Frameworks,” in Proceedings of the 50th Hawaii International Conference on System Sciences, 2017.

[28] M. D. Syer, B. Adams, Y. Zou, and A. E. Hassan, “Exploring the Development of Micro-apps: A Case Study on the BlackBerry and Android Platforms,” in 2011 IEEE 11th International Working Conference on Source Code Analysis and Manipulation, 2011, pp. 55–64.

[29] R. K. Lind and K. Vairavan, “An Experimental Investigation of Software Metrics and their Relationship to Software Development Effort,” IEEE Trans. Softw. Eng., vol. 15, no. 5, pp. 649–653, May 1989.

[30] V. Bauer, L. Heinemann, and F. Deissenboeck, “A Structured Approach to Assess Third-Party Library Usage,” in 2012 28th IEEE International Conference on Software Maintenance (ICSM), 2012, pp. 483–492.

[31] M. Willocx, J. Vossaert, and V. Naessens, “Security Analysis of Cordova Applications in Google Play,” in Proceedings of the 12th International Conference on Availability, Reliability and Security, New York, NY, USA, 2017, pp. 46:1–46:7.

[32] “Firebase Documentation,” Firebase. [Online]. Available: https://firebase.google.com/docs/. [Accessed: 17-Oct-2018].

[33] “What’s New in Firebase at I/O 2018,” The Firebase Blog. [Online]. Available: http://firebase.googleblog.com/2018/05/whats-new-in-firebase-at-io-2018.html. [Accessed: 17-Oct-2018].
