Fakultät für Informatik

CSR-21-02

Adaptive User Interface for Automotive Demonstrator

Hasan Aljzaere · Owes Khan · Wolfram Hardt

Juli 2021

Chemnitzer Informatik-Berichte

Adaptive User Interface for Automotive Demonstrator

Master Thesis

Submitted in Fulfilment of the Requirements for the Academic Degree M.Sc.

Dept. of Computer Science
Chair of Computer Engineering

Submitted by: Hasan Aljzaere
Student ID: 322205
Date: 22.12.1988

Supervising tutor: Prof. Dr. Dr. h. c. W. Hardt
M.Sc. Owes Khan

1

Abstract

The BlackPearl in the Computer Engineering Department is an Automotive Demonstrator with a variety of sensors, which users can control via the server. The server is responsible for the remote interaction, while the Smart Queue and the Raspberry Pi display handle the human interaction.

The Automotive Demonstrator consists of four components, which are installed on the CE-Box: Main QML Application, Main Server, Live Stream, and Smart Queue. These run on three single-board computers (Raspberry Pi 3B): the Main, BlackPearl, and Camera servers. The Automotive Demonstrator is built with the latest versions of both Qt and NodeJS, and the components can access, store, and exchange data in JSON format. The BlackPearl is controlled via four types of interaction methods: Web server, Voice commands (Sparrow), Pi Display, and Gamepad.

The outcome of this thesis is a configurable and adaptive User Interface for the Automotive Demonstrator, which can easily be updated, customized, and accessed by new applications without updating or rebuilding the program.

Keywords: User Interaction, Adaptive UI, Configurable UI, Smart Queue, Raspberry Pi, Qt/QML, NodeJS

2

Zusammenfassung

Der BlackPearl ist im Fachbereich Technische Informatik ein Automotive Demonstrator, der über eine Vielzahl von Sensoren verfügt, die der Benutzer über einen Server steuern kann. Der Server ist für die Ferninteraktion zuständig und die Smart Queue sowie der Raspberry Pi Display für die menschliche Interaktion.

Der Automotive Demonstrator besteht aus vier Komponenten, die auf der CE-Box installiert sind: Haupt-QML-Anwendung, Hauptserver, Live-Stream und Smart Queue. Alle diese Server laufen auf drei Einplatinencomputern (Raspberry Pi 3B): Haupt-, BlackPearl- und Kamera-Server. Der Automotive Demonstrator ist mit der neuesten Version von sowohl Qt als auch NodeJS ausgestattet, und die Komponenten können auf die Daten im JSON-Format zugreifen, speichern und austauschen. Der BlackPearl wird über vier Arten von Interaktionsmethoden gesteuert: Webserver, Sprachbefehle (Sparrow), Pi-Display und Gamepad.

Das Ergebnis dieser Arbeit ist eine konfigurierbare und anpassungsfähige Benutzeroberfläche für den Automotive Demonstrator, die leicht aktualisiert, angepasst und für neue Anwendungen zugänglich gemacht werden kann, ohne dass das Programm aktualisiert oder neu erstellt werden muss.

Keywords: User Interaction, Adaptive UI, Configurable UI, Smart Queue, Raspberry Pi, Qt/QML, NodeJS

3

Content

Abstract ...... 2

Zusammenfassung ...... 3

Content ...... 4

List of Figures ...... 7

List of Tables ...... 9

List of Code Snippets ...... 10

List of Abbreviations ...... 11

1 Introduction ...... 12

1.1 Motivation ...... 12

1.2 Technical Background ...... 13

1.2.1 YellowCar ...... 13

1.2.2 BlackPearl ...... 15

1.2.3 Qt Framework ...... 18

1.2.4 NodeJs ...... 20

1.2.5 CMUSphinx ...... 23

1.2.6 GTK ...... 24

1.2.7 Felgo ...... 24

1.2.8 Kivy ...... 25

1.2.9 ElectronJS ...... 26

1.2.10 Django ...... 26

1.3 Problem Statement...... 27

1.4 Requirements ...... 27

1.4.1 Main requirements ...... 28

1.4.2 Current vs proposed system ...... 28

1.4.3 Why Qt and NodeJS ...... 30

2 State of Art ...... 32

2.1 BlackPearl: Extended Automotive Multi-ECU Demonstrator Platform ...... 32

2.2 Adaptive User Interaction ...... 34

2.2.1 A Framework for Adaptive User Interface Generation based on User Behavioural Patterns ...... 34

2.2.2 A RESTful Architecture for Adaptive and Multi-device Application Sharing...... 35

2.2.3 Model-based adaptive user interface based on context and user experience evaluation ...... 37

2.3 Speech Assistance ...... 41

3 Concept ...... 43

3.1 Main Concept ...... 43

3.2 Adaptivity and Configurability ...... 46

3.3 Smart Queue ...... 47

3.4 User Interaction ...... 48

3.4.1 Remote UI ...... 48

3.4.2 BlackPearl Display ...... 49

3.4.3 Voice UI ...... 50

3.4.4 Gamepad ...... 51

4 Implementation ...... 53

4.1 Main implementation ...... 53

4.2 Application Programming Interface (API) ...... 54

4.3 Rules Management ...... 56

4.4 BlackPearl Server...... 57

4.4.1 Main Server ...... 58

4.4.2 Web Interface ...... 59

4.4.3 Custom App ...... 59

4.4.4 Web Interface Settings ...... 60

4.4.5 Camera Server ...... 62

4.5 Smart Queue ...... 63

4.6 ...... 65

4.7 User Interface ...... 71

4.7.1 Remote User Interaction ...... 72

4.7.2 Voice User Interaction ...... 74

4.7.3 Android Application ...... 75

5 Results ...... 76

5.1 Performance ...... 76

5.1.1 Application Framework ...... 76

5.1.2 BlackPearl Server ...... 79

5.2 Usability ...... 84

5.3 Features ...... 86

5.4 Limitation ...... 86

6 Conclusion and Further Work ...... 88

6.1 Conclusion ...... 88

6.2 Further Work ...... 89

Bibliography ...... 91

A. Appendix A – Custom Qt framework ...... 94

B. Appendix B – Remote Deploy ...... 98

C. Appendix C – Code Snippets ...... 100

A. BlackPearl Server...... 100

B. Camera Server ...... 103

C. QML Snippet ...... 103

D. Appendix D – QR Codes ...... 105

Selbstständigkeitserklärung...... 106

6

List of Figures

Figure 1.1: YellowCar structure from [1]...... 14
Figure 1.2: Client-Server interface structure from [1, 28]...... 15
Figure 1.3: The BlackPearl [1]...... 15
Figure 1.4: Modules of the BlackPearl from [1]...... 16
Figure 1.5: BlackPearl Sensor Unit Concept [1]...... 17
Figure 1.6: BlackPearl Sensor Unit implementation [1]...... 17
Figure 1.7: Modules of the BlackPearl from [2]...... 18
Figure 1.8: Qt Creator screenshot...... 20
Figure 1.9: Monolithic architecture vs Microservice architecture in a nutshell [6]...... 21
Figure 1.10: Blocking I/O vs Non-Blocking I/O...... 22
Figure 1.11: Sphinx 4 Architecture [8]...... 23
Figure 2.1: System design and Main servers [14]...... 34
Figure 2.2: Webpage Heatmaps [14]...... 35
Figure 2.3: Conceptual view of the (-MVC) architecture [15]...... 36
Figure 2.4: Interaction pattern while initialization [15]...... 36
Figure 2.5: The experimental environment (top), User desktop (left), and handheld device (right) [15]...... 37
Figure 2.6: Mining Minds Platform [16]...... 38
Figure 2.7: User, context, and device models [16]...... 39
Figure 2.8: Adaptive behaviour data flow [16]...... 39
Figure 2.9: Low vision scenario (left) and user dashboard (right) [16]...... 40
Figure 3.1: Main Concept...... 44
Figure 3.2: Main Concept 2...... 45
Figure 3.3: Alternative Main Concept...... 45
Figure 3.4: BlackPearl API Concept...... 46
Figure 3.5: Smart Queue Concept...... 48
Figure 3.6: BlackPearl Server Concept...... 49
Figure 3.7: BlackPearl Display Concept...... 50
Figure 3.8: Sparrow Concept...... 51
Figure 3.9: BlackPearl Display Concept...... 52
Figure 4.1: BlackPearl Main Implementation...... 54
Figure 4.2: BlackPearl API...... 55
Figure 4.3: BlackPearl API interface...... 56
Figure 4.4: BlackPearl main server...... 58
Figure 4.5: Web Interface Custom Application...... 59
Figure 4.6: BlackPearl Server Frontend...... 60

Figure 4.7: BlackPearl Server Settings implementation...... 61
Figure 4.8: BlackPearl Server Settings...... 62
Figure 4.9: BlackPearl Server Status...... 62
Figure 4.10: Camera Server Implementation...... 62
Figure 4.11: Smart Queue CAN-Bus Write rule...... 63
Figure 4.12: Smart Queue CAN-Bus Read rules...... 64
Figure 4.13: Pi Display Application framework...... 65
Figure 4.14: BlackPearl Display Menu...... 66
Figure 4.15: BlackPearl Display Home...... 66
Figure 4.16: BlackPearl Display Car...... 67
Figure 4.17: Dynamic GUI via JSON...... 68
Figure 4.18: Display Dynamic GUI default...... 68
Figure 4.19: Display Dynamic GUI updated...... 68
Figure 4.20: BlackPearl Display Settings...... 69
Figure 4.21: BlackPearl Display Rules...... 69
Figure 4.22: QML Addable Function...... 71
Figure 4.23: BlackPearl Remote UI...... 72
Figure 4.24: BlackPearl Display Car...... 73
Figure 4.25: Sparrow Voice UI...... 75
Figure 4.26: Android Remote Controller...... 75
Figure 5.1: Memory Allocation Performance...... 77
Figure 5.2: Signal-Binding Performance...... 78
Figure 5.3: JavaScript Performance...... 78
Figure 5.4: Scene Graph...... 79
Figure 5.5: BlackPearl server busy...... 81
Figure 5.6: BlackPearl server...... 82
Figure 5.7: BlackPearl server CPU and Memory...... 83
Figure 5.8: BlackPearl server loops...... 83
Figure 5.9: BlackPearl server Sockets...... 84
Figure 5.10: BlackPearl server HTTP requests...... 84
Figure 5.11: End-to-End latency measurements...... 87
Figure 6.1: Qt Deploy Hotspot...... 90

8

List of Tables

Table 1: Felgo Licenses...... 25
Table 2: Requirements...... 28
Table 3: Current vs Proposed System...... 29
Table 4: Qt vs Felgo, GTK, and Kivy...... 30
Table 5: NodeJS vs ElectronJS and Python Web...... 31
Table 6: Advantages and disadvantages for this work [16]...... 40
Table 7: Speech assistance engine comparison...... 42
Table 8: CAN-Bus Rule's Elements...... 55
Table 9: Wifi Rule's Elements...... 55
Table 10: QML event category...... 77
Table 11: The New System Limitations...... 87

9

List of Code Snippets

Code snippet 1: JSON Smart Queue Rules...... 100
Code snippet 2: Ultrasonic Sensors...... 100
Code snippet 3: Smart Queue Write CAN-Bus Rules...... 101
Code snippet 4: Smart Queue Read CAN-Bus Rules...... 102
Code snippet 5: Lights Smart Queue Rule...... 102
Code snippet 6: Joystick function...... 103
Code snippet 7: Live Camera Streaming...... 103
Code snippet 8: JSON QML Dynamic GUI...... 104
Code snippet 9: C++ class integration in QML...... 104

10

List of Abbreviations

API    Application Programming Interface
CAN    Controller Area Network
CSS    Cascading Style Sheets
dbc    Database Container
ECU    Electronic Control Unit
GUI    Graphical User Interface
HTML   Hypertext Markup Language
HTTP   Hypertext Transfer Protocol
JS     JavaScript
JSON   JavaScript Object Notation
QML    Qt Modelling Language
REST   Representational State Transfer
SQL    Structured Query Language
UI     User Interaction
VM     Virtual Machine
WS     WebSocket
XML    Extensible Markup Language

11

1 Introduction

Most systems in the automotive industry are delivered with a predefined, fixed Graphical User Interface (GUI): all users work with the same view, without the ability to customize it or create their own. Likewise, developers cannot take advantage of the system's usability or integrate their custom applications without interfacing with the main system's source code. The Computer Engineering Department at the Chemnitz University of Technology has developed an automotive demonstrator platform that can be used for image processing, functional tests, performance evaluation, and optimization [1]. This platform has multiple Electronic Control Units (ECUs), an extra rack (CE-Box) with six slots for up to six single-board computers (Raspberry Pi 3 B+), and a touchscreen. The boards and the automotive demonstrator communicate with each other over Wifi and CAN-Bus. The platform has some critical drawbacks, which are introduced in this chapter and discussed in detail in chapter 2. Chapter 3 then introduces and explains the main concept of this thesis. The final approach and solution are implemented in chapter 4 and analyzed and evaluated in chapter 5.

In this chapter, the following topics are introduced comprehensively: Motivation (section 1.1), Technical Background (section 1.2), Problem Statement (section 1.3), and Requirements (section 1.4).

1.1 Motivation

The Computer Engineering Department at the Chemnitz University of Technology had already developed a working system with a Graphical User Interface (GUI) and a web server for the BlackPearl. Before this thesis was undertaken, that system was analyzed and tested, and the results showed two main drawbacks. First, neither the web server nor the development language (Python) is well suited to these tasks: the existing system wastes hardware resources and prevents optimization. Moreover, the web server has high latency; when a user of the Web Interface controls the car by pressing and releasing one or two of the "WASD" keyboard buttons, it takes about a second for the command to take effect. The most important drawback is the lack of adaptability and usability: users have no way to change the Graphical User Interface (GUI) view or to choose among more interaction options, and developers have to change the source code whenever they add new functions or applications or interface with new applications. Even adding or updating a CAN-Bus ID requires editing the source code and recompiling. The last drawback is the limited number of User Interaction (UI) methods, which are touch (the Pi display with a Kivy-designed GUI) and remote (a keyboard and HTML buttons). Consequently, the current server was optimized; it was, nonetheless, better to redesign and reprogram the whole web server from the start, and beyond that to redesign the whole system with a better architecture and the right programming languages for this thesis's challenge. The Qt framework was therefore chosen for the main application and NodeJS for the web server. Both are powerful, lightweight, and high-performance, and offer high usability and adaptivity.
Furthermore, the development programming language (Python) was replaced with QML and JavaScript to achieve better adaptivity and usability.

1.2 Technical Background

This section first introduces the two main automotive demonstrators developed by the Computer Engineering Department: the YellowCar in 1.2.1 and the BlackPearl in 1.2.2. It then gives a complete introduction to the Qt framework in 1.2.3, the NodeJS runtime environment in 1.2.4, and the CMUSphinx speech recognition engine in 1.2.5. Afterward, three well-known, robust, cross-platform Graphical User Interface (GUI) frameworks are briefly introduced: GTK in 1.2.6, Felgo in 1.2.7, and Kivy in 1.2.8. Finally, two web frameworks and runtime environments are introduced: ElectronJS in 1.2.9 and Django in 1.2.10. Based on this technical background, the main thesis requirements from table 2, and the comparison between the current and the proposed system in table 3, one GUI framework and one runtime environment are selected to fulfill the requirements and solve the problem of this thesis.

1.2.1 YellowCar

The first automotive demonstrator is the YellowCar, which the Computer Engineering Department developed to carry out testing functions, performance assessment, and the design, implementation, and optimization of hardware-independent software architectures. The YellowCar is based on an electric mini car for kids and has three Electronic Control Units (ECUs): the ProcessingECU, the FeatureECU, and the AssistantECU [1, 28]. The three Electronic

13

Control Units (ECUs) use the AUTOSAR standard, an automotive software architecture standard used by many well-known automotive companies [1]. As shown in Figure 1.1, the ECUs in the YellowCar communicate with each other via a CAN-Bus network (125 kbit/s). The ProcessingECU reads the signals from the sensors and passes them to the FeatureECU and the AssistantECU. With the sensor data from the ProcessingECU, the AssistantECU demonstrates and controls the movement actuators, i.e. it executes drive control functions and operations like driving or steering the YellowCar. The FeatureECU is responsible for controlling the lights and indicators, i.e. right, left, warning, and the low beam [1]. Apart from these three ECUs, which are built on the AUTOSAR standard, the YellowCar has two extra non-AUTOSAR ECUs, the Raspberry Pi and Planet boards [1].

Figure 1.1: YellowCar structure from [1].

As shown in Figure 1.2, the YellowCar has a remote-control functionality based on a client-server interface. A CAN-Bus device (TinyCAN) is needed to classify and visualize the data received from the YellowCar over the wireless network, while at the same time granting the user the ability to control the car.

14

Figure 1.2: Client-Server interface structure from [1, 28].

1.2.2 BlackPearl

Figure 1.3: The BlackPearl [1].

This thesis is implemented on the automotive demonstrator BlackPearl, which is based on the previously mentioned YellowCar and likewise on an electric kids' car. However, the BlackPearl focuses mainly on image processing jobs, and for this reason an extra rack (CE-Box) is installed on it. The BlackPearl was designed in the Computer Engineering Department for testing, performance assessment, and optimization of hardware-independent implementations of Advanced Driver Assistance Systems (ADAS) functions. Because image processing sensors and devices require high processing power, the BlackPearl is equipped with a multi-board rack (CE-Box) [1]. The BlackPearl has six ultrasonic sensors used for distance measurements, installed three on the front side and three on the back. Additionally, it has speed sensors that provide control information, and cameras for the image processing board. Furthermore, the BlackPearl has controllable lights and actuators, the latter consisting of two DC electric motor gearboxes that control driving and steering [1].

Figure 1.4: Modules of the BlackPearl from [1].

As shown in Figure 1.4, all Electronic Control Units (ECUs) communicate with each other via the CAN-Bus network. The automotive demonstrator rack (CE-Box) hosts four main modules that communicate over the CAN-Bus. First, the sensors and the actuators: the sensors read data values, which are forwarded to the actuators. Second, the network of control units, which implements the general control of the system and the autonomous driving functions. Third, the multi-core processors, which perform the image processing, including the image pre-processing; the imaging devices, such as cameras, are connected to this module. Fourth, the Raspberry Pi display, which shows status, visualization, and control information. Up to six ECUs can be installed in the CE-Box, and in the future it may support more units. As shown in Figure 1.5, the sensor unit of the BlackPearl has an input circuit with three types of input. The first input is the environmental sensors, which include ultrasonic sensors (USS), Light Detection and Ranging (Lidar), and other detection sensors. The second input is the vehicle sensors, which include the motor encoder, battery, steering sensor, and others. The third input is a mixture of interaction inputs, such as buttons, switches, and other user interaction parts. The output circuit is responsible for the actuators, like steering and driving, as well as for changing the car's LED lights, e.g. mono LED, RGB LED, or LED bar.

Figure 1.5: BlackPearl Sensor Unit Concept [1].

Figure 1.6 shows the implementation of the processing module, sensors, steering, and motor driver of the BlackPearl [1].

Figure 1.6: BlackPearl Sensor Unit implementation [1].

17

1.2.3 Qt Framework

Figure 1.7: Modules of the BlackPearl from [2].

Qt [3] is a cross-platform framework, which means the developed program can run on a variety of devices and operating systems (such as Linux, Windows, macOS, iOS, Android, and Windows Mobile). The idea of Qt is to code once, then build and run anywhere. Qt is not a standalone language: it is written in C++, and it enhances and boosts the C++ language with features like signals and slots. All common C++ compilers, like GCC, Clang, MinGW, and Microsoft Visual C++ (MSVC), can compile it [2, 3]. Furthermore, the framework comes with a native build tool, "qmake", which is a cross-platform frontend for platform-native build systems like Make, CMake, and MSVS. Qt also has its own UI toolkit, Qt Quick, introduced with the Qt 4.7 version [2, 3]. In March 2009, Qt released its own Integrated Development Environment (IDE), called "Qt Creator"; developers can nevertheless use the Qt framework with other IDEs, like Microsoft Visual C++, or with a normal text editor [2]. The Qt framework focuses not only on the Graphical User Interface (GUI) but also ships great libraries (modules) for cross-platform solutions and development, such as database plugins, networking, graphics (e.g. OpenGL), sensors, web, protocols (like Serial, NFC, and Bluetooth), and many other plugins and add-ons. Moreover, Qt is available under two types of licenses: the Community license, which is free and used under the GPL and LGPL licenses, and the Commercial license.

18

QML

Qt introduced this markup language in 2009 along with the Qt Quick toolkit. QML [3] stands for Qt Modelling Language and is mainly used to define Graphical User Interfaces (GUIs), signals, and controls. The main reasons for developing it were the need for touch input support, easy 2D and 3D rendering, and animation support. QML builds on the Qt Quick toolkit and supports JavaScript through a custom V4 JavaScript engine, and it can be connected with C++ classes and functions for back-end integration. A QML file/document defines "Qt Quick" components and control elements in a hierarchical tree [3].

Basic syntax

This example shows how a simple QML application with the Qt framework is implemented:

import QtQuick 2.15

Rectangle {
    id: canvas
    width: 250
    height: 200
    color: "white"

    Image {
        id: logo
        source: "img/TUClogo.png"
        anchors.centerIn: parent
    }

    MouseArea {
        anchors.fill: parent
        onClicked: {
            console.log("Clicked!") // JavaScript printing command
        }
    }
}

This example draws a window with a white background and the university logo centred in it; when the user clicks anywhere in the window, the application prints a message.

19

Qt Creator

Qt Creator [4] is an advanced cross-platform Integrated Development Environment (IDE) designed mainly to assist QML developers as well as new Qt users. It offers many features that boost developer productivity: a code editor with C++, QML, JavaScript, and ECMAScript support, quick code navigation tools, source code refactoring, a visual debugger, a Graphical User Interface (GUI) designer, and a code analyzer. Additionally, it can establish a remote deploy connection between the development computer and the target device.

Figure 1.8: Qt Creator screenshot.

1.2.4 NodeJS

NodeJS [5] is written in the C++ programming language. It is defined as a cross-platform runtime environment rather than a library or framework, and it is based on the Google Chrome JavaScript engine V8 to provide optimum performance and memory management. It was released in 2009 as an open-source project. Although the native development language for NodeJS is JavaScript, NodeJS supports applications and modules ported from other languages like C++ and Python. NodeJS comes with a great set of functionalities, like HTTP, HTTPS, NET, TLS, and DNS, and modules such as OpenCV and CAN-Bus. In addition, NodeJS provides many APIs, e.g. for document operations, databases, and JavaScript Object Notation (JSON). NodeJS is designed around an event-driven, non-blocking input/output model to boost the efficiency of the runtime environment, and it can carry out many real-time tasks on multiple connected devices.

NodeJS is mostly used to create and develop web servers. One main difference between NodeJS and other web programming languages (e.g. PHP) is that NodeJS runs code in parallel and uses "callbacks" to signal errors, success, or completion. The strength of NodeJS shows in six fields: powerful stack, performance, scalability, package manager, support, and absolute JavaScript Object Notation (JSON) support. The first strength is the powerful stack: NodeJS comes with a complete JavaScript development stack offering efficiency, code reusability, high-speed performance, a huge developer community, and free modules, which leads to adaptable, robust, and time/resource-saving development. The second strength is performance: NodeJS is fast compared with other web programming languages like PHP, Python, and Go, thanks to three main factors: first, the Google Chrome V8 engine; second, non-blocking input/output (clarified later in detail) with asynchronous request handling; and lastly, the event-based model. Third, scalability means that the application can be divided into very small services instead of one application doing all the calculations and running all the functions (monolithic vs microservices [6]) [5].
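The "callbacks" convention mentioned above can be sketched in a few lines. This is a minimal illustration, not code from the thesis: readSpeed is a hypothetical asynchronous helper, and the returned values are dummies; only the error-first (error, result) signature is the actual NodeJS convention.

```javascript
// Sketch of NodeJS's error-first callback convention.
// "readSpeed" is a hypothetical stand-in for any asynchronous query.
function readSpeed(sensorId, callback) {
  setImmediate(function () {
    if (sensorId < 0) {
      callback(new Error('unknown sensor'));        // error first, no result
    } else {
      callback(null, { sensor: sensorId, speed: 42 }); // null error, then data
    }
  });
}

readSpeed(1, function (err, data) {
  if (err) {
    console.error('failed:', err.message);
  } else {
    console.log('speed:', data.speed);
  }
});
```

Every asynchronous NodeJS API follows this shape, which is why a caller can always distinguish success from failure by checking the first argument.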

Figure 1.9: Monolithic architecture vs Microservice architecture in a nutshell [6].

Fourth, the package manager is one of the most powerful aspects of NodeJS; it is known as the Node Package Manager (NPM). NPM uses the "npm registry" to find free and paid modules and add them to a NodeJS project.

21

Fifth, the number of companies using NodeJS is increasing, and many top companies, such as PayPal, Netflix, and Uber, rely on it. Lastly, and most importantly for this thesis, comes the flawless and absolute support for JavaScript Object Notation (JSON) arrays and files. Most web programming languages (e.g. PHP) save user and system data in databases like MySQL and use JSON arrays only to pass data between pages or between frontend and backend. For NodeJS, however, it is optimal to save the data in JSON format instead of a database, which preserves the efficiency and high performance; JSON is also heavily used to pass data between applications and functions. To improve performance and efficiency further, NodeJS can load native C/C++ programs directly into the application. These programs are built with the right C/C++ compilers and the required NodeJS libraries/headers and are added as "Add-ons" with the help of a C API called "N-API". Returning to the non-blocking factor of NodeJS: as already mentioned, this runtime environment uses native JavaScript, which in theory means a single-threaded event loop. This limitation is compensated by the non-blocking input/output method, which also makes multi-threading unnecessary, because multiple requests can be received and answered simultaneously [5, 6]. As shown in Figure 1.10, two users, Tim and Bob, want to send a request at the same time to the NodeJS backend, for example to print their maximum driven speed. With the blocking input/output method, Tim would wait until the server responded and printed the requested value; only when Tim's task finished could Bob's request be served.
With the non-blocking input/output method, on the other hand, both users send their requests and wait for their responses in parallel.
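The Tim-and-Bob scenario from Figure 1.10 can be sketched directly in NodeJS. This is an illustrative sketch, not thesis code: lookupMaxSpeed, the delays, and the returned value are all assumptions standing in for a real backend query.

```javascript
// Sketch of non-blocking I/O: two requests are "in flight" at once.
// "lookupMaxSpeed" is a hypothetical async stand-in for a backend query.
function lookupMaxSpeed(user, delayMs, callback) {
  setTimeout(function () {
    callback(null, user + ': max speed 30 km/h'); // dummy value
  }, delayMs);
}

const order = [];
lookupMaxSpeed('Tim', 50, function (err, line) { order.push(line); });
lookupMaxSpeed('Bob', 10, function (err, line) { order.push(line); });
// Bob's shorter query completes first even though Tim asked first:
// the event loop never blocks on an outstanding request.
```

With a blocking model, Bob's answer could never arrive before Tim's finished; here the completion order depends only on how long each request takes.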

Figure 1.10: Blocking I/O vs Non-Blocking I/O.

22

This example shows how a simple server with NodeJS is implemented:

var ServerHTTP = require('http');

ServerHTTP.createServer(function (request, response) {
    response.writeHead(200, {'Content-Type': 'text/plain'});
    response.end('Es lebt!!!');
}).listen(8000);

1.2.5 CMUSphinx

Most devices produced nowadays are equipped with a speech assistant, such as Alexa, Google Assistant, Bixby, or Siri. Some of these speech assistants work online and require an Internet connection to handle predefined or new commands.

CMUSphinx [7, 8], or Sphinx for short, was released by Carnegie Mellon University (CMU). Sphinx is an open-source Automatic Speech Recognition (ASR) system, written in Java (with Pocketsphinx written in C), and it uses Hidden Markov Models (HMMs). It is suitable for developing not only a stable system but also a multi-language one. In addition, it is free to use, provides good English support models and dictionaries, and, most importantly, can handle commands offline.

Figure 1.11: Sphinx 4 Architecture [8].

23

As shown in Figure 1.11, the architecture of Sphinx was developed with a high level of adaptivity and modularity: modules can easily be modified or removed, which lets developers experiment without the need to modify other modules or the core system. The main parts of Sphinx are the Front-end, the Feature, and the Linguist. First, the Front-end is responsible for Digital Signal Processing (DSP) on incoming audio commands. Second, the output of the Front-end is called a Feature, and features are used for decoding. Finally, the Linguist is the knowledge base that assists the Decoder in finding the command that corresponds to the digital signal (audio command). The Linguist consists of three modules: the Dictionary, the Acoustic Model, and the Language Model.

• The Dictionary contains the pronunciations of the commands.
• The Acoustic Model contains the sounds of the commands, which the training model issues from acoustic data.
• The Language Model records the words of the commands and how often they are repeated and occur together.

The additional parts are the Search Graph and the Decoder. The Search Graph is constructed from the knowledge base: the Dictionary, Acoustic, and Language Models. The Decoder is the main part of the system: it handles the incoming data, processes the features coming from the Front-end, links them with the knowledge base, and carries out a search to identify the most likely words that match the features.

1.2.6 GTK

GTK [9] is a stable, open-source, cross-platform widget toolkit for designing Graphical User Interfaces (GUIs), written in the C language. It can be used with many programming languages, like C/C++, Python, JavaScript, Perl, and Rust. GTK uses the official GNOME bindings, which makes it particularly well suited to designing GUIs for Linux-based machines.

1.2.7 Felgo

Felgo [10] is a cross-platform framework based on Qt. It can develop cutting-edge applications with a native appearance for Windows, Linux, Android, iOS, embedded targets, and the web. The framework uses Qt QML, which is based on JavaScript, to design the Graphical User Interface (GUI), with the ability to add C++ functions and classes; the code is built and compiled using the native machine's build tools. Even though Felgo is based on the Qt framework, it adds extra libraries for mobile and game development and for rapid GUI development. Beyond that, Felgo extends the Qt framework with more than two hundred Application Programming Interfaces (APIs). Felgo uses the same IDEs as Qt: Qt Creator and Qt Design Studio. With this framework, developers can build applications from a "single code-base" and in this way save up to 80% of coding and development time. It also provides cloud services, so developers can use QML with user authentication, real-time chat, and cloud synchronization.

License: Business | Startup | Personal
Price: €350 | €79 | free
Core Platforms (Mobile & Desktop): X | X | X
Web Platforms: X | X | -
Embedded Platforms: Raspberry, Arduino, QNX, VxWorks, i.MX6 & Toradex | Raspberry & Arduino | -
Develop Apps: X | X | X
Custom Splash Screen: X | X | -
Professional Timely Support: X | - | -
Source Code Access: X | - | -
Table 1: Felgo Licenses.

1.2.8 Kivy

Kivy [11] is a free, open-source, cross-platform, Python-based framework for Graphical User Interface development, and it supports only the Python programming language. In addition, it comes with multi-touch screen support by default, without the need to import any special libraries. Like Qt, its graphics library supports OpenGL and is based on Vertex Buffer Objects. Kivy uses the Kv language, which makes it very easy to develop a Graphical User Interface with a readable layout. The following simple snippet creates a loading dialog with a file chooser and an Open button.


#:kivy 1.11.1
<LoadDialog>:  # rule name assumed; it was lost in extraction
    BoxLayout:
        size: root.size
        pos: root.pos
        orientation: "vertical"
        FileChooserListView:
            id: filechooser
        BoxLayout:
            size_hint_y: None
            height: 30
            Button:
                text: "Open"
                on_release: root.load(filechooser.path, filechooser.selection)

The benefits of using Kivy:
 Easy to use and supports multi-touch screens.
 Cross-platform.
 Based on and written in Python, with ported libraries using Cython.
 Uses the OpenGL graphics library.
 Able to support C/C++ code using Cython.

On the other hand, the drawbacks of using Kivy:
 No natively rendered Graphical User Interface (GUI).
 Large package size, due to the Python interpreter, which has to be included.
 Lack of community support.
 Lack of good documentation and tutorials.
 Unstable Graphical User Interface (GUI) designer (Kivy Designer).

1.2.9 ElectronJS

ElectronJS [12] is an open-source, cross-platform framework that uses Chromium and NodeJS. It makes developing applications easy with the help of JavaScript, HTML, CSS, and the Chromium browser engine.

1.2.10 Django

Django [13] is a free and open-source web development framework for Python. It builds on Python and makes web development fast, robust, secure, and scalable. Furthermore, it comes with a powerful Representational State Transfer (REST) framework [27]. The advantages of using Django:
 Security: Basic, Session, Token, and RemoteUser authentication.
 Comes with a built-in administration interface.
 Scalability and flexibility.


1.3 Problem Statement

One of the main challenges facing Human-Computer Interaction developers is how to design and develop a Graphical User Interface that can adapt to the user of the system and to the software/hardware requirements without the need to reprogram, update, or change the software design. The current software was developed in Python and lacks three main qualities: optimization, usability, and adaptability. Whenever a new rule or CAN-Bus message ID needs to be integrated into the program, the developer is required to recompile and rebuild the program and then upload it to the BlackPearl, which costs time and money. In addition, the streaming server had a high latency, which makes including a streaming server useless for the users and a waste of time, money, and resources for the company. Furthermore, the remote control functions have a latency of approximately one second before a command is changed or stopped, which makes driving not safe enough when obstacles are around the car.

The solution proposed in this thesis is to create an API that can easily be implemented and integrated into other programming languages. This will make it easy for developers and for users without any programming background to add and modify rules and CAN-Bus IDs, and it gives other developers the ability to fetch the required data without using the CAN-Bus interface or reading, modifying, or reprogramming the main system. In this way, the proposed solution saves the company time, money, and resources, and at the same time makes it easy to update the system remotely without a physical installation or update. Furthermore, new apps can be installed in the main system without recompiling or rebuilding it.
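To make the idea of adding a rule without rebuilding concrete, the sketch below shows how such a rule could be expressed and validated client-side before being sent to the BlackPearl API. The field names, the 0x123 message ID, and the endpoint path are assumptions for illustration, not the thesis's actual schema.

```javascript
// Sketch: validate a hypothetical CAN-Bus rule object before POSTing it.
function isValidRule(rule) {
  return (
    typeof rule.name === "string" &&
    Number.isInteger(rule.canId) &&
    rule.canId >= 0 && rule.canId <= 0x7ff &&   // standard 11-bit CAN identifier
    Array.isArray(rule.data) &&
    rule.data.length <= 8 &&                    // classic CAN payload limit
    rule.data.every((b) => Number.isInteger(b) && b >= 0 && b <= 0xff)
  );
}

// A hypothetical rule for the front ultrasonic sensor.
const ultrasonicRule = {
  name: "ultrasonic-front",
  canId: 0x123,            // hypothetical message ID
  data: [0x00, 0x00],
};

// The rule could then be registered at runtime, without touching the server
// source, e.g. (hypothetical endpoint, not executed here):
//   POST http://blackpearl.local/api/rules  with JSON.stringify(ultrasonicRule)
```

The point of the validation step is that a malformed rule is rejected on the client before it ever reaches the CAN-Bus.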

1.4 Requirements

Chapter 1 introduced the Automotive Demonstrator and the frameworks, which will be compared on several aspects; based on these aspects, exactly one framework will be chosen for the touch interaction and one for the web server. In this section, the main requirements of the thesis are listed, a detailed comparison between the current and the proposed platform is given, and the reasons are explained why the Qt framework is chosen over GTK, Felgo, and Kivy, and why NodeJS is chosen over ElectronJS and Django, to solve the problem and fulfil the requirements of this thesis.


1.4.1 Main requirements

Current BlackPearl:
Technology requirements: Camera and data visualization; multiple user interaction methods; adaptivity and usability; easy setup and modification; adding new features without building and compiling; remote update and deployment.
User Interaction: Web Interface (keyboard).
Resources: CAN-Bus; WiFi.
Usability: Changes and updates need to be done in the source file, after which the application must be rebuilt.
Use cases: Testing; performance assessment; optimization of ADAS functions; image processing.
Drawbacks: Poor server performance; high camera frame drop; high server latency (900-1100 ms); very limited user interaction methods.
Table 2: Main Requirements.

As shown in table 2, the main requirements for this thesis are the visualization of data and camera streaming, introducing more user interaction methods, and solving the poor performance, high latency, and frame drop issues. In addition, new features and applications must be easy to add without interfacing with the platform source code, giving users and developers more usability and customization abilities.

1.4.2 Current vs proposed system

Based on the main requirements from section 1.4.1, table 3 was created to explain the need for developing a new system with better technology, which will achieve better performance and introduce more user interaction methods and features.


Feature: Current System | Proposed System
Programming Language: Python | QML and JavaScript with NodeJS
User Interactions: Web Interface | Web Interface (Remote); Touch UI (Touchscreen); Audio UI (Sparrow)
Control methods: Keyboard "WASD" | Virtual Joystick (touch or mouse); Gamepad; Voice commands (Sparrow)
Integration with CE-Box: No | CAN-Bus; WiFi
Communication: CAN-Bus; WiFi | CAN-Bus; WiFi; Ethernet; Lin; Bluetooth
New Apps and Features: No | Via Web Interface; via Pi display
Easy setup and modification: No | Yes, without the need for redeployment
Dynamic Signals: No | Using the Rule Manager, BlackPearl API, and the Smart Queue
Stream support: No | Yes, up to 3 cameras
Visualization: Only web interface | Camera and app visualization
Performance: OK | High
Latency: High | Very low
FPS drops: Mid to high | Low to semi-live
Security: Hotspot password | Hotspot password and restricted user access
Usability and adaptivity: No | Smart Queue; BlackPearl API
Support for future research applications: No | Image processing; autonomous driving; remote system deployment (section 6.2)
Table 3: Current vs Proposed System.


1.4.3 Why Qt and NodeJS

Feature: Qt [3] | Felgo [10] | GTK [9] | Kivy [11]
Public Release: 1995 | 2012 | 1998 | 2011
Custom rendered UI / widgets / animations: yes | yes | yes | yes
Native rendered UI: yes | no | yes | no
Programming Languages: QML, JS, C++, Python, Ring, Java, Rust, Go, C#, ADA, Haskell, and Lua | QML, JS, and C++ | C, C++, JS, Perl, Python, Crystal, Rust, Vala, D, ADA, Haskell, Lisp, Lua, R, and Ruby | Python
Development IDEs: Qt Creator, Qt Design Studio, and Visual Studio | Qt Creator, Visual Studio, and Web IDE | Anjuta, NetBeans, and text editors | PyCharm
Visual UI Designer: Qt Quick Designer | Qt Quick Designer | Glade | Kivy Designer
Remote App Update: yes | yes | yes, but not stable | no
Supported Platforms: iOS, Android, Desktop (Windows + Mac + Linux), Embedded (e.g. Raspberry Pi, Arduino, i.MX6, ...), and Web | iOS, Android, Desktop (Windows + Mac + Linux), Embedded (e.g. Raspberry Pi, Arduino, i.MX6, ...), and Web | Linux, Windows, OpenVMS, OS X, and Android (with limitations) | Linux, Windows, OS X, Android, iOS, and Raspberry Pi
Support: Enterprise support level & SLA available | Enterprise support level & SLA available | Github, GTK Blog, and Stackoverflow | Github, Stackoverflow, and forums
License: Commercial and community (under several versions of the GPL and the LGPL) | Felgo: closed-source free version, source code available for commercial customers; Qt: LGPL & Commercial | Free and open-source under the GNU License (LGPL) | MIT (free software)
Release Cycle: Released when needed and when quality is good enough, every 1-13 weeks | 6-10 updates per year, long-time-supported versions | Long-time-supported versions | Long-time-supported versions
Main Companies: Mercedes-Benz, Tesla, Peugeot, Parrot Automotive, and German Air Traffic Control | T-Mobile, Qt, Siemens, Red Bull, Cyan, Mediamarkt & Saturn | - | -
Table 4: Qt vs Felgo, GTK, and Kivy.


Table 4 helps us decide why this thesis uses the Qt framework over the other frameworks mentioned above (GTK, Felgo, and Kivy), based on criteria such as adding and updating applications without rebuilding and compiling, cross-platform support, touch-screen support, streaming, cost, etc. Table 5 shows the comparison between the web frameworks and why NodeJS was chosen over ElectronJS and Django.

Feature: NodeJS [5] | ElectronJS [12] | Python web [13]
Public Release: 2009 | 2013 | 2005
Architecture: Event-driven model | Event-driven model | Model-template-view
Custom rendered UI / widgets / animations: yes | yes | yes
Native rendered UI: yes | no | no
Programming Languages: C, C++, and JavaScript | C++, JavaScript, Objective-C++, and Objective-C | Python
Development IDEs: Visual Studio, Sublime Text, and Adobe Dreamweaver | Visual Studio, Atom, Sublime Text, and Adobe Dreamweaver | PyCharm
Visual UI Editor: Adobe Dreamweaver | Adobe Dreamweaver | -
Remote App Update: yes | yes | yes
Supported Platforms: iOS, Android, Desktop (Windows + Mac + Linux), Embedded (e.g. Raspberry Pi, Arduino, i.MX6, ...), and Web | Linux, Windows, OS X, Android, iOS, and Raspberry Pi | Linux, Windows, OS X, Android, iOS, and Raspberry Pi
Support: Github, Stackoverflow, and forums | Github, Stackoverflow, and forums | Github, Stackoverflow, and forums
License: MIT (free software) | MIT (free software) | 3-clause BSD (free software)
Release Cycle: Long-time-supported versions | Long-time-supported versions; new versions every 2-4 months | Long-time-supported versions
Main Companies: Amazon Web Services, IBM, PayPal, Microsoft, and Netflix | Netflix, Facebook Messenger, Twitch, Slack, and InVision | Instagram, Spotify, Youtube, Dropbox, and Pinterest
Table 5: NodeJS vs ElectronJS and Python Web.


2 State of the Art

Chapter 1 introduced the automotive demonstrator, called the BlackPearl, and its first generation, the Yellow Car; it then gave the technical background and listed the main requirements for this thesis. Chapter 2 introduces the related work and compares it with the proposed solution.

The focus of this thesis is divided into three parts:
 Adaptivity.
 User Interaction.
 Usability.

Section 2.1 introduces the current work from the Computer Engineering department, which is already implemented on the BlackPearl, and then addresses the drawbacks and advantages of this system. Section 2.2 introduces three related works, which were implemented in NodeJS and Python. The last section, 2.3, introduces three speech assistance frameworks and engines and compares them with the framework proposed for this thesis.

2.1 BlackPearl: Extended Automotive Multi-ECU Demonstrator Platform

The current system was developed in the Python programming language and the Graphical User Interface was designed using the Kivy framework. This system consists of three components:

 BlackPearl Server
This server was written in Python with Web Interface support, which allows the users of this application to control the BlackPearl and read the ultrasonic sensor information. On the Web Interface, users can control the car using only the keyboard keys "WASD" on a computer or on a Bluetooth keyboard connected to a phone or tablet. Handheld users can otherwise only change the car lights and their colors by clicking HTML buttons, or read the car's front and rear ultrasonic sensors. In addition, the server has a high latency that makes driving the car neither stable nor safe: the car will always stop or change direction after almost one second. The web server uses JavaScript Object Notation (JSON) arrays to communicate with the main Python file to send CAN-Bus messages. The CAN-Bus functions are written manually, which means the developer has to update and change the message IDs or the message body in the source code itself, then compile and deploy the server. Additionally, the server has no filters or rules, which means any incoming message can be sent or forwarded to the CAN-Bus device and then to the car.

 Camera Server
The Camera Server is also written in Python, using the OpenCV library for image capturing, recognition, and processing. This server works standalone, and there is no connection between the web server and this server. The current server suffers from high latency and Frames per Second (FPS) drops.

 Graphical User Interface on the Pi display
The Graphical User Interface (GUI) is designed using Kivy, a Python-based framework. The GUI is rather simple, and there is no connection to either the Camera or the Web Interface server. The car data is fetched manually via CAN-Bus.

This means all these components work standalone and each component has its own CAN-Bus connection.
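The JSON-array-to-CAN-message translation performed by the current BlackPearl Server can be sketched as follows. The array layout (message ID followed by data bytes) and the 0x123 ID are illustrative assumptions; the thesis does not document the exact wire format.

```javascript
// Hedged sketch of JSON-to-CAN translation: a JSON array received from the
// web interface is packed into a raw frame buffer for the CAN-Bus device.
function jsonToCanFrame(msg) {
  const [id, ...bytes] = msg;            // assumed layout, e.g. [0x123, 0x10, 0x20]
  if (bytes.length > 8) throw new Error("classic CAN carries at most 8 data bytes");
  const frame = Buffer.alloc(2 + 8);     // 2 bytes for the ID + 8 data bytes (zero-padded)
  frame.writeUInt16BE(id, 0);
  Buffer.from(bytes).copy(frame, 2);
  return frame;
}
```

For example, `jsonToCanFrame([0x123, 0x10, 0x20])` yields a 10-byte buffer whose first two bytes encode the message ID.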

This thesis proposes a solution designed and implemented using optimized JavaScript and the NodeJS runtime environment. The proposed BlackPearl Server introduces more user interaction methods, optimization, better performance, lower latency, and usability. The current system uses only a web interface as the user interface to control the car and the Pi display to demonstrate simple functionality. The proposed system, on the other hand, provides a Web Interface, voice commands, and a Gamepad as user interaction methods, and uses the Pi display to demonstrate all possible functionality and user-developed applications. Furthermore, the new Camera server works standalone like the current one, but it can be integrated into any programming language that supports WebSockets. Additionally, the new Graphical User Interface (GUI) will be implemented using the Qt Modelling Language (QML), and it can be extended with extra applications, which the developer can add at runtime. The main challenge, introduced as a new feature, is to develop and implement a Smart Queue and an Application Programming Interface (API), which will make it easy for the BlackPearl components to communicate with all applications connected to the BlackPearl hotspot (WiFi), and will allow developers to integrate the BlackPearl modules, components, and data into their applications easily. In addition, developers are not required to have any experience in implementing the CAN-Bus connection or interface in their applications, nor in implementing the camera streaming module.
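Since the thesis only states that the new Camera server can be consumed from any WebSocket-capable language, the sketch below shows what such an integration could look like from NodeJS. The URL, port, and message shape (`{ camera, frame }` with a base64 JPEG) are assumptions, not the server's documented protocol.

```javascript
// Hedged sketch of a third-party camera consumer over WebSockets.
// Assumed message shape: { camera: 1, frame: "<base64-encoded JPEG>" }.
function parseCameraMessage(raw) {
  const msg = JSON.parse(raw);
  return { camera: msg.camera, frame: Buffer.from(msg.frame, "base64") };
}

function connectCamera(url, onFrame) {
  // Recent Node versions ship a global WebSocket client.
  const ws = new WebSocket(url);
  ws.addEventListener("message", (ev) => onFrame(parseCameraMessage(ev.data)));
  return ws;
}

// Usage (hypothetical URL, not executed here):
//   connectCamera("ws://blackpearl.local:8081/camera/1", ({ frame }) => {
//     console.log(`received ${frame.length} bytes`);
//   });
```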

2.2 Adaptive User Interaction

Three works related to Adaptive User Interaction were found. All three works focus on creating an adaptive user interface without the need to redeploy the main system, where the implementation has to be easy to set up and to modify, in order to meet this thesis's main requirements from section 1.4.1.

2.2.1 A Framework for Adaptive User Interface Generation based on User Behavioural Patterns

The first work is the research paper [14], which presents an Adaptive User Interface generator based on analyzing the user's behavior. This framework uses the "Mean access time" [14], which means the less accessed and used components or features will be hidden or removed from the user's Graphical User Interface (GUI).

Figure 2.1: System design and main servers [14].

As shown in Figure 2.1, the framework consists of three main servers. The Main Server is implemented in the NodeJS runtime environment and is connected to a MongoDB database. The Dashboard Server can only be accessed by the website owners and provides resources to draw the Graphical User Interface (GUI). The third server is the Analytic Server, which is responsible for recording and registering the user's behaviour. This server is implemented using Python web frameworks (Flask and Django) and a PostgreSQL database; the data collected from the system's users is then analyzed using a machine-learning algorithm developed with Python SciKit. The framework triggers a JavaScript function in the web server when the mouse is moved, and these functions create heat-maps based on mouse movement, waiting time, and the number of clicks, as shown in Figure 2.2. [14]


Figure 2.2: Webpage Heatmaps [14].

The collected user-behaviour data (in milliseconds) is sent to the Main Server; if the value is zero, the user of the application performed a mouse click, otherwise the server receives the user's behaviour data. Afterward, the analysis is carried out using Python, and the trained model is created and stored in the database. Based on this trained model, and if the user was labelled, the "JavaScript controller" removes the unused components and features from the User Interface and renders only the used User Interface (UI) components. In summary, this framework is implemented mainly to resize and rearrange the Graphical User Interface (GUI) based on the user's behaviour, in order to improve usability and minimize GUI reconfiguration or redesign. Some users could perceive this related work as a privacy violation or spying tool, due to the collection of mouse movements, if this is not mentioned in the data privacy policy. [14]
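The heat-map idea from [14] can be illustrated with a small re-implementation: mouse positions are bucketed into a coarse grid, and each cell counts how often the pointer dwelled there. The grid size and the event shape are assumptions for illustration, not the paper's actual implementation.

```javascript
// Illustrative heat-map builder: count mouse-position samples per grid cell.
function buildHeatmap(events, cols, rows, width, height) {
  const grid = Array.from({ length: rows }, () => new Array(cols).fill(0));
  for (const { x, y } of events) {
    // Map pixel coordinates to a cell, clamping to the last row/column.
    const c = Math.min(cols - 1, Math.floor((x / width) * cols));
    const r = Math.min(rows - 1, Math.floor((y / height) * rows));
    grid[r][c] += 1;                 // one "dwell" sample per recorded event
  }
  return grid;
}

// In a browser this would be fed from a (throttled) mousemove listener, e.g.:
//   document.addEventListener("mousemove", (e) => samples.push({ x: e.clientX, y: e.clientY }));
```

Cells with high counts correspond to the heavily used UI regions that the framework keeps, while low-count regions are candidates for hiding.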

2.2.2 A RESTful Architecture for Adaptive and Multi-device Application Sharing

The second work presents an architecture for adaptive and multi-device application sharing using the Representational State Transfer (REST) architectural style [15, 27]. The proposed solution states that users have the ability to share Model-View-Controller-compatible interactive applications with remote devices, and that users can customize their interface design the way they want. The user interface and the application state/view are synchronized using an event-based system [15]. The architecture consists of three patterns: the Model, which holds the core features and the data; the Views, which show information to the user; and the Controllers, which process the user inputs. [15]

35

Figure 2.3: Conceptual view of the (R-MVC) architecture [15].

Figure 2.3 illustrates the conceptual design of the proposed approach, which consists of two standalone applications: the main application, which executes on the host and offers the functionality of both the model and the controller, and the user agent application, which runs on the remote device. This work uses the same Model-View-Controller pattern, but with an event-based mechanism: the Remote Model-View-Controller (R-MVC) [26] offers the same functionality to different devices and delivers the changes caused on the view by the model and the interactions as remote events [15].

Figure 2.4: Interaction pattern during initialization [15].

As shown in Figure 2.4, when the user agent application starts, it requests a view representation of the user interface from the main application. Afterward, the user agent application renders the user interface; once the initialization is finished, the server sends any subsequent changes [15].

Figure 2.5: The experimental environment (top), user desktop (left), and handheld device (right) [15].

Figure 2.5 shows the implementation of the conceptual design from figure 2.3. The environment was programmed with Python and the Qt framework. The experimental environment runs on a laptop, and the handheld device represents the user agent application. The user interface in both applications is mostly the same, but some widgets are rendered differently depending on the host platform. This implementation was tested with only one user; in the multiuser case, there is a chance that two users will get the same user interface view and representation [15].

2.2.3 Model-based adaptive user interface based on context and user experience evaluation

This work delivers a system with the ability to adapt the user interface based on given aspects like handicaps, device usage, and environmental aspects like light, noise, etc. The system is based on model-based user interface development (MBUID) [24, 25] to solve the user interface differences. The approach was implemented in the authors' ongoing project Mining Minds (MM), which is built to offer personalized sport and wellness services based on data collected from the users and a routine-monitoring service. Figure 2.6 shows the conceptual design of the Mining Minds platform. The platform consists of five layers: Data, Information, Knowledge, Service, and Supporting layer, and uses the "concept of curation at different levels in different layers". The figure also illustrates how the five layers are connected. The concept of this work is that the platform collects data from different types of resources such as videos, sensors, surveys, interaction trackers, etc., and based on this collected data the platform personalizes the user interface for the end-user [16].

Figure 2.6: Mining Minds Platform [16].

To create the adaptive user interface, development models and adaptation rules need to be generated. As shown in Figure 2.7, the adaptive user interface building tool consists of three models: user, context, and device. The user model is responsible for storing user information such as mobility, sensory abilities (touch, hearing, and sight sensitivity), disability, experience, physiological state, cognitive abilities (concentration, learning ability, and attention), and user experience (user interface experience feedback). This model checks the user's happiness with and acceptance of the user interface after the interface changes according to the user context have been applied. The second model is the context model, which is responsible for storing contextual aspects like light, event occurrences, and noise level. It has two main context classifications: physical context, and time and location context. The last model is the device model, which saves the user device's information like screen resolution and sensors. The two main classes for device classification are hardware, meaning input/output (I/O), touchscreen, sensors, memory, battery status, connectivity, other interaction modalities, etc., and software, covering operating system and browser information, supported markup languages, etc. The adaptive user interface rules generator is a web-based tool, which generates rules in an intuitive manner. These rules are generated based on contextual dimensions such as user, platform, and environment. The user has the ability to create adaptation rules in the form of Conditions-Actions: If (Condition Part) do (Action Part).
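The Condition-Action form "If (Condition Part) do (Action Part)" can be sketched directly as code. The rule fields and the low-vision example below are illustrative assumptions in the spirit of the paper's rules, not its implementation.

```javascript
// Sketch of a Condition-Action rule engine: each rule is a predicate over
// the context plus an action applied to the UI description.
function applyRules(rules, context, ui) {
  for (const rule of rules) {
    if (rule.condition(context)) rule.action(ui);
  }
  return ui;
}

// Hypothetical low-vision rule, loosely modelled on the scenario in [16]:
// if the user has low vision, enlarge the fonts.
const lowVisionRule = {
  condition: (ctx) => ctx.user.lowVision === true,
  action: (ui) => { ui.fontSize = Math.round(ui.fontSize * 1.5); },
};
```

For example, applying `lowVisionRule` to a UI with `fontSize: 12` for a low-vision user yields `fontSize: 18`, while other users' interfaces stay unchanged.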


Figure 2.7: User, context, and device models [16].

Figure 2.8 illustrates the user interface adaptation process. This process runs in real time, monitoring the user's context and reasoning about it. The monitoring starts the moment the user begins interacting with the system via the different interaction methods. First, the models are loaded, and based on these models the rules are generated by the adaptive user interface generator. Secondly, the rules are used in the user interface adaptation engine to render the adaptive user interface for the user. Finally, if there are any updates in the models or the rules, the User Interface is synchronized and updated to the latest version. [16]

Figure 2.8: Adaptive behaviour data flow [16].

Figure 2.9 shows two scenarios of adaptive user interface generation. The first scenario is the user dashboard, which shows that a user called John has registered his characteristics in the user model; after activating his account on the Mining Minds (MM) platform, he accessed only wellness services. This user has low-vision problems, and this is used by the Reasoner when the adaptation engine generates the adaptive UI based on Rule 4. In case of any changes or feedback, the user can add his feedback and send it to the developers to apply rule changes if needed. [16]

Figure 2.9: Low vision scenario (left) and user dashboard (right) [16].

The second scenario is to update the low-vision rule on John's interface by making the fonts bigger and easier to read. [16]

Advantages:
 Adaptive accuracy increased the users' efficiency.
 Generation of UIs at runtime.
 No need to redeploy the system.
 Newly added rules will not affect the system.
Disadvantages:
 Frequent UI changes are annoying for the users (negative overall reaction).
 Affects the learning ability of the system.
 Only a basic level of UI adaptation rules can be managed and generated.
Table 6: Advantages and disadvantages of this work [16].

The proposed solution differs from the related works [14, 15]: most of the developed applications are designed in advance, and in case of updates or changes the developers edit the application, rebuild it, and then deploy it in the form of an update or patch file. The approach from [16] was designed to give the platform the ability to redesign the Graphical User Interface (GUI) for each user based on their characteristics and interaction with the system; it does not allow the users or the developers to add new features or applications to the system. The proposed solution for the automotive demonstrator does not only rearrange or redesign the Graphical User Interface (GUI) like the works mentioned in [14, 15, 16], but also adds new features and applications at runtime to the BlackPearl Web Interface (NodeJS) as well as to the display application (QML). This solution makes the BlackPearl UI application work on any device that has a browser and a wireless connection to the BlackPearl hotspot. Moreover, the users of this Automotive Demonstrator can control the data received from the BlackPearl and can control the car in four different ways: remote, touch, voice commands, and Gamepad. In addition, the BlackPearl comes with an Application Programming Interface (API) that developers and permitted users can easily access to update or add new rules and configurations to the BlackPearl.

2.3 Speech Assistance

In this section, three well-known speech recognition engines and frameworks are compared with the proposed framework, CMUSphinx [7]:

 DeepSpeech [17]. This engine was developed by Mozilla based on a research paper from the Chinese company "Baidu". It is an open-source speech recognition engine that can run online or offline. The framework can recognize full texts and paragraphs, and it supports the Python, NodeJS, and C programming languages. Additionally, it uses TensorFlow for model training.

 Alibaba [18]. The Alibaba company provides live real-time speech recognition with ultra-high decoding speed and high accuracy.

 Google Assistant SDK [19]. Google provides this Assistant SDK as an online tool, and it can be used for experimental purposes only. The tool supports C++, Go, Java, NodeJS, and all platforms that support "gRPC" [19].

As previously mentioned in section 1.2.4, CMUSphinx is an open-source Automatic Speech Recognition (ASR) system; Sphinx is written in Java and Pocketsphinx in C. It uses Hidden Markov Models (HMMs) and is well suited for developing not only a stable system but also a multilanguage system. In addition, it is free to use, provides good English support models and dictionaries, and, most importantly, can handle commands without an Internet connection, which helps avoid privacy violations. The simple table below shows why CMUSphinx should be used for the voice-command user interaction part.


Feature: CMUSphinx [7] | DeepSpeech [17] | Alibaba [18] | Google SDK [19]
Open-source: yes | yes | no | no
Online / Offline: Offline | Online / Offline | Online | Online
Features: C/C++; lightweight; easy to train; supports many languages | C/C++, Java, and Python; powerful engine; wake-word support; own training model | Powerful; instant speech recognition; web support | Powerful; instant speech recognition; web support
Drawbacks: No stable web support; no wake-word support | Better for long sentences than commands; heavy on the RPi3 | Pricey; privacy issue; requires an Internet connection | Not free; privacy issue; requires an Internet connection
Table 7: Speech assistance engine comparison.


3 Concept

Introducing and proposing the thesis concept is the first milestone in solving the thesis problem. After making sure in chapter 2 that no existing solution covers the problem, this chapter proposes a concept design for the thesis problem; this conceptual design is implemented in chapter 4 and then analyzed for performance and efficiency in chapter 5.

Based on the requirements from table 2, the conceptual approach was designed to fulfil these requirements. First, easy setup and the ability to update the system without redeployment, including adding or removing features, signals, or custom applications. Secondly, adaptivity and usability, which make the system easy for users and developers to use and update. Thirdly, data and camera visualization, which is missing or incomplete in the current system. Fourthly, the ability to update or modify the system remotely. Finally, the design must take the current poor server performance and high latency into consideration.

This chapter explains systematically how the solution is designed so that it is ready to be implemented. It starts with the main concept (section 3.1) and Adaptivity and Configurability (section 3.2), followed by the Smart Queue (section 3.3) and, finally, the multiple user interaction methods (section 3.4).

3.1 Main Concept

As stated in the Master thesis description, the BlackPearl generally consists of three parts:
 ProcessingECU.
 Demonstrator.
 CE-Box.

Figure 3.1 describes the general concept of the main system. As shown in the proposed main concept, the BlackPearl consists of three components:
 Demonstrator.
 CE-Box.
 User Interactions.

The Demonstrator consists of a Raspberry Pi 3B and a Pi display. On the Demonstrator, the application framework for the Graphical User Interface (GUI) will be installed, including the car dashboard, live streaming from the Camera server, and the visualization of the data exchanged with the BlackPearl server.

The CE-Box has the Demonstrator fixed on top, and it can be extended by connecting up to six boards, such as the Camera server, the BlackPearl server, and other servers/boards. The communication between the Demonstrator and the CE-Box can be established via CAN-Bus, WiFi, Bluetooth, Ethernet, Serial, and Lin.

The user interaction between the users and the Demonstrator can happen in three ways: remote interaction using phones, tablets, computers, or a Gamepad; voice commands using the speech assistant "Sparrow"; and direct interaction using the touch screen on the CE-Box. Developers can access the BlackPearl modules via the Application Programming Interface (API), integrate these modules into their applications, or introduce new features and modules.

Figure 3.1: Main Concept.

The main part of this thesis is the Smart Queue. As shown in Figure 3.2, the BlackPearl system will be executed and wait for the Smart Queue to initialize the CAN-Bus and WiFi rules; then the Graphical User Interface (GUI) will be drawn, and the user of this application can interact with the system using touch, voice, remote interaction, and the Gamepad as an extra.


Figure 3.2: Main Concept 2.

Another design was proposed, as shown in Figure 3.3. This conceptual design uses a Service-Oriented approach. When the BlackPearl turns on, the Smart Queue will check the configuration file and then check the boards or devices connected locally or remotely to the CE-Box; when the data are successfully fetched, the BlackPearl platform will run all applications and servers. In the background, the Smart Queue will always be running and waiting for the CE-Box or another device. If a new component is connected, this component will send an installation message (e.g. its Hardware ID) to the Smart Queue to register the new component as a new service. Afterward, the Smart Queue will update the BlackPearl platform according to the detected and installed services.

Figure 3.3: Alternative Main Concept.


However, after checking the requirements and the deadlines, the main concept from Figure 3.2 was chosen for this thesis.

3.2 Adaptivity and Configurability

As stated in the requirements of this master thesis, the main challenge is to develop an adaptive and configurable system that is easy for the users of the application to interact with, and easy for the developers to use via the API without the need to read or change the source code of the BlackPearl system. As shown in Figure 3.4, the BlackPearl Application Programming Interface (API) will run under the NodeJS runtime environment in parallel with the Web interface. This figure shows how the interaction between the users of this application, the developers, and the BlackPearl server will be designed and how the system will be configured accordingly. The users and the developers first need to connect to the BlackPearl's hotspot via WiFi. The users then call the BlackPearl Web Interface to use the system. The developers need to connect to the BlackPearl API in order to call the required data or to add or update the Smart Queue rules. For example, if a developer (Ellen) needs to include the ultrasonic sensor values or the camera container in her code, she only needs to include the message ID of the CAN-Bus that corresponds to the ultrasonic sensors, or the URL of the Camera server and its corresponding parameter. This process happens without the need to program the CAN-Bus of the BlackPearl or to write code for the live streaming camera container.

Figure 3.4: BlackPearl API Concept.


As for the Pi display, also known as “Touch User Interaction”, the application will come with three main standard applications, which this thesis discusses in detail in section 3.3.2. The developers have the option to install or add their own custom applications to the main application framework without the need to recompile or edit the source files of this platform. By default, they have permission to use the precompiled and included functions and classes from the application framework. For example, the developer “Ellen” programmed her own custom application to read and display the engine speed (revolutions per minute) and the recognized traffic signs, using the precompiled image processing framework and the API of the BlackPearl Server. The whole process is integrated without the need to program the Camera or the CAN-Bus, as in the previous scenario.

3.3 Smart Queue

In order to explain how the Smart Queue will be designed and how it will work on the BlackPearl Server, the idea and the concept behind the Smart Queue will be further elaborated. Figure 3.5 illustrates the main parts of the Smart Queue:
- Rules Database
- Smart Queue Filter

The Rules Database is the most important part, not only for the Smart Queue but also for the BlackPearl Server. It consists mainly of three tables: WiFi, CAN-Bus, and other tables that can be added in the future. The Smart Queue Filter has the role of checking and filtering the incoming and outgoing messages. For example, suppose the CAN-Bus receives three messages with the IDs 1, 2, and 5, while, as shown in Figure 3.5, the listed rules for the CAN-Bus contain only the IDs 1 and 2. The Smart Queue will check the three IDs against the Rules Table: if an ID is listed in the table, the Smart Queue will pass the message to the BlackPearl Server; otherwise, it will block it. In our example, the CAN-Bus message with the ID 5 will therefore be blocked. To avoid blocking new IDs, the developer needs to add the new messages to the Rules Table, along with a description of how these messages can be accessed by other developers. For now, the delete function was programmed and implemented; however, it is disabled until the test phase is finished successfully.
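The filtering behaviour described above can be sketched as a small JavaScript function. This is a simplified illustration only; the rule object layout and function names are assumptions, not the actual Smart Queue code.

```javascript
// Simplified sketch of the Smart Queue filter: a CAN message passes
// only if its ID appears as an active rule in the rules table.
// The rule objects mirror Table 8 in a reduced, hypothetical form.
const rules = [
  { msgId: 1, name: 'ultrasonic_front', status: 1 },
  { msgId: 2, name: 'ultrasonic_rear', status: 1 },
];

function filterMessages(messages, ruleTable) {
  const allowedIds = new Set(
    ruleTable.filter((r) => r.status === 1).map((r) => r.msgId)
  );
  // Messages whose ID is not listed (or whose rule is disabled) are blocked.
  return messages.filter((msg) => allowedIds.has(msg.id));
}

// The CAN-Bus receives messages with IDs 1, 2 and 5; ID 5 is not listed.
const incoming = [{ id: 1 }, { id: 2 }, { id: 5 }];
const passed = filterMessages(incoming, rules);
// passed contains only the messages with IDs 1 and 2
```

The same check applies symmetrically to outgoing messages, so unlisted IDs never reach the BlackPearl Server in either direction.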


Figure 3.5: Smart Queue Concept.

3.4 User Interaction

As mentioned above, the main User Interaction methods are:
1. Remote User Interaction (UI)
2. Voice User Interaction (UI)
3. Touch User Interaction (UI)
4. Gamepad

The four User Interaction (UI) methods will be explained comprehensively in this order: the Remote User Interaction (UI), also known as the “BlackPearl Web server”, in section 3.4.1; the Voice User Interaction in section 3.4.2; the Touch User Interaction, also known as the “Pi Display”, in section 3.4.3; and finally the Gamepad interaction in section 3.4.4.

3.4.1 Remote UI

After understanding the main concept, the concept of the BlackPearl Server (or the Remote User Interaction) can be described. It is our core component, and without it the system will not execute or initialize. As shown in Figure 3.6, the server consists of two main modules: the Web Interface and the Main server (or the server Backend). The web server can be accessed remotely via WiFi using phones, tablets, or computers, or locally via CAN-Bus or the touchscreen. On the web server, the user can check the vehicle status; the sensors, such as the ultrasonic sensors; the user interaction, which includes the touch steering wheel (virtual joystick) and the gamepad; the live camera streaming from the Camera Server, which is installed in the CE-Box; and finally the Smart Queue Rules Management System. Furthermore, the Main server (the Backend) has three subcomponents: the Communication application, the Smart Queue, and the JSON Qt test application. The Communication application has the role of controlling and filtering the incoming and outgoing messages and passing them to the Smart Queue. The Smart Queue application filters the messages and is responsible for managing the rules database, which can be altered and controlled using the Smart Queue Management System. Finally, the JSON Qt Test Connector application is used only for testing the functionality of the dynamic Graphical User Interface (GUI) application, which is installed on the Automotive Demonstrator main application; its role is to pass the Graphical User Interface (GUI) components in JSON and let the Qt main application redraw the GUI dynamically.

Figure 3.6: BlackPearl Server Concept.

3.4.2 BlackPearl Display

The second main interaction method is the Touch User Interaction (UI), also known as the Pi Display, which is programmed with an application framework. As shown in Figure 3.7, the application framework is divided into two parts. The first part contains the default applications, which are standard and which the user of this application can hide at runtime. The second part contains the addable applications, which allow the developers to add their own custom applications to the framework main application without the need to update, recompile, rebuild, or redeploy it.


Three default applications and a settings application come with the framework application. The first default application is the car dashboard, which shows the car speed (km/h), the engine information (rpm), and the live streaming camera. The second application shows live data from the front and the rear Ultrasonic Sensors (USS). The third application comes with a simple dynamic Graphical User Interface (GUI) to demonstrate another way of achieving adaptivity and configurability. Finally, the settings page or application shows the status of the system and offers the ability to change system values at runtime. As for the addable applications, the developer can add up to three custom applications and has permission by default to access the classes, libraries, and modules of the framework main application. At this stage, only three custom applications are allowed, for testing and performance purposes. After the implementation and test phases are finished, this limit might be extended to up to seven custom applications. The Pi Display is connected to the BlackPearl Server, the Smart Queue, and the Camera Server: the default applications get their values from the BlackPearl server rather than interfacing with the BlackPearl CAN-Bus directly, and the live streaming uses the same address as the BlackPearl Web Interface but a different port.

Figure 3.7: BlackPearl Display Concept.

3.4.3 Voice UI

The Voice User Interaction (UI) is a “nice-to-have” interaction method. For now, it will be executed on an external computer and the commands will be sent to the BlackPearl over WiFi, but the user input should eventually be handled by “Sparrow”. As shown in Figure 3.8, the speech assistant, also known as “Sparrow”, is designed to be lightweight, fast, and accurate. The running process of Sparrow is very simple. First, the application is executed and the user presses the microphone button. Secondly, Sparrow checks whether the user has a microphone. Thirdly, if a microphone is plugged in, the framework listens repeatedly to the microphone until the user gives a command. Fourthly, Sparrow checks the voice pattern and compares it with the trained dictionary files. Finally, if the command is recognized, it is translated into a text command and sent to the BlackPearl server. After the implementation and test phases are finished, Sparrow might be executed on the Pi Display itself, with a Bluetooth microphone for the commands instead of the standard 3.5 mm microphone jack.

Figure 3.8: Sparrow Concept.

3.4.4 Gamepad

The fourth method for User Interaction (UI) is the Gamepad. At this point, the PlayStation 4 controller is used to control and drive the car. The PlayStation 4 controller is connected to the computer or phone of the Web Interface user, who then has to grant the browser permission to access the USB or Bluetooth controller. Figure 3.9 illustrates the concept design for the gamepad workflow. First, the user calls the BlackPearl Web Interface. Secondly, the start engine command is sent via CAN-Bus to start the engine of the car. In case of an error, the Web Interface shows the error code or status in the browser console.


Thirdly, the user needs to select which control type to use; by default, the Virtual User Interface, also known as the “Virtual Joystick”, is activated. When the user activates the Sparrow interaction method, the car will get its commands from the voice assistant only. If the user activates the Gamepad, the browser will ask the user for permission to use the PlayStation 4 controller, and the user of the Web Interface can then send commands using the gamepad. In case of any error, the BlackPearl Web Interface will deactivate the failed interaction method and activate the Virtual User Interaction (Virtual User Interface). The Gamepad interaction function reads only the four arrow buttons of the PlayStation 4 controller: Up, Down, Right, and Left. The user can press one or two buttons at a time, although the Gamepad interaction function is able to read all controller inputs: fourteen buttons, two analog sticks, and one directional pad (D-pad).
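Reading the D-pad through the browser Gamepad API could look like the following sketch. In the W3C standard gamepad mapping the D-pad occupies button indices 12 to 15; the command strings and the function name are illustrative assumptions, not the actual BlackPearl protocol.

```javascript
// Sketch: map the D-pad buttons of a standard-mapping gamepad
// (indices 12..15 = up, down, left, right) to driving commands.
// The command names are hypothetical placeholders.
const DPAD = { 12: 'up', 13: 'down', 14: 'left', 15: 'right' };

function readDpad(gamepad) {
  // gamepad.buttons[i].pressed follows the browser Gamepad API shape.
  return Object.entries(DPAD)
    .filter(([index]) => gamepad.buttons[Number(index)].pressed)
    .map(([, command]) => command);
}

// Mock gamepad object: 16 buttons, with "up" and "right" pressed at once,
// standing in for navigator.getGamepads()[0] in the browser.
const mock = {
  buttons: Array.from({ length: 16 }, (_, i) => ({ pressed: i === 12 || i === 15 })),
};
const commands = readDpad(mock);
// commands -> ['up', 'right']
```

In the browser, such a function would be polled inside a requestAnimationFrame loop after the user grants gamepad access.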

Figure 3.9: Gamepad Concept.


4 Implementation

Based on the requirements from chapter 1, the related works from chapter 2, and the conceptual design from chapter 3, the proposed solution is implemented in this chapter, which provides comprehensive information on the implementation of the BlackPearl system and its components. The approach is divided into six main sections:
- Main implementation
- Application Programming Interface (API)
- BlackPearl Server
- Smart Queue and the Rules Management application
- Application framework
- User Interaction methods

These implementations are introduced and carried out in the following order. First, the main concept design from section 3.1 is implemented in section 4.1. Second, the adaptivity conceptual design from section 3.2 is implemented in section 4.2, and the configurability in section 4.3. Third, the BlackPearl and camera server designs are implemented in section 4.4. Fourth, the Smart Queue conceptual design from section 3.3 is implemented in section 4.5. Finally, the user interaction and interface from section 3.4 are split over two sections: the application framework (Pi display) is implemented in section 4.6 and the remote interaction in the final section 4.7.

4.1 Main implementation

Figure 4.1 illustrates the main implementation of the BlackPearl platform, which consists of three single-board computers and a touch screen. The BlackPearl has two main components, the CE-Box and the Pi display. In the CE-Box, two servers were installed using the NodeJS runtime environment. In slot 5, the camera server was installed; it runs one application, which captures the live stream from the USB camera and passes it over the WiFi connection via WebSocket (WS) to the BlackPearl server. This application can detect up to three USB cameras. In slot 6, the BlackPearl server was implemented, also using the NodeJS runtime environment; it comes with four main applications: Web Interface, Communication, Smart Queue, and the JSON dynamic GUI connector. In addition, the Web Interface application can be extended using the custom app function, which gives the developer the ability to add one web application. The last component is the Pi display, which is implemented using the Qt framework with the Qt Markup Language (QML) and JavaScript; it comes with four main applications that can be extended to up to seven applications with the help of the “Addable QML Apps” application. All BlackPearl components are connected via CAN-Bus and the WiFi network.

Figure 4.1: BlackPearl Main Implementation.

4.2 Application Programming Interface (API)

The Application Programming Interface (API) and the Smart Queue are the most important parts of this thesis. This section first explains the file structures and then how the API can be accessed locally and remotely. The API is written in JavaScript under the NodeJS runtime environment. In addition, the database and the data exchange are based on JSON arrays. As shown in Figure 4.2, the JavaScript Object Notation (JSON) database, and hence the API, consists mainly of two components: CAN-Bus and WiFi.


Figure 4.2: BlackPearl API.

The CAN-Bus component consists of:

Msg_ID: The CAN-Bus message ID in HEX format.
Rule's Name: The name of the signal/rule.
Access: This can be a single variable or an array.
R/W: Identifies the type of the message: Read (R), Write (W), or both Read and Write (R/W). Read means incoming, Write means sending or outgoing, and R/W means both incoming and outgoing.
Status: The status of the signal/rule: active (1) or disabled (0).

Table 8: CAN-Bus Rule's Elements.

The WiFi component consists of:

URL: The server address and the corresponding access port.
Rule's Name: The name of the signal/rule.
Access: This can be a single variable or an array.
R/W: Identifies the type of the message: Read (R), Write (W), or both Read and Write (R/W). Read means incoming, Write means sending or outgoing, and R/W means both incoming and outgoing.
Status: The status of the signal/rule: active (1) or disabled (0).

Table 9: WiFi Rule's Elements.
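Put together, a rules database following Tables 8 and 9 could look like the following JavaScript object. The concrete IDs, rule names, and sensor fields are illustrative placeholders, not the real BlackPearl rule set; only the camera server address ws://blackpearl:8000 is taken from the text.

```javascript
// Illustrative rules database combining the CAN-Bus (Table 8) and
// WiFi (Table 9) elements; all concrete values are placeholders.
const rulesDb = {
  can: [
    {
      msg_id: '0x001',            // CAN-Bus message ID in HEX
      rule: 'ultrasonic_front',   // rule/signal name
      access: ['s1', 's2', 's3'], // single variable or array
      rw: 'R',                    // R = incoming, W = outgoing, R/W = both
      status: 1,                  // 1 = active, 0 = disabled
    },
  ],
  wifi: [
    {
      url: 'ws://blackpearl:8000', // server address and access port
      rule: 'camera_stream',
      access: ['frame'],
      rw: 'R',
      status: 1,
    },
  ],
};
```

Storing both tables under one JSON object lets the Smart Queue and the API load the whole configuration from a single file.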


4.3 Rules Management

All these rules were imported from the BlackPearl dbc file; in the future, this import will be automated. Figure 4.3 shows the process of managing the rules for the Smart Queue. The rules or configuration management can be accessed under the Uniform Resource Locator (URL) http://blackpearl:3100. The application has four functions. The first is the print function, which lists all values in the JSON database. The second is the add_Rule function, which can add two kinds of rules: CAN-Bus and WiFi data. The third is the update or edit function, with which all signal/rule data can be updated. The fourth is the delete function, which is deactivated during the experimental phase.
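The print, add, and update functions described above can be sketched over such a JSON rules structure. This is a simplified in-memory illustration with assumed function and field names; the real application persists the rules to the JSON database file, and the delete function stays disabled.

```javascript
// Simplified in-memory sketch of the rules management functions:
// printRules lists all rules, addRule appends a CAN-Bus or WiFi rule,
// updateRule edits an existing rule by name.
const db = { can: [], wifi: [] };

function printRules() {
  return JSON.stringify(db);
}

function addRule(kind, rule) {
  if (kind !== 'can' && kind !== 'wifi') throw new Error('unknown rule type');
  db[kind].push(rule);
}

function updateRule(kind, name, changes) {
  const rule = db[kind].find((r) => r.rule === name);
  if (rule) Object.assign(rule, changes);
  return rule;
}

addRule('can', { msg_id: '0x001', rule: 'uss_front', rw: 'R', status: 1 });
updateRule('can', 'uss_front', { status: 0 }); // deactivate the rule
```

In the real system these operations would be exposed through the management interface at http://blackpearl:3100 rather than called directly.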

Figure 4.3: BlackPearl API interface.

For example, if a developer needs to access the front ultrasonic sensor data, the developer needs to use the WiFi class and then the WebSocket (WS) attribute, without the need to interface with the CAN-Bus. The live values of the front ultrasonic sensors can be integrated into the developer's application in one simple function instead of multiple lines of code. That means the developer can access the raw data using plain JavaScript code; the raw data can also be accessed using any other programming language that supports JavaScript Object Notation (JSON). As shown in code snippet 1, the JavaScript function consists of four points. Starting with point A, the developer creates a WebSocket (WS) connection between the developer's application and the Application Programming Interface (API). The developer then needs to use the Application Programming Interface (API) keyword


“CanRead”, or the connection application Uniform Resource Locator (URL) instead of the keyword; otherwise, the incoming CAN-Bus data will not be readable. In point B, the function updates the status flags and informs the system that the incoming data from the CAN-Bus are readable, i.e. that a connection between the system and the Communication application is established. In point C, the developer creates a variable called “sensors” as a JSON decoder. Finally, in point D, the developer creates a simple condition to get only the front ultrasonic sensor data using the last value in the JSON array: the value 1 corresponds to the CAN-Bus message ID 0x001 (front), and 2 to the rear ultrasonic sensor data with the HEX ID 0x002. Additionally, the Smart Queue only allows JSON data exchange between the Application Programming Interface (API) and the other applications for the signals and rules listed in the rules or configuration database.
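Since code snippet 1 is not reproduced here, the decoding logic of points C and D can be sketched as plain JavaScript. The message layout, with the CAN-Bus message ID as the last array value (1 for front, 2 for rear), follows the description above; the function name and the sample values are assumptions.

```javascript
// Sketch of points C and D: decode an incoming JSON message and keep
// only front ultrasonic data, identified by the last array value
// (1 -> message ID 0x001 front, 2 -> 0x002 rear).
function frontUltrasonic(rawJson) {
  const sensors = JSON.parse(rawJson); // point C: the JSON decoder
  const msgId = sensors[sensors.length - 1];
  // point D: pass only front-sensor payloads, drop everything else
  return msgId === 1 ? sensors.slice(0, -1) : null;
}

// Example payloads: sensor values followed by the message identifier.
const front = frontUltrasonic('[42, 55, 61, 1]');
const rear = frontUltrasonic('[80, 90, 75, 2]');
// front -> [42, 55, 61], rear -> null
```

In the full client, this function would run inside the WebSocket onmessage handler created in point A.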

4.4 BlackPearl Server

This section first explains the file structures and then elaborates how the server can be accessed and integrated. The BlackPearl Server consists of two main components: the Web Interface and the Backend, also known as the Main server. In order to activate the BlackPearl server, four applications need to be started: the main application, the Smart Queue, the Communication application, and the JSON Qt connector application. As shown in Figure 4.4, the backend application is implemented completely in JavaScript under the NodeJS runtime environment, and the Web Interface is based on the Hypertext Markup Language (HTML5) and JavaScript, styled with Bootstrap 4. The communication between the Web Interface and the backend uses JavaScript Object Notation (JSON) data for incoming and outgoing data exchange. In addition, the Camera Server can run in parallel on the same Raspberry Pi 3B or on another device. WebSockets (WS) were used due to their efficiency and the ability to send and receive data in real time, and because they are well implemented and optimized in both the Qt framework and NodeJS.


4.4.1 Main Server

Figure 4.4: BlackPearl main server.

Figure 4.4 shows how the main server is implemented. As mentioned in Figure 4.1, the BlackPearl server consists of four main applications. The first application is the web interface, which has seven main functions: engine, car control, custom app, demos, ultrasonic sensors, speed, and the web server settings. The second application is the communication application, which uses the Smart Queue rules to send and receive data via CAN-Bus and WiFi. The third application is the Smart Queue, which is responsible for message filtering and rules management based on the rules from the BlackPearl dbc file. The fourth and last application is the dynamic JSON Graphical User Interface (GUI) connector, which renders the JSON test QML page by sending the GUI elements in JSON array format. In addition, the BlackPearl server has a connection to the camera server via WebSocket (WS).

The NodeJS modules installed on the BlackPearl server are socketcan, express, and express-handlebars. The only extra module installed on the camera server is the ported version of OpenCV (opencv4nodejs). Even though NodeJS comes with a native Hypertext Transfer Protocol (HTTP) module that can handle all HTTP requests and responses, the framework “express.js” was installed and used instead of the native HTTP module. This framework allows us to use all modern web server functionality. In addition, it

supports templates, views, cookies, data handling, routing, dynamic data exchange, and Cross-Origin Resource Sharing (CORS).

4.4.2 Web Interface

After a successful connection to the BlackPearl hotspot, the user of the Web Interface can access the BlackPearl server under the Uniform Resource Locator (URL) http://blackpearl:3000. The Camera server can be accessed only via a WebSocket (WS) connection under the link ws://blackpearl:8000.

4.4.3 Custom App

Figure 4.5 demonstrates adding a simple application based on the Hypertext Markup Language (HTML) and JavaScript, which sends and receives a dummy car speed from the BlackPearl server in order to test functionality and connectivity. The developers need to call the hidden upload page to upload their HTML application to the BlackPearl Server, and then activate it from the Web Interface. At this point, the developers can upload only one custom application, to keep the server's performance high and its latency low.

Figure 4.5: Web Interface Custom Application.


4.4.4 Web Interface Settings

Figure 4.6: BlackPearl Server Frontend.

As shown in Figure 4.6, the screenshot shows the frontend of the BlackPearl Web Interface application. The users of this application have four tabs through which they can interact with the BlackPearl. The first tab is used to start the engine of the car. The second tab is used to control the car by sending commands to move it or change the car lights. The third tab is used to upload a custom application without the need to edit the “Web Interface” source file. Finally, the demo tab consists of four preprogrammed demos. To start the BlackPearl, the user needs to click or tap the Engine button; the Web Interface will send a JSON array using JavaScript to the backend application, and the Smart Queue on the BlackPearl server will get the CAN-Bus message ID from the configuration or rules table and then forward it to the Communication application. In this case, the CAN-Bus message ID 0x050 will be sent to the BlackPearl backend to start the engine. Afterward, the user can click on the Control tab and start controlling the BlackPearl. Additionally, the user can try four preprogrammed demos under the Demos tab, which are pre-installed on the BlackPearl itself. If the user of the Web Interface needs to activate new options, features, or configurations, the user can use the advanced or basic settings as shown in Figure 4.8. There are six options available. The first option is “Live Streaming”, which establishes a connection between the Web Interface and the Camera server using WebSocket (WS). If the user wants to add, change, or test the live streaming camera server, the user can enter a new WebSocket (WS) connection URL in the “WS address” text input field. The second option is “Ultrasonic Sensors”, which fetches only the incoming CAN-Bus messages that correspond to the ultrasonic sensors (ID 0x001 for the front and 0x002 for the rear sensors).
The third option is “Gamepad (PS4)”; by default, the virtual joystick is activated after the engine message is sent to the backend of the BlackPearl server. At this stage, PlayStation 4 gamepad support is added, and support for other types of remote controllers can be added in the future. The fourth option is “Sparrow”, which lets the user of the system control the BlackPearl with voice commands using QML and the CMUSphinx framework. The fifth option is “Speed”, which displays the live car speed. The last option is “Debug Mode”: if this option is active, the user or the developer gets the system errors, information, and warnings printed in the browser and the server console/terminal, and can send test or custom CAN-Bus messages to the BlackPearl. The user or the developer can also change the Smart Queue rules or configurations by pressing the “Rules” button. The user can check the status of the system, i.e. the active options, modules, and configurations, as shown in Figure 4.9.
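The engine start flow described above, where the Web Interface sends a JSON command and the Smart Queue resolves the CAN-Bus message ID from the rules table before forwarding it, can be sketched as follows. The rule name and payload shape are assumptions; the engine message ID 0x050 is taken from the text.

```javascript
// Sketch of the engine start flow: the Web Interface sends a JSON
// command, the Smart Queue looks up the CAN-Bus message ID in the
// rules table and forwards it to the Communication application.
const canRules = [
  { msg_id: '0x050', rule: 'engine_start', rw: 'W', status: 1 },
];

function resolveCommand(jsonCommand, rules) {
  const cmd = JSON.parse(jsonCommand);
  const rule = rules.find((r) => r.rule === cmd.rule && r.status === 1);
  // Only listed, active rules are forwarded; everything else is blocked.
  return rule ? { msg_id: rule.msg_id, data: cmd.data } : null;
}

const forwarded = resolveCommand('{"rule":"engine_start","data":[1]}', canRules);
// forwarded -> { msg_id: '0x050', data: [1] }
```

The frontend thus never needs to know the raw CAN IDs; it only refers to rule names, and the mapping lives in the rules database.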

Figure 4.7: BlackPearl Server Settings implementation.

Figure 4.7 shows the four main functions implemented under the web interface application. The first function is the debug function, which activates the logging mode and lets the developers send custom CAN-Bus messages. The second function is the option function, which has two values, ON or OFF. The third function is the input function, which covers the user interaction methods. The last function is the server status function, which shows the status of the server applications, such as the camera server, the CAN-Bus, etc.


Figure 4.8: BlackPearl Server Settings. Figure 4.9: BlackPearl Server Status.

4.4.5 Camera Server

The second server on the BlackPearl is the Camera server, which was already implemented in the previous system. However, due to the high latency and the lack of optimization, the Camera server was reimplemented using the OpenCV port for NodeJS. The application initially used Socket.io, but due to the high latency and the Frames per Second (FPS) drops, WebSocket (WS) was used instead.

Figure 4.10: Camera Server Implementation.


As shown in Figure 4.10 and code snippet 7, point A shows how the server initiates the OpenCV framework and then creates a Frames per Second (FPS) variable set to 45 FPS. In step B, the application defines the capture device, which is the USB camera (0). In step C, the application sets the capture device height and width; for our system, the lower, the better. Afterward, the application creates a WebSocket connection to send or stream the captured frames to the BlackPearl server, the QML application, or any other third-party application. In the final step, D, the application creates a loop that captures the frames as JPG and sends them as strings over the WebSocket to the BlackPearl Web Interface or any other third-party application.
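The serialization of step D can be sketched without the opencv4nodejs and ws dependencies. In the real server the JPEG buffer would come from the OpenCV encoder and be handed to the WebSocket; here a dummy buffer stands in, and base64 is an assumed choice for the string transport.

```javascript
// Sketch of step D: turn a captured JPEG frame into a string message
// for the WebSocket. A dummy buffer stands in for the opencv4nodejs
// capture result; base64 encoding is an assumption for the transport.
const FPS = 45;
const frameIntervalMs = 1000 / FPS; // delay between captured frames

function frameToMessage(jpegBuffer) {
  return jpegBuffer.toString('base64');
}

// Minimal stand-in buffer using the JPEG start/end markers.
const dummyFrame = Buffer.from([0xff, 0xd8, 0xff, 0xd9]);
const message = frameToMessage(dummyFrame);
// message is a plain string, ready to be passed to the WebSocket send call
```

Keeping the frames as strings lets the same payload be consumed unchanged by the Web Interface, the QML application, and third-party clients.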

4.5 Smart Queue

The Smart Queue is written in pure JavaScript with JavaScript Object Notation (JSON) arrays. The Smart Queue consists of two main files: communication and configuration, also known as queue management. The communication file is split into two files, corresponding to incoming and outgoing communication, to optimize the server, boost performance, and reduce latency. Additionally, these can be easily integrated into a third-party application or another web application. As shown in code snippet 3, the Smart Queue implementation can be described in five simple points.

Figure 4.11: Smart Queue CAN-Bus Write rule.

As shown in Figure 4.11 and code snippet 3, the first point is A: the NodeJS application needs to locate the configuration or rules JSON file, where all rules for both CAN-Bus and WiFi are defined in a readable text format. The Config JSON object has two objects: can and wifi. Without this file, the system will not work properly. Afterward, the number of CAN-Bus rules is determined. In point B, a “for” loop is created to iterate over all CAN-Bus rules for comparison and filtering. Point C consists of three sub-points: first, only the active and outgoing CAN-Bus rules are selected; then two variables are created: the outgoing buffer, which consists of the CAN-Bus message ID and body, and an array that stores how the data will be accessed based on the

BlackPearl API definition. Next, the application checks how many variables are under the CAN-Bus “access” component, and then uses another “for” loop to store the data coming from the BlackPearl Web Interface in the outgoing buffer that was created. Finally, in case the “access” component array has fewer than eight variables, an “if” condition fills the rest of the CAN-Bus message body with zeroes. In point D, the application stores the CAN-Bus message ID from the configuration JSON file, along with the incoming data that was stored in the “access” array, in the created outgoing buffer. The final point is E: the application first checks whether the incoming CAN-Bus message ID for the BlackPearl Server matches the message ID of the created buffer, and only then does the Smart Queue send the message to the BlackPearl CAN-Bus. In this way, no spam or unwanted messages are sent or forwarded to the CAN-Bus.
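The buffering and zero-filling steps of the outgoing (Write) path can be sketched as a single function. The 8-byte body follows the zero-fill description in the text; the rule layout and function name are assumptions.

```javascript
// Sketch of the outgoing (Write) path: for an active W rule, copy the
// values arriving from the Web Interface into an 8-byte CAN body and
// zero-fill the remainder.
function buildOutgoingFrame(rule, incomingValues) {
  if (rule.status !== 1 || rule.rw !== 'W') return null; // active W rules only
  const body = new Array(8).fill(0); // zero-filled CAN message body
  for (let i = 0; i < rule.access.length && i < 8; i++) {
    body[i] = incomingValues[rule.access[i]] ?? 0;
  }
  return { id: rule.msg_id, body };
}

// Hypothetical drive rule with two access variables.
const writeRule = { msg_id: '0x050', rule: 'drive', access: ['speed', 'steer'], rw: 'W', status: 1 };
const frame = buildOutgoingFrame(writeRule, { speed: 20, steer: 3 });
// frame -> { id: '0x050', body: [20, 3, 0, 0, 0, 0, 0, 0] }
```

Only a frame built this way, whose ID matches a listed rule, would then be handed to the CAN-Bus, which keeps unwanted messages off the bus.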

Figure 4.12: Smart Queue CAN-Bus Read rules.

As shown in Figure 4.12 and code snippet 4, the second part of the communication file handles the incoming CAN-Bus messages, defined in the API as “Read”. The application uses almost the same algorithm, but as shown in point A, it uses R (incoming) instead of W (outgoing). In contrast to the outgoing algorithm, the application directly stores the CAN-Bus message ID from the configuration file in the incoming buffer that was created. Point B contains three steps: first, the application uses an “if” condition to compare the CAN-Bus message ID called from the configuration file with the incoming CAN-Bus message ID from the car (BlackPearl). Afterward, the application creates an array sized by the length of the “access” component from the BlackPearl API and stores the incoming messages from the car in the created incoming buffer body. The final step in point B fills the rest of the message body with zeroes. Finally, the application replaces the last array value with the CAN-Bus message ID from the configuration file and sends the raw message body in JSON format using WebSocket (WS) to guarantee real-time updates.
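The incoming (Read) path described above can be sketched in the same style: match the ID, copy the payload, zero-fill, and append the message ID as the last array value before the JSON is sent out. The function name and rule layout are assumptions.

```javascript
// Sketch of the incoming (Read) path: match the incoming CAN ID
// against the rule, copy the payload into a zero-filled buffer sized
// by the rule's access array, and place the message ID as the last
// array value before serializing to JSON for the WebSocket.
function handleIncomingFrame(rule, canId, payload) {
  if (rule.status !== 1 || rule.rw !== 'R' || canId !== rule.msg_id) return null;
  const body = new Array(rule.access.length + 1).fill(0);
  for (let i = 0; i < rule.access.length; i++) {
    body[i] = payload[i] ?? 0; // zero-fill any missing values
  }
  body[body.length - 1] = rule.msg_id; // last value identifies the message
  return JSON.stringify(body);
}

// Hypothetical front ultrasonic rule with three access variables.
const readRule = { msg_id: 1, rule: 'uss_front', access: ['s1', 's2', 's3'], rw: 'R', status: 1 };
const json = handleIncomingFrame(readRule, 1, [42, 55, 61]);
// json -> '[42,55,61,1]'
```

This produces exactly the array shape the client-side decoder in section 4.3 expects, with the message ID as the last value.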


4.6 Application Framework

Figure 4.13: Pi Display Application framework.

Figure 4.13 represents the Pi display application structure, which consists of five applications: home, car, JSON dynamic GUI, settings, and the QML custom application. This section explains how the Qt Modeling Language was implemented on the Pi Display, which is also used for the Touch User Interaction. The QML-based application consists of seven main pages: the main page, the menu page, the car information page, the ultrasonic sensor page, the dynamic Graphical User Interface (GUI) page, the addable application container, and the settings page. The first file is the main page, which includes all pages and the global variables and functions. On this page, the application calls the BlackPearl API and the Smart Queue. In addition, it calls the standard pre-installed applications and the needed URLs for the WebSockets and other network connections using the Graphical User Interface initialization JSON file. When all needed configurations are loaded, the main page displays the three standard pre-installed applications, as shown in Figure 4.14. The users of the Automotive Demonstrator can hide these applications by holding the icon of the application for one second and then pressing the close icon. In the right corner, there are two buttons: X for closing the whole application and + for adding a new single-QML-file application, whose icon name should be the same as the QML file name.


Figure 4.14: BlackPearl Display Menu.

The users of the Automotive Demonstrator can return to the main page easily by tapping or clicking the header image twice. The only pages that cannot be deleted are the Settings and Rules info pages. As shown in Figure 4.15, the Home page was inspired by the BMW i series tachometer, with some changes. This page was implemented to display the car speed, engine, camera, car errors/warnings, direction indicators, and detected traffic signs. In addition, this page is designed as a demo page with dummy values, due to the lack of the required sensors and their data. Moreover, the sign detection function uses dummy images and is not fully implemented. In the middle of the tachometer, the application shows a live stream from the BlackPearl Camera System using the WebSocket connection, and the users can choose and change the cameras, in case more than one camera is connected to the Camera Server.

Figure 4.15: BlackPearl Display Home.

As shown in figure 4.16, the car application displays the ultrasonic sensor data. The raw data and the color bars appear and disappear as the sensor reading moves through the ranges [0, 25, 50, 100], shown in the colors [red, orange, green, green]. This application uses the incoming CAN-Bus messages from the BlackPearl Server over WiFi through the BlackPearl Application Programming Interface (API), without any direct interaction with the CAN-Bus device.

Figure 4.16: BlackPearl Display Car.

The third pre-installed application is the “Dynamic Graphical User Interface”. The main challenge in this application is to give developers the ability to redraw this application using only a JSON file, without the need to create a new QML application. This was implemented by creating and compiling a number of text labels, switches, sliders, and speed gauges. We created a special algorithm with the help of a native Qt function, “createQmlObject()”, that allows us to create and modify objects at runtime. As shown in the JavaScript snippet below, the Graphical User Interface consists of ten elements: Type, Access, X, Y, H, W, V, Value, Text, and font size. First, the Type has five values: Text (t), Image (img), Slider (sl), Switch (sw), and WebSocket (WS). Second, the Access states whether the element is static (s) or live/dynamic using a WebSocket (WS). For the position of the element, the developer needs to edit the X and Y coordinates. If the element type is an image, the height (H) and the width (W) need to be set. An important element is the Visibility (V); without it, the Graphical User Interface (GUI) will not be updated properly. For all elements except image (img), the developer needs to set the text and font size. In the case of using a WebSocket as a type, the developer needs to use the WebSocket URL as the Value. Finally, the application stores the GUI information in a JSON array and sends it, using a JavaScript GET response, to the Pi Display application to update the Dynamic GUI page. Figure 4.17 illustrates the dynamic GUI application.
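The ten-field element format can be illustrated with a few sample descriptors. The exact key spellings and values in the thesis' JSON file are not reproduced here, so the names below are assumptions for illustration.

```javascript
// Illustrative Dynamic GUI element descriptors; key names are assumptions.
var guiElements = [
  { type: "t",  access: "s",  x: 20, y: 40, v: true,
    value: "", text: "Speed", fontsize: 14 },               // static text label
  { type: "sl", access: "ws", x: 20, y: 80, v: true,
    value: "ws://blackpearl.local:9003",                    // placeholder URL
    text: "RGB", fontsize: 12 },                            // live slider
  { type: "img", access: "s", x: 120, y: 40, h: 64, w: 64,  // image needs H, W
    v: true, value: "logo.png" },
  { type: "t",  access: "s",  x: 0,  y: 0,  v: false,       // hidden element
    value: "", text: "debug", fontsize: 10 }
];

// Only elements marked visible (V) are rendered; without V the GUI
// would not be updated properly.
function visibleElements(elements) {
  return elements.filter(function (e) { return e.v === true; });
}
```

On the display side, each visible descriptor would be turned into a QML object via `createQmlObject()` at runtime.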


Figure 4.17: Dynamic GUI via JSON.

Figure 4.18 and code snippet 8 show how the application is initialized with eight dummy texts, four switches, four sliders, and a speed gauge. In case there is no connection between the BlackPearl Server and the JSON application, the Graphical User Interface is reset to the initialization phase. Otherwise, the application redraws the Graphical User Interface (GUI), as shown in Figure 4.19, using the GUI information from the above-created JSON array; all these changes happen at application runtime.

Figure 4.18: Display Dynamic GUI default. Figure 4.19: Display Dynamic GUI updated.

The final pre-installed application is the Settings page, which contains the settings and configuration of three parts: live streaming, the Dynamic GUI page, and the Dynamic Menu page, in addition to the server status, which checks whether all modules and add-ons are working properly. As shown in Figure 4.20, the developer can change or test new camera streams by updating the WebSocket URL and then clicking the update button. The developer can also change the JSON server, which is responsible for redrawing or changing the GUI design on the Dynamic GUI page, by changing both the JSON server and the JSON WebSocket for the server. Furthermore, if the developer wants to change the main menu design on the Home page, this can be done by changing the URL of the JSON server. Finally, the server status turns green only if the connection between the BlackPearl Display application and the other servers is established successfully.

Figure 4.20: BlackPearl Display Settings.

Figure 4.21 shows the rules and configurations used by the Smart Queue and their data exchange direction (incoming, outgoing, or both). This page is an extension of the Settings page, which means it cannot be deleted. For now, this page displays only the information for the WiFi and CAN-Bus rules, but if the rules table is extended with more communication rules, this page will be updated automatically.

Figure 4.21: BlackPearl Display Rules.

Figure 4.22 shows the steps to add a new application to the BlackPearl Display application by clicking the plus button (+) and then the developer needs to locate the


QML application source file (not compiled) and add it to the application. The developer does not need to compile or build the QML application, because the BlackPearl Display application already has all needed libraries precompiled. For example, if the developer needs to use the Camera Streaming module, which is already implemented on the Home page, the developed QML application only has to include the name of the module in the format “Camera{}”; the BlackPearl main application then links the module call/request with the precompiled Camera Streaming application. In addition, the developer should create a single QML source file that contains all functions and classes the developed QML requires; otherwise, the added application will not function properly. Developers can create their icon by naming the icon image with the same name as the QML source file. The current function is limited to three addable applications, which the developer or end-user can delete or replace with new QML applications. The added application in figure 4.15 is an example application to test the Graphical User Interface (GUI) rendering, element and function integration, and implementation.

Additionally, the developer needs to use only QML and pure JavaScript to create the addable application. To maintain performance, developers need to avoid or reduce the number of rectangles and effects. In addition, the application's background has to be transparent; it is better to create the application without a background and to avoid images if possible.


Figure 4.22: QML Addable Function.

4.7 User Interface

As stated in the thesis concept, the User Interaction consists of four main methods: Touch (BlackPearl Display), Voice commands (CMU Sphinx), Remote, and Gamepad (BlackPearl Server).


Although section 4.3 explained the implementation of the Remote and Gamepad user interaction methods comprehensively, the Graphical User Interface responsible for the Virtual Controller and the Gamepad still needs further explanation. Section 4.5 comprehensively explained the implementation of the Touch User Interaction, also known as the BlackPearl Display application. This section explains the remaining points of the Remote User Interaction implementation and then the Voice Commands method.

4.7.1 Remote User Interaction

Figure 4.23: BlackPearl Remote UI.

As shown in figure 4.23, the Remote User Interaction (UI) consists mainly of four components: lights (C), live stream (D), Virtual Joystick (E), and RGB light sliders (F). The first component is the lights, which were implemented using static and predefined values. When the user of the Web Interface clicks or taps a light button, the JavaScript function passes the command to the backend server (NodeJS), which checks whether the CAN-Bus message id is in the Smart Queue and then sends it to the CAN-Bus if the id is in the rules table. The lights the user can control are the direction indicators, daytime, hazard, low beam, and the RGB lights. This component was implemented using code snippet 5. All CAN-Bus messages are packed into this simple JSON array and then sent to the backend server. The second main component is the Virtual Joystick, which is active by default.
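The light command path can be sketched as follows. The CAN ids and the array layout here are assumptions for illustration; only the general idea of packing a light command into a simple JSON array before sending it to the backend comes from the text.

```javascript
// Hypothetical light command builder; the CAN ids below are illustrative
// placeholders, not the thesis' actual values.
var LIGHT_IDS = { hazard: 0x30, lowBeam: 0x31, daytime: 0x32 };

function buildLightCommand(light, on) {
  var id = LIGHT_IDS[light];
  if (id === undefined) return null;          // reject unknown lights
  // Message body: payload byte first, CAN id in the last slot (assumed layout).
  return JSON.stringify([on ? 1 : 0, 0, 0, 0, 0, 0, 0, 0, id]);
}
```

The backend would then pass such an array through the Smart Queue, which forwards it to the CAN-Bus only if the id is in the rules table.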

Figure 4.24: BlackPearl Display Car

Figure 4.24 shows the car drive or control function, which consists of only two functions. The first is the mapping function, which maps the joystick x and y values. Afterward, the mapped value is sent to the BlackPearl backend and then to the ProcessingECU to make the car move. The CAN id for moving the car is 0x52, and the direction is computed as:

Directions = right + (2 * left) + (4 * backward) + (8 * forward);
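The formula can be written as a small JavaScript function. The “U” (forward) case follows the example given in the text; the remaining direction letters are assumptions for illustration.

```javascript
// Direction encoding from the formula above: each axis contributes one bit.
// "U" (forward) comes from the text; "D", "L", "R" are assumed letters.
function encodeDirection(dir) {
  var right = 0, left = 0, backward = 0, forward = 0;
  switch (dir) {
    case "U": forward = 1; break;   // forward = 1, all others 0
    case "D": backward = 1; break;
    case "L": left = 1; break;
    case "R": right = 1; break;
  }
  return right + (2 * left) + (4 * backward) + (8 * forward);
}
```

Encoding all four directions into one value this way avoids per-direction loops and if chains before the value is handed to the filter functions and the Smart Queue.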

The algorithm was designed to reduce the number of loops and if operations; the result is sent to the filter functions and then to the Smart Queue, which checks whether the operation is allowed. For example, if the joystick direction is “U”, forward is set to 1 and backward, right, and left are set to 0. This function is also reused by the gamepad and the voice commands, since they use the same CAN-Bus id and functionality. As shown in Figure 4.8, users of the Web Interface can activate three more components from the Settings side menu: Ultrasonic Sensors, Tachometer, and Camera Streaming. For each of these components, the BlackPearl server assigns a new port, in case of high load and latency on the BlackPearl server.

If the user of the Web Interface activates the Gamepad component, the Virtual Joystick is disabled and the BlackPearl receives remote control commands only from the Gamepad. The Gamepad function was written in pure JavaScript to reduce latency; the Web Interface filters these commands using the same function as the Virtual Joystick and sends them to the Smart Queue and then to the CAN-Bus. In addition, if the user of the Web Interface activates the Voice Interaction (Sparrow), the Web Interface disables both the Virtual Joystick and the Gamepad. The Voice User Interaction (Sparrow) was written using the CMUSphinx framework, which uses the C/C++ programming language, and JavaScript to send the commands to the BlackPearl Server. The user needs to execute the Sparrow application and click the microphone icon. Afterward, the C/C++ application checks whether the microphone is plugged in with the right permissions; Sparrow then listens to the microphone repeatedly until the user gives a registered command. This application was implemented using the native Qt functions that allow QML applications to use C/C++ classes and functions. The C/C++ function listens to the user's microphone using the CMUSphinx framework and then compares the input with the trained command list.

4.7.2 Voice User Interaction

Code snippet 9 shows the Qt native function that sends the recognized command to the QML file. The QML part shows the Sparrow function loop, which reads the command variable and then checks the command with a switch condition. Afterward, the function sends the command to the BlackPearl server as a JSON array using a Hypertext Transfer Protocol (HTTP) request with the POST method. In case of changes or updates in the application, the developer needs to edit the Sparrow application and then rebuild it. The trained and accepted commands are: forward, backward, northeast, northwest, southeast, southwest, start, and stop. Additionally, the German language is still under testing, but this language model will be implemented by the end of this thesis. More commands will also be trained to control the BlackPearl Display, for example, controlling the menu and lights, or opening, deleting, and closing BlackPearl Display applications.
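The recognized-command handling can be sketched in JavaScript. This mirrors the loop described above, but the endpoint path and payload shape are assumptions for illustration; the actual request is built in the QML/Qt code.

```javascript
// Hypothetical handler mirroring the Sparrow QML loop: check the recognized
// word against the trained command list, then describe the POST request that
// would carry it to the BlackPearl server as a JSON array.
var COMMANDS = ["forward", "backward", "northeast", "northwest",
                "southeast", "southwest", "start", "stop"];

function handleCommand(word) {
  var cmd = word.toLowerCase();
  if (COMMANDS.indexOf(cmd) === -1) return null;   // ignore untrained words
  // "/voice" is an assumed endpoint path, used here for illustration only.
  return { method: "POST", url: "/voice", body: JSON.stringify([cmd]) };
}
```

Unrecognized words are simply dropped, so Sparrow keeps listening until a trained command arrives.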


Figure 4.25: Sparrow Voice UI

4.7.3 Android Application

Figure 4.26 shows an Android application under development, written with QML and JavaScript. The application connects to the BlackPearl automatically if it is not connected; the camera then starts receiving the live stream after a successful connection with the BlackPearl Server. The D-Pad has the same role as the Virtual Joystick in the Web Interface. More features and a settings menu were expected to be added; however, as stated above, the application is still under development.

Figure 4.26: Android Remote Controller.


5 Results

The main goal of this thesis is to design and implement the server and then analyze and monitor it for performance, usability, and efficiency. The conceptual design from chapter 3 was successfully implemented in chapter 4. This chapter shows how the performance of the new BlackPearl platform was improved (section 5.1), how it meets the usability requirements (section 5.2), and lists the new system features (section 5.3).

5.1 Performance

In this section, the thesis explains how the system was monitored. As for QML (section 5.1.1), two types of tests were applied: static checks (Clang-Tidy and Clazy) and the QML profiler. Afterward, two monitoring tools were applied to NodeJS: Express Status Monitor [20] and the IBM performance-monitoring tool “Appmetrics” [21]. The Clang-Tidy test provides diagnostics and fixes for programming errors, like style violations or interface misuse. Clazy helps Clang understand the Qt framework semantics; it lists Qt warnings and unnecessary memory allocations and proposes solutions for the marked warnings and errors. The QML profiler identifies and displays performance issues, like latency, slow rendering of graphical user interfaces and code, and JavaScript runtime delays. This test is mainly used with QML applications that are developed with JavaScript.

5.1.1 Application Framework

As for the Pi Display application, two types of performance analysis tools were applied, both built into the Qt Creator IDE:

● Clang-Tidy and Clazy: These two checks were used to analyze the main and the Sparrow C/C++ files; both completed successfully with no errors in 10 seconds.

● QML Profiler: A native Qt performance analysis tool that monitors memory usage, animations, signals, and JavaScript processing time.


The best performance analysis tool for QML is the native QML Profiler. In this chapter, the Pi Display application was analyzed for more than 30 days to make sure it is optimized and can be delivered to a high standard. The listed analysis data was taken from a 500-second profiling range. The following table summarizes the captured data:

Category      Description                                                    Figure
Memory Usage  Shows the JavaScript memory manager.                           5.1
Binding       Shows the time for a binding and how long it takes to finish.  5.2
JavaScript    Shows the elapsed time a JavaScript function needs to finish   5.3
              executing and handling signals.
Scene Graph   Shows the time when scene graph frames are drawn and timing    5.4
              information for many stages [23].

Table 10: QML event categories.

Figure 5.1: Memory Allocation Performance.

As shown in Figure 5.1, the red chart corresponds to heap allocation and the second chart to heap usage. In heap allocation, the top QML file was the Dynamic JSON Graphical User Interface application, specifically the function code responsible for redrawing or rendering the Graphical User Interface elements. This application took ~5.8% of the memory allocation, 1.07 seconds to execute successfully, and 0.4 seconds to get the JSON data from the BlackPearl server to update the GUI. In addition, the Menu function that creates the Main Menu took 0.2% of the memory allocation and 0.6 seconds to execute and add all items from the BlackPearl Menu JSON array. Moreover, as for heap usage, the “Shadow Input control” took 0.23% and between 0.01 and 1.2 seconds to execute.


Figure 5.2: Signal-Binding Performance.

Figure 5.2 shows the time the signals needed to execute, between 20 and 476 microseconds. It also shows the time the application framework needed to connect the signals with the application elements and how long this process took to finish.

Figure 5.3: JavaScript Performance.

Figure 5.3 shows the time the JavaScript functions and code needed to execute, between 20 microseconds and 136 milliseconds. The function that took between 100 and 136 milliseconds was the link-establishing function between the Display and the BlackPearl Server.


Figure 5.4: Scene Graph.

Finally, to boost the performance of the QML application, these changes were made:
● Reduce the number of rectangles as much as possible.
● Use WebSockets instead of IO sockets, due to latency issues.
● Limit the addable QML applications to three.
● Destroy and reset closed QML applications (pages).
● Build a custom Qt framework version with only the needed libraries.

5.1.2 BlackPearl Server

The BlackPearl server was implemented using the NodeJS runtime environment as backend and Hypertext Markup Language (HTML5), Cascading Style Sheets (CSS3), jQuery, Bootstrap, and JavaScript as frontend. To test, profile, and optimize this server, these steps/rules were implemented:
● Use only minified versions of scripts and styles.
● Execute and include the JavaScript functions and files at the end of the HTML files.
● Develop algorithms that can be reused by most functions to avoid redundancy and code duplication.
● Use WebSockets instead of IO sockets to optimize the camera streaming and reduce Frames per Second (FPS) drops.
● Divide the backend applications into external nodes with different ports to reduce processing time and resource usage.


To create these performance analysis tables and charts, three tools were used: the reliable IBM tool “Appmetrics” [21], the “Express Status Monitor” [20], and the V8 profiler from the Chrome browser. Based on the exported data, and after ensuring that the right performance information was captured, the BlackPearl server was optimized.

Figure 5.5 shows the live system status, fetched using the “Express Status Monitor” [20] while the Web Interface was in use. Before optimization, the values were very high: first, the CPU usage was 10%, with jitter problems that raised the usage to 30-35% and caused heavy FPS drops. Second, the memory accumulated too much garbage data from unwanted function behavior and, in busy status, reached 200 MB at one point. Third, the heap usage and allocation were three times higher than the current values. Fourth, the one-minute load average was between 10 and 20. Fifth, some loops pushed the response time into seconds instead of milliseconds, between 5 and 8 seconds. Finally, the HTTP requests were very high and reached 21 requests per second. After implementing the steps listed above, the final optimized version of the BlackPearl server was obtained; all new data are shown in Figure 5.5.


Figure 5.5: BlackPearl server busy.

Figure 5.6 shows the status data for the BlackPearl server in idle status, which means the server sends only the “Stop” command to the BlackPearl car. The only small differences between Figure 5.5 and Figure 5.6 are that the CPU usage is 2% lower the whole time, the memory usage is 5 to 10 MB lower than in the active/busy status, and the one-minute load average is 0.6 to 1.1 lower in the idle status. On the other hand, the requests per second (rps) and the heap usage were neither stable nor constant.


Figure 5.6: BlackPearl server idle.

The second performance test was made using the IBM monitoring tool “Appmetrics” [21]. Figure 5.7 shows the CPU usage for both the system (Raspberry Pi 3B) and the process (BlackPearl server). The x-axis corresponds to the running time and the y-axis to the CPU usage in percent. The CPU usage was low when the runtime environment started, then increased to 75% and returned to 50%. This high CPU usage of the Raspberry Pi was caused by some background tasks, whose IDs were found in the task manager and then deactivated. In addition, using the CAN-Bus interface increased the CPU usage, but not above 25%. The runtime environment itself, including the backend and the frontend, did not take more than 3% of the total CPU usage. In the right chart, the x-axis corresponds to the running time and the y-axis to the memory usage in megabytes. As for memory allocation and usage, the maximum for both was 62 MB. It was found that when the user of the Web Interface activates only the needed settings, for example the ultrasonic sensors and/or the camera, the memory usage drops to almost 40 MB.


Figure 5.7: BlackPearl server CPU and Memory.

Figure 5.8 shows the maximum duration the loops took to finish. The x-axis corresponds to the running time and the y-axis to the loop execution times in milliseconds. The BlackPearl starts without sending any commands; it waits for the users of the Web Interface to activate extra settings or send the “start engine” command, and until then the server sends the “Stop” command to the BlackPearl, so it will not start moving by itself. The high timings between 160 ms and 220 ms were for sending the Virtual Joystick commands that make the car move forward and backward and then change the RGB lights to purple instead of orange. The server was tested for almost half an hour, and the highest loop time was 220 ms. The event loop timing can vary depending on the number of processes and applications the system or the server runs. For example, if the Camera server and the BlackPearl server run on one Raspberry Pi, the maximum timing increases slightly and the Camera server sends fewer frames to the BlackPearl server; nevertheless, this does not make it less efficient. Most of the time in this thesis was devoted to optimization and getting the best performance from this work. Three aspects had priority: first, the execution and finish time of the loops, as shown in Figure 5.8.

Figure 5.8: BlackPearl server loops.

The second aspect is the WebSockets, which were used in this work to receive the streaming frames from the Camera server and the ultrasonic sensors. As shown in Figure 5.9, the x-axis corresponds to the running time and the y-axis to the WebSocket requests in milliseconds. The received ultrasonic sensor data takes slightly more than 1 ms to update. When the camera setting is activated, the server needs 3.5 ms to check whether the Camera server is online (in our test, the Camera server was online), and the received camera frames take between 1 and 1.5 ms to update the live streaming container.

Figure 5.9: BlackPearl server Sockets.

The last aspect was the Hypertext Transfer Protocol (HTTP) requests, as shown in Figure 5.10. The x-axis corresponds to the running time and the y-axis to the incoming HTTP requests in milliseconds. The maximum request time was for the “Start Engine” command, which took 60 ms to finish. The second-highest timing was for the Smart Queue, which needs 10 to 17 ms to finish checking the incoming and outgoing messages and data. The other timings were for the settings and configuration requests, between 0.2 and 10 ms.

Figure 5.10: BlackPearl server HTTP requests.

5.2 Usability

Usability is very important when implementing a system with a Graphical User Interface (GUI). As mentioned in the concept and implementation chapters, one of our main goals is a usable system with these characteristics: easy to learn for new users, effective and efficient, and nearly error-free. As mentioned in section 4.5, the user can interact with the BlackPearl using the Web Interface or the Pi Display (section 4.4).

The user of the Web Interface needs to connect to the BlackPearl, call the Web Interface, and open the Settings side menu to add new configurations: Ultrasonic Sensors, Live Streaming, Speed, Gamepad, and Voice Commands (Sparrow). Afterward, the user needs to click the “Start Engine” button, in case the user wants to drive the car remotely, and then activate the “Control” tab to start controlling the car. In the case of the Pi Display, the user can click one of the icons to use the pre-installed applications: Car Dashboard (Home), Ultrasonic Sensors (Car), Dynamic JSON GUI, and Settings. To return to the main menu, the user needs to click the header twice; the active application then closes and returns to the main menu. In case the user of the Pi application wants to hide or delete a pre-installed application, the user needs to click and hold the icon for 2 seconds and then click the “X” button, and the application will disappear. If the user wants to add a new application, the “+” button needs to be clicked, and then a new QML application can be added to the main menu. This thesis mentions two scenarios in which developers can easily interact with the BlackPearl. First, the developer Bob wants to call the Camera module in his QML application and then add it to the Pi Display. The simplest way to do this is by adding this single line to the QML file, with a new id or without any special information:

Camera {}

If he wants to change the Camera server Internet Protocol (IP) address, he just needs to add this simple line:

socket.url = "new server URL"

In the second scenario, the developer Ellen is developing a web application and wants to display the ultrasonic sensors. She needs to create a WebSocket connection and then read the incoming CAN-Bus messages for the ultrasonic sensors. In this system, the corresponding ids for the ultrasonic sensors are 0x001 (front) and 0x002 (rear). Afterward, she assigns the ultrasonic values using the formula below. This simple code snippet shows how to read the ultrasonic sensor information without interacting with the CAN-Bus itself:

var ws = new WebSocket("CAN read application IP");
ws.onmessage = function (event) {
    var USS = JSON.parse(event.data);
    if (USS[8] == 1) // id 0x001 front
    {
        var sensorFrontRight  = parseInt(parseFloat(USS[2]) / 2.55);
        var sensorFrontCenter = parseInt(parseFloat(USS[1]) / 2.55);
        var sensorFrontLeft   = parseInt(parseFloat(USS[0]) / 2.55);
    }
    if (USS[8] == 2) // id 0x002 rear
    {
        var sensorRearRight  = parseInt(parseFloat(USS[2]) / 2.55);
        var sensorRearCenter = parseInt(parseFloat(USS[1]) / 2.55);
        var sensorRearLeft   = parseInt(parseFloat(USS[0]) / 2.55);
    }
};

Finally, based on the above scenarios and usage steps, this thesis achieved its usability and adaptivity goals.

5.3 Features

The main features achieved in this thesis are multiple user interaction methods (Remote, Touch, and Sparrow), remote updating of the Pi Display application, Gamepad support, the Smart Queue, and a usable API that can be easily integrated, along with a fully optimized server with low latency and high performance. Moreover, the most important feature, and the main challenge of this work, is that the system can be updated and extended by adding new applications to both the Web Interface and the Pi Display without modifying the source code or recompiling and redeploying the system.

5.4 Limitation

Due to the pandemic, fewer tests and experiments were done on the BlackPearl car. Besides, the ProcessingECU was sometimes unstable. In addition, the BlackPearl server was installed on a separate Raspberry Pi 3B single-board computer and the Camera server on another Raspberry Pi 3B; the idea of installing all servers on the same Raspberry Pi could not be tested on the car. The main limitation of this proposed solution is the number of addable QML applications, which is limited to three. As for the Web Interface, only one web application can be added. Besides, only up to three USB cameras can be added to the Camera server. The rule management database, which the Smart Queue uses for filtering, is implemented in pure JSON arrays.


Application framework (QML)  Limited to up to 3 custom addable applications.
Web Interface                Limited to 1 custom addable web application.
Camera server                Limited to up to 3 USB cameras.
Smart Queue                  Messages and signals stored as JSON arrays.

Table 11: The New System Limitations.

As shown in Figure 5.11, these end-to-end latency results (approximate and rounded values) were fetched from the Express Status Monitor [20] and the IBM performance-monitoring tool “Appmetrics” [21]. For the steering functions, which include the virtual joystick and the gamepad, the minimum value was 40 ms, the maximum 100 ms, and the average 54 ms. The voice user interaction (Sparrow) has more latency than the other user interaction methods: the minimum time was 180 ms, the maximum 500 ms, and the average 300 ms. As for the lights function, the minimum time was 40 ms, like the steering function, the maximum 113 ms, and the average 78 ms. Moreover, the settings and other functions had a minimum of 50 ms, a maximum of 150 ms, and an average of 100 ms.

Figure 5.11: End-to-End latency measurements.


6 Conclusion and Further Work

This work shows that the thesis challenges and problems were solved successfully. Based on the main requirements from table 2, the delivered solution fulfills the main thesis requirements: camera and data visualization, adaptivity and usability, easy setup and modification, adding new features and applications, and multiple user interactions. The implemented system was programmed mainly in JavaScript, running on the NodeJS runtime environment and the Qt framework. The user interaction methods were extended from one to four: virtual joystick, Gamepad, voice commands (Sparrow), and the touchscreen on the Pi Display. All BlackPearl components communicate with each other using CAN-Bus and WiFi. Furthermore, new applications and features can be added via the Web Interface and via the application framework. Besides, developers and users can add custom applications and features without redeployment. The camera server delivers low-latency, near-live video streaming. The main outcome of the thesis is the delivery of the Smart Queue and the BlackPearl Application Programming Interface (API). This platform can easily support future research applications such as image processing, autonomous driving, and remote system deployment hotspots.

This chapter contains the conclusion and summary of the thesis work in section 6.1 and recommendations with upgrades and further work ideas in section 6.2.

6.1 Conclusion

In this section, the thesis briefly summarizes the outcomes. The following points are discussed:
● Start-up
● Design
● Implementation

This thesis started with reviewing, analyzing, and attempting to optimize the previous system. However, it was found that the programming language, web server, and interaction methods were the drawbacks of the previous system: it was not optimized and suffered from high latency and a lack of usability and adaptivity. Secondly, we proposed a design for the main system and the multiple user interaction (UI) methods. In the final part, the proposed concept and all tested application components were implemented comprehensively, and the system performance was analyzed to reduce and limit latency and bugs. Altogether, the application was successfully implemented, tested, analyzed, and then deployed on the BlackPearl. The outcomes can be summed up in one sentence: a lightweight, adaptive, and usable system that comes with multiple user interaction (UI) methods. In addition, the system was designed to fulfill most of the standards in ISO 9241.

6.2 Further Work

As further work, three upgrades can be applied:

- Qt Deploy Hotspot
- Pi Display applications
- BlackPearl hardware

First, the "Qt Deploy Hotspot": if a Raspberry Pi is used, a Raspberry Pi 4 with a core (headless) OS and no desktop distribution would use memory and CPU most effectively. On it, the same custom Qt framework built for this thesis can be installed and configured with a bulk deployment application/server. The idea is that the BlackPearl connects automatically to the Qt Deploy Hotspot whenever it is near the hotspot and checks for updates by comparing application versions. The Hotspot verifies that the MAC address of the BlackPearl's wireless device is registered in its Cars table. On a successful connection with a new update available, the Hotspot sends a notification to the connected BlackPearl and asks for permission; if the user of the BlackPearl or the developer accepts the update, the BlackPearl display turns black and the update starts. Figure 6.1 shows the general concept of the Qt remote deployment server (Hotspot). A transfer interruption has no critical effect on the installed version on the BlackPearl system, because the BlackPearl downloads the newer version first, checks that all files were downloaded, and compares the md5 checksum to guard against fake update files. The BlackPearl then shows a notification again, asking the user or the developer to confirm the installation. Transfer and installation together should not exceed five to ten minutes.


Figure 6.1: Qt Deploy Hotspot.

Second, the number of addable Pi display applications should be extended from three to six, and the display should remember the added or installed applications. This can be achieved with a simple JavaScript Object Notation (JSON) file and JavaScript.

Third, upgrading the Raspberry Pi 3 to a Raspberry Pi 4 should be considered to boost performance and allow the developer to add more than six applications, including complex applications such as those needed for autonomous driving.

As an extra upgrade, a PC steering wheel could be used to control the BlackPearl in addition to the other remote controllers. A new feature in the BlackPearl Server could import a CAN database (DBC) file directly into the Rules Table, after comparing the CAN-Bus IDs to avoid duplications; this can be achieved with an upload field handled by JavaScript or a C++ application in the backend. Additionally, CMUSphinx could be replaced with the newer Vosk library, developed by the same developers but with more supported languages and features. Finally, a Content Management System (CMS) could be implemented to make the web interface elements easy to edit without updating the HTML file, and the JSON QML dynamic GUI application could be included in this system so that the Graphical User Interface (GUI) can be created with a drag-and-drop GUI builder.

Taking everything into consideration, this thesis successfully implemented a system that can be updated and changed without recompiling or redeploying the application to the BlackPearl, while providing multiple adaptive user interaction methods as well as data and camera visualization.


A. Appendix A – Custom Qt framework

The steps for compiling and building Qt for the Raspberry Pi were taken from a step-by-step tutorial in German, which covered Qt 5.12.8. This thesis followed its basics and updated the commands to compile the then-latest Qt 5.15.0; to the author's knowledge, this thesis is the first to compile this Qt version for the Raspberry Pi 3B and Raspberry Pi 4.

Preparing the Raspberry Pi: first, the Debian source package (deb-src) entry needs to be added to the Raspberry Pi package source list file "/etc/apt/sources.list":

deb-src http://raspbian.raspberrypi.org/raspbian/ buster main contrib non-free rpi

Then update the Raspberry Pi:

sudo apt-get update
sudo apt-get dist-upgrade
sudo rpi-update
sudo reboot

Installing the Qt5 development packages and libraries:

sudo apt-get build-dep qt5-qmake
sudo apt-get build-dep libqt5gui5
sudo apt-get build-dep libqt5webengine-data
sudo apt-get build-dep libqt5webkit5
sudo apt-get install libudev-dev libinput-dev libts-dev libxcb-xinerama0-dev libxcb-xinerama0 gdbserver

Creating the missing symlinks, to allow building applications for the Raspberry Pi and deploying them remotely:

sudo mkdir /usr/local/RaspberryQt
sudo chown -R pi:pi /usr/local/RaspberryQt
ln -s /opt/vc/lib/libEGL.so /usr/lib/arm-linux-gnueabihf/libEGL.so.1.0.0
ln -s /opt/vc/lib/libGLESv2.so /usr/lib/arm-linux-gnueabihf/libGLESv2.so.2.0.0
ln -s /opt/vc/lib/libEGL.so /opt/vc/lib/libEGL.so.1
ln -s /opt/vc/lib/libGLESv2.so /opt/vc/lib/libGLESv2.so.2

Development environment (virtual machine): the host system was installed in a virtual machine using VirtualBox with Ubuntu 20.04 Desktop as the operating system (OS). After successfully installing the OS, an update is needed, followed by the installation of the libraries and packages required to build the custom Qt framework:


sudo add-apt-repository "deb http://security.ubuntu.com/ubuntu xenial-security main"
sudo apt update
sudo apt-get install -y flex libjasper-dev git cmake build-essential pkg-config libjpeg-dev libtiff5-dev libpng-dev libavcodec-dev libavformat-dev libswscale-dev libv4l-dev libxvidcore-dev libx264-dev libatlas-base-dev gfortran wget unzip libz-dev zlib1g-dev gcc g++ git bison python gperf gdb-multiarch qt5-default texinfo make python3-dev

Then create folders with full write permissions to copy the Raspberry cross-build files:

sudo mkdir /opt/RaspberryQt
sudo mkdir /opt/RaspberryQt/build
sudo mkdir /opt/RaspberryQt/sysroot /opt/RaspberryQt/sysroot/usr /opt/RaspberryQt/sysroot/opt
sudo chown -R 1000:1000 /opt/RaspberryQt

Afterwards, create an SSH connection between the Raspberry Pi and the host system:

ssh-keygen -t rsa -C root@<BlackPearl> -P "" -f ~/.ssh/rpi_root_id_rsa
ssh-keygen -t rsa -C pi@<BlackPearl> -P "" -f ~/.ssh/rpi_pi_id_rsa
cat ~/.ssh/rpi_root_id_rsa.pub | ssh root@<BlackPearl> 'cat >> .ssh/authorized_keys && chmod 640 .ssh/authorized_keys'
cat ~/.ssh/rpi_pi_id_rsa.pub | ssh pi@<BlackPearl> 'cat >> .ssh/authorized_keys && chmod 640 .ssh/authorized_keys'

After creating a secure connection between the host and the Raspberry Pi, the Qt source code needs to be obtained:

cd /opt/RaspberryQt
git clone https://github.com/raspberrypi/tools
wget https://download.qt.io/archive/qt/5.15/5.15.0/single/qt-everywhere-src-5.15.0.tar.xz
wget https://raw.githubusercontent.com/riscv/riscv-poky/master/scripts/sysroot-relativelinks.py

The terminal commands above fetched the cross-compiler files for the Raspberry Pi and the source code of the Qt framework; the downloaded script will later create the symlinks between the source code and the Pi sysroot files automatically. Now the Qt source files need to be unpacked and the cross-compiler spec "arm-linux-gnueabihf" prepared:

tar xf qt-everywhere-src-5.15.0.tar.xz
cp -R qt-everywhere-src-5.15.0/qtbase/mkspecs/linux-arm-gnueabi-g++ qt-everywhere-src-5.15.0/qtbase/mkspecs/linux-arm-gnueabihf-g++
sed -i -e 's/arm-linux-gnueabi-/arm-linux-gnueabihf-/g' qt-everywhere-src-5.15.0/qtbase/mkspecs/linux-arm-gnueabihf-g++/qmake.conf

Now, get the needed files from the Raspberry Pi using the rsync command:

rsync -avz root@<BlackPearl>:/lib sysroot
rsync -avz root@<BlackPearl>:/usr/include sysroot/usr
rsync -avz root@<BlackPearl>:/usr/lib sysroot/usr
rsync -avz root@<BlackPearl>:/opt/vc sysroot/opt


We are ready to start making and compiling the framework, but before we start, the Python script must be executed to create the symlinks into the sysroot downloaded from the Raspberry Pi:

/opt/RaspberryQt/sysroot-relativelinks.py sysroot

Finally, we can start building our own Qt framework for the Raspberry Pi:

$ cd /opt/RaspberryQt/build

For the Raspberry Pi 3:

../qt-everywhere-src-5.15.0/configure -opengl es2 -device linux-rasp-pi3-g++ -device-option CROSS_COMPILE=/opt/RaspberryQt/tools/arm-bcm2708/gcc-linaro-arm-linux-gnueabihf-raspbian-x64/bin/arm-linux-gnueabihf- -sysroot /opt/RaspberryQt/sysroot -prefix /usr/local/RaspberryQt -opensource -confirm-license -no-gbm -skip qtscript -nomake tests -nomake examples -make libs -pkg-config -no-use-gold-linker -v

For the Raspberry Pi 4:

../qt-everywhere-src-5.15.0/configure -opengl es3 -device linux-rasp-pi4-v3d-g++ -device-option CROSS_COMPILE=/opt/Raspberry4Qt/tools/rpi-gcc-8.3.0/bin/arm-linux-gnueabihf- -sysroot /opt/Raspberry4Qt/sysroot -prefix /usr/local/Raspberry4Qt -opensource -confirm-license -no-gbm -skip qtscript -nomake tests -nomake examples -make libs -pkg-config -no-use-gold-linker -v

After setting up the configuration, we need to compile and install the Qt framework:

$ make -j4
$ make install

Installing the framework took four to six hours. Afterwards, we need to update the Raspberry Pi with the compiled Qt libraries by synchronizing them with this command:

$ cd /opt/RaspberryQt
$ rsync -avz sysroot/usr/local/RaspberryQt root@<BlackPearl>:/usr/local

Now we need to install the GNU Debugger (GDB):

sudo apt-get update
sudo apt-get upgrade
sudo apt-get install -y texinfo gcc g++ make python3-dev wget
wget https://ftp.gnu.org/gnu/gdb/gdb-9.1.tar.xz
tar xf gdb-9.1.tar.xz
mkdir gdb-9.1/build
export PYTHON=python3
export PYTHON_LIBDIR=$("${PYTHON}" -c "import sysconfig; print(sysconfig.get_config_var('LIBDIR'))")
cd gdb-9.1/build
../configure --prefix=/home/pi/GDB/bin --target=arm-linux-gnueabihf --with-python=${PYTHON} LDFLAGS="-L${PYTHON_LIBDIR}"

make -j4
make -C gdb install

Finally, we finished installing our own custom Qt framework (version 5.15.0, the latest at the time). If new Qt modules are needed, they must be compiled and then installed by executing these commands (here for the Qt Charts module):

wget http://download.qt.io/official_releases/qt/5.15/5.15.0/submodules/qtcharts-everywhere-src-5.15.0.tar.xz
tar xf qtcharts-everywhere-src-5.15.0.tar.xz
cd qtcharts-everywhere-src-5.15.0
/opt/RaspberryQt/sysroot/usr/local/RaspberryQt/bin/qmake
make -j4
make install

Then we have to update the Raspberry Pi by resynchronising the "RaspberryQt" folder:

rsync -avz /opt/RaspberryQt/sysroot/usr/local/RaspberryQt root@<BlackPearl>:/usr/local


B. Appendix B – Remote Deploy

The remote application deploy can be set up easily after creating the SSH link from Appendix A. First, in case Qt Creator is not installed yet, it can be installed with these commands:

$ cd ~/Downloads
$ sudo chmod +x qt-unified-linux-x64-3.2.2-online.run
$ ./qt-unified-linux-x64-3.2.2-online.run

Second, open Options -> Devices -> type "Generic Linux"; add the Raspberry Pi IP address and the login information as shown in the screenshot below.

The last step is to create a Qt Kit, which includes the location of the compiler and the debuggers for the Raspberry Pi.

Pi GCC: /opt/RaspberryQt/tools/arm-bcm2708/gcc-linaro-arm-linux-gnueabihf-raspbian-x64/bin/arm-linux-gnueabihf-gcc
Pi G++: /opt/RaspberryQt/tools/arm-bcm2708/gcc-linaro-arm-linux-gnueabihf-raspbian-x64/bin/arm-linux-gnueabihf-g++
Pi GDB: /home/pi/Toolchain/RaspberryQt/GDB/bin/arm-linux-gnueabihf-gdb


The Raspberry Pi Kit should look like the screenshot below.

At this stage, the Raspberry Pi deploy option is available for use.


C. Appendix C – Code Snippets

A. BlackPearl Server

"can": [ "wifi": [ { { "Id": "0x001", "name": "Camera", "name": "UltraF", "url": "ws://localhost:8000", "desc": "Ultrasonic Sensors Front", "desc": "Streaming", "access": [ "access": "image", "0x001", "R/W": "W", "sensors[0]", "status": 1 "sensors[1]", } "sensors[2]", ] "sensors[3]" ], "R/W": "R", "status": "1" } ] Code snippet 1: JSON Smart Queue Rules.

var wsensor = new WebSocket(CanRead);              // A
wsensor.onmessage = function (event) {
  SensorsStatus = true;                            // B
  var sensors = JSON.parse(event.data);            // C

  if (sensors[8] == 1)                             // D
  {
    var USS_Front_Right  = parseInt(parseFloat(sensors[2]) / 2.55);
    var USS_Front_Center = parseInt(parseFloat(sensors[1]) / 2.55);
    var USS_Front_Left   = parseInt(parseFloat(sensors[0]) / 2.55);
  }
}

Code snippet 2: Ultrasonic Sensors.
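The division by 2.55 in snippet 2 maps a raw CAN byte (0 to 255) onto a 0 to 100 percent distance scale. A small helper making that conversion explicit; the function name is an illustrative assumption:

```javascript
// Convert a raw sensor byte (0-255) to a percentage (0-100), mirroring
// the parseInt/parseFloat pattern of snippet 2. Name is hypothetical.
function rawToPercent(rawByte) {
  // parseInt truncates the fractional part, exactly like the snippet
  return parseInt(parseFloat(rawByte) / 2.55, 10);
}
```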

var CANlen = Object.keys(Config.can).length;       // A
for (var i = 0; i < CANlen; i++)                   // B
{


  if (Config.can[i]['status'] == 1 && Config.can[i]['R/W'] == "W")   // C
  {
    var out = {};
    var Can_Access = [];
    var accessLen = Object.keys(Config.can[i]['access']).length;
    for (var j = 0; j < accessLen; j++)
    {
      Can_Access[j] = "request.body." + Config.can[i]['access'][j];
    }
    if (accessLen < 8)
    {
      for (var x = j; x < 8; x++)
      {
        Can_Access[x] = 0;
      }
    }
    // D
    out.id = Config.can[i]['Id'];
    out.data = Buffer.from([eval(Can_Access[0]), eval(Can_Access[1]),
      eval(Can_Access[2]), eval(Can_Access[3]), eval(Can_Access[4]),
      eval(Can_Access[5]), eval(Can_Access[6]), eval(Can_Access[7])], 'hex');
    if (out.id == request.body.CAN_id)                               // E
    {
      channel.send(out);
      console.log(out);
    }
  }
}

Code snippet 3: Smart Queue Write CAN-Bus Rules.

if (Config.can[i]['status'] == 1 && Config.can[i]['R/W'] == "R")     // A
{
  incoming.id = Config.can[i]['Id'];
  if (msg.id == incoming.id)                                         // B
  {
    var Can_Access = [];
    var accessLen = Object.keys(Config.can[i]['access']).length;
    for (var j = 0; j < accessLen; j++)
    {
      Can_Access[j] = msg.data[j];
    }
    if (accessLen < 8)
    {
      for (var x = j; x < 8; x++)
      {
        Can_Access[x] = 0;
      }
    }
    // C
    Can_Access[8] = incoming.id;
    wss.send(JSON.stringify(Can_Access));

Code snippet 4: Smart Queue Read CAN-Bus Rules.

var lights = left_indicator + (2 * right_indicator) + (4 * low_beam) + (8 * dlr);
JSON.stringify({"CAN_id": "0x051", "lights": lights});
JSON.stringify({"CAN_id": "0x051", "red": red, "green": green, "blue": blue});

Code snippet 5: Lights Smart Queue Rule.

if (joystick1.xValue >= -0.5 && joystick1.xValue <= 0.5)
{
  joystick1.Dir = "C";
}
if (joystick1.xValue > 0.5)
{
  joystick1.Dir = "R";
}
if (joystick1.xValue < -0.5)
{
  joystick1.Dir = "L";
}
if (joystick1.yValue < -0.3)
{
  if (joystick1.Dir === "C")
  {
    joystick1.Dir = "U";
  }
  else
  {
    joystick1.Dir += "U";
  }
}
if (joystick1.yValue > 0.3)
{
  if (joystick1.Dir === "C")
  {
    joystick1.Dir = "D";
  }
  else
  {
    joystick1.Dir += "D";
  }
}

var Directions = right + (2 * left) + (4 * backward) + (8 * forward);
var data = JSON.stringify({"CAN_id": "0x052", "direction": Directions});

Code snippet 6: Joystick function.
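The joystick function packs four direction flags into one value for CAN id 0x052: bit 0 = right, bit 1 = left, bit 2 = backward, bit 3 = forward. A pair of encode/decode helpers makes this bit layout explicit; the function names are illustrative assumptions:

```javascript
// Encode/decode the direction byte used for CAN id 0x052. Names are
// hypothetical; the bit weights match snippet 6.
function encodeDirections({ right = 0, left = 0, backward = 0, forward = 0 }) {
  return right + (2 * left) + (4 * backward) + (8 * forward);
}

function decodeDirections(value) {
  return {
    right:    value & 1 ? 1 : 0,
    left:     value & 2 ? 1 : 0,
    backward: value & 4 ? 1 : 0,
    forward:  value & 8 ? 1 : 0
  };
}
```

A receiver on the CAN side can thus recover all four flags from the single transmitted value.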

B. Camera Server

const FPS = 30;                                    // A
const wCap = new cv.VideoCapture(0);               // B
wCap.set(cv.CAP_PROP_FRAME_WIDTH, 300);            // C
wCap.set(cv.CAP_PROP_FRAME_HEIGHT, 300);
const WebSocket = require('ws');
const wss = new WebSocket.Server({ port: port });
wss.on('connection', function connection(ws) {
  ws.on('message', function incoming(message) {
    console.log('received: %s', message);
  });
  setInterval(() => {                              // D
    const frame = wCap.read();
    const img = cv.imencode('.jpg', frame).toString('base64');
    ws.send(img);
  }, 1000 / FPS);
});

Code snippet 7: Live Camera Streaming.

C. QML Snippet

app.get('/page', (req, res) => {
  res.send([{
    "Type": "t",
    "V": "true",
    "Access": "s",
    "Value": true,
    "X": 55,
    "Y": 155,
    "H": 0,
    "W": 0,
    "Text": "",
    "FontSize": 0
  }, {
    "Type": "sl",
    "V": "true",
    "Access": "s",
    "Value": 5,
    "X": 55,
    "Y": 355,
    "H": 0,
    "W": 0,
    "Text": "elooo",
    "FontSize": 0
  }, {
    "Type": "sw",
    "V": "true",
    "Access": "s",
    "Value": 1,
    "X": 155,
    "Y": 155,
    "H": 0,
    "W": 0,
    "Text": "",
    "FontSize": 0
  }]);
});

Code snippet 8: JSON QML Dynamic GUI.

C++:

Engine.rootContext()->setContextProperty("commands", &command);

QML:

Timer {
  id: timer
  interval: 100
  running: true
  repeat: true
  onTriggered: sparrow(command)
}

function sparrow(direction) {
  switch (direction) {
  }
  var Dir = right + (2 * left) + (4 * backward) + (8 * forward);
  var data = JSON.stringify({"CAN_id": "0x052", "direction": Dir});
  xhr.send(data);
}

Code snippet 9: C++ class integration in QML.


D. Appendix D – QR Codes

QR codes: Virtual Joystick, Voice Command, VW Dashboard, GamePad.


Selbstständigkeitserklärung (Declaration of Independent Work)


This report - except logo Chemnitz University of Technology - is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made. The images or other third party material in this report are included in the report’s Creative Commons license, unless indicated otherwise in a credit line to the material. If material is not included in the report’s Creative Commons license and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this license, visit http://creativecommons.org/licenses/by/4.0/.

Chemnitzer Informatik-Berichte: the following reports have appeared in the Chemnitzer Informatik-Berichte series:

CSR-14-01 International Summerschool Computer Science 2014, Proceedings of Summerschool 7.7.-13.7.2014, Juni 2014, Chemnitz
CSR-15-01 Arne Berger, Maximilian Eibl, Stephan Heinich, Robert Herms, Stefan Kahl, Jens Kürsten, Albrecht Kurze, Robert Manthey, Markus Rickert, Marc Ritter, ValidAX - Validierung der Frameworks AMOPA und XTRIEVAL, Januar 2015, Chemnitz
CSR-15-02 Maximilian Speicher, What is Usability? A Characterization based on ISO 9241-11 and ISO/IEC 25010, Januar 2015, Chemnitz
CSR-16-01 Maxim Bakaev, Martin Gaedke, Sebastian Heil, Kansei Engineering Experimental Research with University Websites, April 2016, Chemnitz
CSR-18-01 Jan-Philipp Heinrich, Carsten Neise, Andreas Müller, Ähnlichkeitsmessung von ausgewählten Datentypen in Datenbanksystemen zur Berechnung des Grades der Anonymisierung, Februar 2018, Chemnitz
CSR-18-02 Liang Zhang, Guido Brunnett, Efficient Dynamic Alignment of Motions, Februar 2018, Chemnitz
CSR-18-03 Guido Brunnett, Maximilian Eibl, Fred Hamker, Peter Ohler, Peter Protzel, StayCentered - Methodenbasis eines Assistenzsystems für Centerlotsen (MACeLot) Schlussbericht, November 2018, Chemnitz
CSR-19-01 Johannes Dörfelt, Wolfram Hardt, Christian Rosjat, Intelligente Gebäudeklimatisierung auf Basis eines Sensornetzwerks und künstlicher Intelligenz, Februar 2019, Chemnitz
CSR-19-02 Martin Springwald, Wolfram Hardt, Entwicklung einer RAD-Plattform im Kontext verteilter Systeme, März 2019, Chemnitz

CSR-19-04 Johannes Götze, René Schmidt, Wolfram Hardt, Hardwarebeschleunigung von Matrixberechnungen auf Basis von GPU Verarbeitung, März 2019, Chemnitz
CSR-19-05 Vincent Kühn, Reda Harradi, Wolfram Hardt, Expert System for Adaptive Flight Missions, Juni 2019, Chemnitz
CSR-19-06 Samer Salamah, Guido Brunnett, Christian Mitschke, Tobias Heß, Synthesizing gait motions from spline-based progression functions of controlled shape, Juni 2019, Chemnitz
CSR-19-07 Martin Eisoldt, Carsten Neise, Andreas Müller, Analyse verschiedener Distanzmetriken zur Messung des Anonymisierungsgrades θ, Juni 2019, Chemnitz
CSR-19-08 André Langer, Valentin Siegert, Martin Gaedke, Informationsverwertung basierend auf qualitätsoptimierten semistrukturierten Datenbeständen im Wachstumskern "LEDS", Juli 2019, Chemnitz
CSR-20-01 Danny Kowerko, Chemnitzer Linux-Tage 2019 - LocalizeIT Workshop, Januar 2020, Chemnitz
CSR-20-02 Robert Manthey, Tom Kretzschmar, Falk Schmidsberger, Hussein Hussein, René Erler, Tobias Schlosser, Frederik Beuth, Marcel Heinz, Thomas Kronfeld, Maximilian Eibl, Marc Ritter, Danny Kowerko, Schlussbericht zum InnoProfile-Transfer Begleitprojekt localizeIT, Januar 2020, Chemnitz
CSR-20-03 Jörn Roth, Reda Harradi, Wolfram Hardt, Indoor Lokalisierung auf Basis von Ultra Wideband Modulen zur Emulation von GPS Positionen, Februar 2020, Chemnitz
CSR-20-04 Christian Graf, Reda Harradi, René Schmidt, Wolfram Hardt, Automatisierte Kameraausrichtung für Micro Air Vehicle basierte Inspektion, März 2020, Chemnitz
CSR-20-05 Julius Lochbaum, René Bergelt, Time Pech, Wolfram Hardt, Erzeugung von Testdaten für automatisiertes Fahren auf Basis eines Open Source Fahrsimulators, März 2020, Chemnitz
CSR-20-06 Narankhuu Natsagdorj, Uranchimeg Tudevdagva, Jiantao Zhou, Logical Structure of Structure Oriented Evaluation for E-Learning, April 2020, Chemnitz
CSR-20-07 Batbayar Battseren, Reda Harradi, Fatih Kilic, Wolfram Hardt, Automated Power Line Inspection, September 2020, Chemnitz
CSR-21-01 Marco Stephan, Batbayar Battseren, Wolfram Hardt, Object Localization in 3D Space for UAV Flight using a Monocular Camera, März 2021, Chemnitz
CSR-21-02 Hasan Aljzaere, Owes Khan, Wolfram Hardt, Adaptive User Interface for Automotive Demonstrator, Juli 2021, Chemnitz

ISSN 0947-5125

Publisher: Fakultät für Informatik, TU Chemnitz, Straße der Nationen 62, D-09111 Chemnitz