DEGREE PROJECT IN COMPUTER SCIENCE AND ENGINEERING, SECOND CYCLE, 30 CREDITS STOCKHOLM, SWEDEN 2020

Design and development of a graphical user interface for the monitoring process of an automated guided vehicle fleet

JOHANNA PAUL

KTH ROYAL INSTITUTE OF TECHNOLOGY SCHOOL OF ELECTRICAL ENGINEERING AND COMPUTER SCIENCE

ABSTRACT

Many different autonomously driving mobile robots are used for industrial transports of materials or goods in the context of internal logistic processes because of different use cases. The problem for the users that need to monitor the robots is that each manufacturer provides its own graphical user interface (GUI) with different operating modes and visual designs, which requires different trainings and constant switching between software. Therefore, this paper shows the design and development process of a graphical user interface in the form of a web application for the monitoring process of a fleet of automated guided vehicles from different manufacturers and answers the following question: "What are the main criteria when designing a graphical user interface with high usability for the monitoring process of manufacturer-independent automated guided vehicle fleets?” To answer the question, existing graphical user interfaces from different manufacturers were analyzed and interviews with developers and end-users of the GUIs were conducted. Requirements were then derived, on whose basis sketching, wireframing and high-fidelity prototyping have been performed. Usability testing and a heuristic evaluation were chosen to improve the application and its usability continually. As a result, the following six main criteria could be derived that summarize the most essential points to consider when designing such a GUI: administrability, adaptiveness, observability, analyzability, robot and job awareness, and intervention.

SAMMANFATTNING

Many different autonomously driving mobile robots are used for the industrial transport of materials or goods as part of internal logistics processes, owing to the variety of use cases. The problem for the users who need to monitor the robots is that each manufacturer provides its own graphical user interface (GUI) with different operating modes and visual designs, which requires different training and constant switching between software. This thesis therefore presents the design and development process of a graphical user interface, in the form of a web application, for the monitoring process of a fleet of automated guided vehicles from different manufacturers, and answers the following question: "What are the main criteria when designing a graphical user interface with high usability for the monitoring process of manufacturer-independent automated guided vehicle fleets?" To answer the question, existing graphical user interfaces from different manufacturers were analyzed, and interviews with developers and end-users of the GUIs were conducted. Requirements were then derived, on the basis of which sketching, wireframing, and high-fidelity prototyping were carried out. Usability testing and a heuristic evaluation were chosen to continually improve the application and its usability. As a result, the following six main criteria could be derived that summarize the most essential points to consider when designing such a GUI: administrability, adaptiveness, observability, analyzability, robot and job awareness, and intervention.

Design and development of a graphical user interface for the monitoring process of an automated guided vehicle fleet

Johanna Paul School of Electrical Engineering and Computer Science KTH Royal Institute of Technology Stockholm, Sweden [email protected]

AUTHOR KEYWORDS
Automated Guided Vehicle (AGV); Autonomous Mobile Robots (AMR); User Experience Design; Interaction Design; Usability; User Interface Design; Graphical User Interface

INTRODUCTION
Advanced robots, like Automated Guided Vehicles (AGVs), are, among others, one of the leading technological drivers of Industry 4.0, the fourth industrial revolution [1], and their market volume is expected to increase by almost 550% from 2018 to 2025 [2]. Despite the constant developments, most installations today are custom-made, application- and manufacturer-specific, and not standard or mass products. Besides, the use of AGVs requires early hall layout planning and time-consuming commissioning, and navigation solutions are only usable in well-structured and mainly static environments. Since there are many specific tasks to be performed in a production environment, different vehicles from different manufacturers are used, containing different master control systems that control the individual vehicles. However, there is no exchange or interaction between the systems, and the AGVs have little autonomy themselves [3].

The Fraunhofer Institute for Manufacturing Engineering and Automation IPA is currently developing a software solution called NODE. It is a "Plug & Play" solution for the orchestration and autonomous navigation of mobile robot fleets. Robots equipped with NODE can be used for many applications, and vehicles from different manufacturers can be integrated. By connecting the vehicles with cloud/edge servers and with each other, the exchange of data is enabled, and through the use of cooperative navigation and machine learning, this data is used to realize highly autonomous, self-optimizing fleets (NODE Cooperation). NODE Orchestration extends conventional fleet management systems by unifying the interface to the Manufacturing Execution System (MES) to accept orders and distribute them to the fleet. This means that vehicles that are not NODE-Edge capable can also be integrated into the overall system. Due to the high degree of autonomy and fast commissioning, flexible and efficient use in various dynamic manufacturing environments is possible without additional infrastructure [3]-[5].

Research Problem and Aim
NODE has no graphical user interface (GUI) so far, but one is essential for the communication between the user and the robots in order to interact with and monitor them. Such interfaces are already available from various AGV manufacturers. After Fraunhofer IPA

had examined and compared several, a lack of good user interface design was identified. However, this is of great importance, as it makes a product user-friendly and easy to understand and thus leads to a higher user acceptance [6]. Another factor is that the examined interfaces were only designed for robots of the own manufacturer and not for several robots from different manufacturers. Therefore, the overall aim is to design and develop a GUI for the monitoring process in the form of a web application with high usability, which makes it possible to manage and interact with a fleet of manufacturer-independent AGVs, primarily in the field of intralogistics. Thereby, maximum user acceptance, productivity, and satisfaction should be achieved [7]. Usability is a criterion for quality and guarantees that those goals are effectively, efficiently, and satisfactorily achieved [8]. The GUI should make the overall system more accessible to AGV manufacturers, system integrators, and, above all, to end-users who work closely together with the robots.

Based on the situation, the following research question serves as a guideline for the work and will be answered in the course of the research: "What are the main criteria when designing a graphical user interface with high usability for the monitoring process of manufacturer-independent automated guided vehicle fleets?"

Delimitations
Although the software modules are developed for both public and industrial sectors, the focus in this thesis is on industrial transport. Only AGVs that transport material or goods within a factory in the context of internal logistic processes are covered. In addition, as a GUI with all processes included would go beyond the scope and time limitations of the paper, only one – the monitoring process – is considered, as it is one of the most repeated. Other processes are commissioning, mapping, creation of missions, and administrative matters. Interaction designers have already designed the mapping process. Thus, a style guide has already been worked out, which is used in this paper as well. Besides, monitoring is carried out by AGV manufacturers, system integrators, and end-users for different purposes. However, only the needs of the end-users, defined as people working in industrial enterprises professionally with AGV monitoring, will be considered, as they have the least technical or computer science background but do operational monitoring and system optimization repetitively over a long period. Therefore, a GUI is most essential for them. Operational monitoring means that information on operating status is transmitted, and error handling is initiated. During system optimization, potential for improvement is uncovered and, if necessary, exploited.

BACKGROUND
This chapter presents definitions and theoretical principles of the terms AGV and User Experience, with its subsets Interaction Design, Usability, and User Interface Design, to create a sound basis for the thesis [9], [10].

Automated Guided Vehicles
When Automated Guided Vehicle Systems (AGVSs) were invented in 1953, the demand came mainly from the automotive industry to decrease production costs through automation [11]. Since then, they have been employed in almost all areas of industry and production and are used primarily for material flow connections [11], [12]. Today, an AGVS is a "fully automated industrial transport system" [13], and they "consist of one or several Automated Guided Vehicles (AGVs), a guidance control system, devices for determination of location and position sensing, devices for data transmission, as well as infrastructure and peripheral units" [14]. AGVs can be described as driverless transport systems used to transport materials and goods to make logistics more efficient [2], [13].

AGVs have been greatly improved over time: from expensive, inflexible vehicles with little autonomy that had to follow a fixed path with additional infrastructure such as visual lines, markings, wires or tapes, to less expensive, flexible vehicles with high autonomy that follow an open path or free navigation with laser, vision- or geo-guidance and are able to understand their environment in real time, allowing them to operate in variable and dynamic environments around people [2], [11]-[13], [15]. In addition, AGVs today make production planning more flexible, require less personnel, and have become more affordable over the years due to fewer robot components necessary and more competition [2], [13]. Due to these improvements, there are, at the same time, many researchers calling those modern AGVs Autonomous Mobile Robots (AMRs) and classifying both as mobile robots with different autonomy [16]-[18]. In this paper, the vehicles mentioned always refer to mobile robots with high autonomy.

User Experience
User Experience (UX) Design is very personal and perceived as good if it meets the individual need of the user in the context it is used, if it offers relevant and meaningful experiences for the user, and if it is simple and a joy to own and use [9], [19]. The Interaction Design Foundation explains that UX Design considers not only the consumption of products, but also the

entire cycle of acquiring, owning, and troubleshooting. Aspects of design, usability, branding, and function are included as well. Interaction Design, Usability, and User Interface Design are subsets of it. The goal of Interaction Design is to make interaction possible between people, systems, and machines [10]. People should be supported in their everyday lives by achieving their goals in the best possible way through the created interactive product [20], [21]. Usability is defined by the International Organization for Standardization as "the extent to which a system, product or service can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use" [22]. "The term User Interface refers to the methods and devices that are used to accommodate interaction between machines and the human beings, users, who use them" [6]. User Interface (UI) Design focuses on the style and look in order to come up with pleasurable and easy-to-use designs. One can distinguish between different interface types, such as a command-line, menu-driven, natural language, gesture, or graphical user interface [23], [24].

STATE OF THE ART
To the best of the author's knowledge, no research or GUI could be found equivalent to this study. However, a state-of-the-art analysis and a literature research were conducted, and the findings were included as supplementary information for the GUI developed in this study. For the state-of-the-art analysis, the GUIs from the manufacturers Mobile Industrial Robots (MiR) and KUKA were examined regarding their structure, functionalities, information hierarchy, and interface design concerning monitoring (for an overview of the GUIs, see Appendix A). MiR was chosen as the researcher had full physical access to the AGV and the GUI; a reference guide was used as a supplement [25]. KUKA provides valuable and detailed information about their GUI which is freely accessible.

Mobile Industrial Robots
The responsive Robot Interface 2.0 can be accessed through a browser. Figure 1 shows the menu structure. The GUI is only designed for one MiR robot and not for a fleet.

Figure 1: Main menu and submenu items.

Information about the current status of the robot, such as operating status and battery percentage, is always visible in the top bar. It contains a customizable dashboard consisting of different widgets. If the standard dashboard is used, a summary of the robot is visible directly on the start page, providing information such as the name, model, battery status, and operating time.

The analytics submenu shows bar graphs about the distance traveled by the robot and the accumulated distance for the chosen period. The second submenu is called "System log". Events that are logged by the operating system components are shown in a table. Besides the time, the system status, which is indicated by a color, the module concerned, and a short message, which is not always understandable to the user, are displayed. The primary purpose of this table is to allow system supporters to troubleshoot the system. The submenu "Error logs" contains all detected system errors (see Figure 2).

Figure 2: Error logs with description, affected module, time, and download or delete possibility.

In the submenu "Hardware health", the status of the hardware components is indicated through color-coded dots. "Safety system" indicates in written and color-coded form whether the scanners are "free" or not, and the status of the emergency stop. The last submenu point, "Mission log", lists all missions that have been performed by the robot. For each mission, a list of the actions performed as well as a short message, the start time, the duration, and who started it can be viewed.

KUKA
KUKA offers KUKA Connect, a cloud-based analytics and intelligence platform. It is accessible via a browser, and they provide a mobile app in addition [26]. KUKA Connect is not primarily for mobile robots, but it is nevertheless suitable to analyze. Kinbarovsky (n.d.) explains that the platform was developed by following the design process discovery, strategy, iteration, validation, scale. Interviews with industry experts were conducted, and factories were visited to understand the users and their daily processes. Crucial findings were that better insight into the data of the robotic systems is needed to conduct root-cause

analyses easily and that it must be clear which problem occurred when. Repeated tests and improvements were part of the process until KUKA Connect 2.0 was finally released. The UX team is still engaged in user research "with methodologies including remote testing, user collaboration workshops, site visits, stakeholder interviews, and app data analysis" [27].

The GUI is responsive [26] and usable for several robots of the own brand. An expandable menu, notifications, and alerts are always visible in the top menu [28]. After the login, a data dashboard is shown with all significant Key Performance Indicators (KPIs) for all robots assigned to the viewing person (see Figure 3). On the home page, "Robot Filtering", all robots assigned to the person are listed, and advanced filtering is offered. When selecting a robot from the list, a "Robot Details" page is shown with information such as the status, the operating hours, or the date of its commissioning [28].

Figure 3: Data dashboard consisting of different customizable widgets [26].

From both the Robot Filtering and the Robot Details page, submenus can be accessed [28]. The "Alerts" submenu shows graphically which robots cause the most alerts and which types of alerts are most common, and an alert log lists the alerts continually [27], [29]. In the "Maintenance" submenu, a maintenance log lists all maintenances, and a timeline is shown [27], [30]. "Condition monitoring" demonstrates different states, such as the battery state or CPU load, and historical analysis [27], [31]. The "Change Log" submenu displays software, hardware, and configuration changes, and a timeline shows the change events [32]. "AGV Monitoring" is divided into three sub-menus: state bar, fleet data, and map view. The state bar shows important status and safety information. In the fleet data section, graphs are displayed with additional condition monitoring information [27], [33]. Figure 4 shows the map view.

Figure 4: Submenu AGV monitoring: 2D representation of the environment of the robots with the positions of the robots. When selecting them, a state description shows up. Additionally, battery state statistics are offered.

Literature research has revealed the following. Steinfeld (2004) did a study about the challenges, positive and negative aspects, and interface wisdoms regarding interfaces for fully and semi-autonomous mobile robots by conducting interviews with experts from the Institute. The findings could be grouped into the following categories: "safety, remote awareness, control, command inputs, status and state, recovery, and interface design." For example, an emergency stop should be present, a map should show the robots and their environment, and a video display should be there with supplementary information about the robot near to it. Furthermore, the user should be able to control the robot and to identify the health of the robots and their motion characteristics easily. Color changes or pop-ups could draw attention to a change of gauges or state information, and a summary about health and errors should be centrally placed [34].

The most similar research is the one from Poli (2013), where a multi-robot user interface was developed by passing through research, design, construction, and evaluation cycles. The key components are a main map showing the positions of the robots and a control panel with a control, a display panel for robot data, and a mini map. A list with a state summary of each robot is offered as well to increase mission awareness. Furthermore, a message bar was implemented to inform the user about important system information in written and audible form. Sound was generally used as a feedback mechanism. Besides, color codes, the possibility to disable features, a clean and straightforward design, and icons were used. Furthermore, a robot state summary and a

prioritization of it inform the user about the states, and data features can be hidden to keep the flexibility high. A combination of user tests and heuristic evaluation was chosen in the evaluation phase [35].

The following two studies defined several standards. Nakamura et al. (1998) specified the following data as important for users when monitoring groups of multiple homogeneous mobile robots: the robot states through colors, the commands for the robots, their directions, positions and velocities, sensory information, and contact status [36]. Shenoy et al. (2002) compared different GUIs and suggested that in GUIs for mobile robots a zoomable real-time map should be used showing the robots, their health, and their waypoints. Data from sensors are preferably displayed graphically. Alerts about sensor readings that are outside the safe range should be communicated through colors. Furthermore, the energy and power level of the robots should always be visible, and an alarm sound should be used when the level falls under a certain point. Another important factor is the possibility to control robots remotely through the interface, and it should always be possible to stop the robot [37].

METHODOLOGY
To be able to answer the research question, a design process based on the iterative design process for interaction design proposed by Shneiderman et al. (2018) was applied with four phases. These phases were also used in the research from [35]. The third phase, "build and implementation", was left out as this is not part of this paper [39]. To carry out the design process, the design framework "User-Centered Design" was applied to develop a usable, effective, and efficient user interface (see Figure 5) [39], [40]. Early involvement of users leads to reduced development time and cost [39].

Figure 5: User-centered design cycle [41].

Phase 1: Requirements analysis
In the first phase, the requirements which are necessary for the interactive system were gathered through interviews, with a specification describing the users and the tasks they perform as an outcome [39]. One-to-one interviews took place in person, on the one hand, with two developers working on the backend of the software, as they are in close contact with the industry and have a good overview and knowledge of the processes there. They lasted around 45 minutes each. On the other hand, interviews were conducted with six end-users of two internal companies (company A and B) working with AGV monitoring, to understand how they use current solutions and what their problems and needs are (see Appendix B.4) [42]. In company A, three individual interviews took place at the participants' workplaces, followed by a tour of the factory, through which it was possible to view the robots, to investigate two more GUIs, and to directly get feedback about positive and negative aspects from their perspective. They lasted approximately 60 minutes each. At company B, a 90-minute group interview was conducted with three participants via video conference. Such interviews with industry experts were also conducted during the development of the GUI from KUKA [27]. The interviews were conducted in a semi-structured way [43]. Therefore, interview guides were prepared with pre-defined questions and topics to be covered (see Appendix B.1) [43]. An alternative to the interviews would have been a focus group. However, it was decided to conduct individual interviews where possible because, in a focus group, the participants might have influenced each other too much, or not everyone would have had a say.

As this was small-scale research and qualitative data was gathered [44], a sample, a small group of the research population, was selected [43]. For the interviews, an exploratory sample was applied as a strategy [43]. To select the sample, a non-probability technique – purposive sampling – was applied because a small number of people with a direct link to the research question were deliberately asked to contribute their experience [43]-[45]. During the interviews, an audio recording was made, if allowed, and additionally, notes were taken. Afterwards, the field notes were written up and supplemented by the audio recordings, if available, to transform this data into meaningful information [44], [45]. The approach of thematic analysis was used for this to identify, analyze, and report patterns or themes [46], [47].

Phase 2: Preliminary and detailed design
In the second phase, the identified requirements need to be realized [39], [45]. Shneiderman et al. (2018) explain that in the first stage, the preliminary or architectural design stage, high-level design aspects such as the user, the interface displays, navigation, controls, and the overall workflow are determined. In the detailed design stage, the details of each interaction between the interactive system and the user are planned out to such an extent that only the technical details and the implementation are left [39].
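The split between the two stages can be illustrated with a small sketch. The following TypeScript model is purely illustrative and not part of the thesis artifact: the page identifiers and the role-to-page assignments are hypothetical (only the role names themselves are taken from the requirements presented later in this paper). It shows the kind of structural question a preliminary design can already answer, before any interaction details are specified.

```typescript
// Illustrative only: a minimal model of a preliminary (architectural) design,
// capturing pages, navigation, and user roles before any detailed design work.
// Page ids and role assignments below are hypothetical examples.

type Role = "admin" | "manager" | "worker" | "visualization";

interface Page {
  id: string;        // stable identifier used by the top navigation bar
  title: string;     // label shown as a horizontal list of links
  visibleTo: Role[]; // which user roles may access the page
}

// High-level workflow: login -> personal dashboard (home) -> further pages.
const pages: Page[] = [
  { id: "dashboard", title: "Dashboard",      visibleTo: ["admin", "manager", "worker", "visualization"] },
  { id: "map",       title: "Map",            visibleTo: ["admin", "manager", "worker", "visualization"] },
  { id: "history",   title: "History",        visibleTo: ["admin", "manager"] },
  { id: "admin",     title: "Administration", visibleTo: ["admin"] },
];

// The detailed design stage would later specify each interaction; at the
// preliminary stage we can only answer structural questions, e.g. which
// pages a given role can reach through the navigation bar.
function pagesFor(role: Role): string[] {
  return pages.filter((p) => p.visibleTo.includes(role)).map((p) => p.id);
}
```

Under these hypothetical assignments, `pagesFor("worker")` yields `["dashboard", "map"]`; the detailed design would then plan out each interaction on those pages.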

For this phase, the following design methods were applied: sketching with pen and paper, wireframing with Balsamiq (https://balsamiq.com/), and high-fidelity prototyping using the user experience design tool Adobe XD (https://www.adobe.com/products/xd.html) [39], [48].

Phase 3: Evaluation
Evaluation is carried out in order to assess how well the interface solves the problem and how well it fulfills the requirements, and to identify improvements in order to refine it [45]. Baxter et al. (2015) describe that usability evaluations can be done through formative or summative evaluations. In this study, a formative evaluation was carried out, and usability testing was chosen as a method to test whether and how well users can perform the intended tasks with the product and to identify usability issues [20], [49], [50].

The formative evaluation took place once with wireframes, in order to test the design idea early [48], and once with a high-fidelity prototype. The wireframes were tested by six people, which is confirmed by [51] as a sufficient number (see Appendix B.4). Four of them were developers and two were designers with a focus on UX and UI. All of them had already worked in some way with AGVs before and had seen or interacted with GUIs of AGVs. The testing took, on average, 50 minutes each and was conducted remotely through a video call. The participants interacted with the application and attempted to complete tasks pre-defined in a task list (see Appendix B.2) while the think-aloud method was used [20], [42], [49]. The participants answered pre-study questions, and after the completion of the tasks, a post-study question was asked [52]. After every test, the wireframes were incrementally improved based on the feedback, which resulted in an iterative process [20]. After the two designers got to know the interface in detail through the usability testing, they also participated in a heuristic evaluation, which took, on average, an additional 30 minutes each. Thereby, they compared the wireframes with a list of usability principles (see Appendix B.3) to examine their compliance and discussed the violations together with the researcher afterwards [53], [54]. Poli (2013) likewise chose a combination of usability testing and heuristic evaluation in the evaluation phase [35]. After a high-fidelity prototype could be developed with Adobe XD based on these results, it was also evaluated with four developers. The testing lasted 55 minutes each on average. The testing procedure and setting were the same as with the wireframes. All of them worked with AGVs, but only one of them had seen or interacted with other GUIs of AGVs already.

As with the interviews, an explorative sample and purposive sampling were applied as strategies for the evaluations [43]-[45]. The thoughts spoken aloud during the evaluations were recorded in the form of notes, which were supplemented by the researcher's observations.

Research Ethics
Ethical issues may arise in the conduct of research and must be considered when involving humans [43], [44]. One ethical principle is the protection of the interests of the participants: "no-one should suffer harm as a result of participation in the research" [43], [55]. Furthermore, it needs to be guaranteed that the participants participate voluntarily, and they should provide informed consent [43], [45]. Whenever data was collected during the process, every participant signed an informed consent form [44], [56].

RESULTS
This chapter describes how the three phases of the design process were applied and presents the findings obtained.

Phase 1: Requirements analysis

Interviews
Through the interviews, it was possible to define the end-users more precisely. They know the processes and requirements on-site well, have general technical skills up to a technical affinity but not so much a computer science affinity, and are not deeply familiar with what the robot can do technically or with how the algorithms behind it work. This is why a GUI is required for which no technical or computer science expertise is necessary. They can be divided primarily into production managers, logistics experts, and process planners. Production managers make decisions, take actions, and are responsible for organizational purposes and the correct operation of the overall plant. Logistics experts and process planners deal with process monitoring and quality assurance, mostly from their workplaces. All of them have other tasks as well, but always keep an eye on the GUI. It was already identified in a study from Steinfeld (2004) that "monitoring the robotic system's behavior will not be the operator's primary task […]. As such, there is a growing need for increased research on successful methods for human interaction with autonomous and semi-autonomous mobile robots" [34]. Thus, they either notice themselves when they need to intervene or are called by the employees on the shop floor, who can notice errors as they pass by, and

sometimes can even solve problems by themselves. For this reason, shop floor employees or people doing maintenance could be further possible users of the GUI.

Company A confirmed the problem that "the systems have different GUIs and […] in the best case there is only one GUI with a map display in form of a web application to be usable with any device and browser. It can also be with different layers, but that I, as a user, simply have an interface where I can set up all my things and don't have to switch systems. Because with the current solutions you always have to get used to it and change things around because every software has its own handling. […] And because it is maintained by different people, also due to physical distances, you could complement each other much better. At the moment everyone is fighting for himself" (P3). Additionally, they encounter partly unsatisfying usability and visual design and high complexity of the GUIs. Company B doesn't use AGVs in parallel in a building, which is why people do not need to deal with different GUIs. However, this scenario is planned for the future, "[…] and then it would be of course our wish to have only one GUI because the different handlings also mean different trainings" (P7).

Themes that could be identified through the interviews were administration, information presentation, map, robot control, robot information, job information, historical data, customization, and visual design.

Requirements
Requirements should be used as a guide for the design and development of the artifact [45]. The following requirements could be derived primarily out of the interviews, but also out of the state-of-the-art analysis, the literature research, and the user interface guidelines from [57]-[59]. The functional requirements are represented as use cases in this paper [39], [60]. An administrative matter is that robots should be formed into groups, users get roles assigned with different rights (admin, manager, worker, visualization), and robot groups get assigned to users. Users need to obtain information about the robots and their locations, need to be able to interact with and control them, need information about their jobs, need to be informed about errors and alerts at all times, and need to access historical data for root-cause analysis. It could be identified that a clean, simple, configurable, and concise GUI with the opportunity to access different pages easily at all times is required to perform such functions. Additional requirements for the GUI are an overview of the vehicles through a map with intuitive access to them, and that it be self-explanatory, filterable, customizable, understandable, and appealing. Furthermore, it should be accessible through every device, browser, and operating system. This is stated in the non-functional requirements. For a full list of requirements, see Appendix C.1.

Phase 2: Preliminary and detailed design
To develop a design out of the requirements, sketching, wireframing, and high-fidelity prototyping were applied.

Preliminary design
The devices used by the users vary; priority is given to desktop computers. Therefore, the sketches and wireframes were created only for a desktop view, but since the goal is to create a responsive web application, the high-fidelity prototype provides a tablet and mobile view as well. The overall workflow and navigation mechanism work as follows. As soon as users are logged in, they will find themselves on their personal dashboard, which serves as the home page. Since it is a multi-page application, the user can access further pages through a top navigation bar where the navigation options are displayed as a horizontal list of links, as recommended by [61]. This should give the user orientation and a good overview of the application. In addition, the currently visited page is always highlighted in color (see Figure 6) [62].

Figure 6: Desktop computer – Top navigation bar.

When the browser window gets smaller, the top navigation bar changes (see Figures 7 and 8).

Figure 7: Tablet – Top navigation bar. The navigation options can be accessed through a hamburger menu as well as the settings and profile icons.

Figure 8: Mobile – Top navigation bar and a bottom navigation bar where icons with text labels are used, because most people hold their phone one-handed and therefore can't easily reach the top [39], [61], [63], [64].

The focus of the GUI lies on a central view in the form of a map on which the robots are visually represented in real time. Filtering and customization are offered whenever possible to give users the possibility to partly determine the level of detail themselves, to reduce or increase the amount of information, and to adapt the type of information needed [65]. Besides, breadcrumbs are displayed when the user goes one level deeper into the page to give the user a better
information about their jobs, need to get always informed about errors and alerts, and access historical The focus of the GUI lies on a central view in the form data for root-cause analysis. It could be identified that a of a map on which the robots are visually represented in clean, simple, configurable, and concise GUI with the real-time. Filtering and customization is offered opportunity to access different pages easily at all time is whenever possible to give the user the possibility to required to perform such functions. Additional partly determine the level of details by himself, to requirements for the GUI are an overview of the reduce or increase the amount of information and to vehicles through a map and also intuitive access to adapt the type of information needed [65]. Besides, them, to be self-explanatory, filterable, customizable, breadcrumbs are displayed when the user goes one understandable and appealing. Furthermore, it should level deeper into the page to give the user a better
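The responsive behavior of the top navigation bar described above (horizontal links on desktop, hamburger on tablet, bottom bar with labeled icons on mobile) can be sketched as a simple breakpoint rule. The function name and the pixel breakpoints below are illustrative assumptions, not values taken from the thesis:

```typescript
// Hypothetical sketch of the responsive navigation rule described in the text.
// Breakpoint values are assumed for illustration only.
type NavLayout = "horizontal-links" | "hamburger" | "bottom-bar";

function navLayoutFor(viewportWidthPx: number): NavLayout {
  if (viewportWidthPx >= 1024) return "horizontal-links"; // desktop: list of links
  if (viewportWidthPx >= 600) return "hamburger";         // tablet: collapsed menu
  return "bottom-bar";                                    // mobile: reachable one-handed
}
```

In a real implementation this decision would typically be expressed with CSS media queries rather than in application code; the function form merely makes the three layout states explicit.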

orientation and an additional possibility to go back easily besides the browser arrows.

Detailed design
The icons used in the GUI were created with Adobe Illustrator4. Some already existed and were therefore re-used from the previously created mapping process. Although different user roles are offered - admin, manager, worker, and visualization - only the admin role has been worked out5. Users with this role have full rights and access to all functions. For the other roles, the GUI will be almost the same, with the difference that rights will have to be reduced and functions partially hidden.

The top navigation bar offers actions for functions that should be accessible at any time. It is possible to pause or stop all robots at once from there. The pause function was added later based on feedback from evaluators. To avoid mistakes, the user must confirm these actions before they are executed. The user also gets informed about robot errors and alerts, can search for help, change settings, and access the personal profile. The search also contains a link to the user manual. The meanings of the icons are explained to the user when hovering over them.

The home page "My Dashboard" consists of different widgets (see Figure 9). Those widgets represent all components available in the GUI in a smaller format.

Figure 9: My Dashboard page with a pre-defined, customizable selection of widgets: robots map view, analysis, error log.

The next link in the navigation bar leads the user to "Robots", where a map of the environment gives an overview of the live locations of the robots. "The maps are difficult to understand so it would be great if it would be very abstract at first with as less details and rules as possible but zoom able to get this information when required" (P4). Additionally, the map can always be switched to full-screen. When a robot is selected, the user gets information about it, such as the name, ID, manufacturer, the congruence to the map in percent, the battery status, and the strength of the Wi-Fi connection. In addition, the user can pause or stop the individual robot, can localize it manually "[…] because they often loose their orientation, that's the most common problem we have" (P4), and can click on "Robot analysis". Information about the job includes whether the robot is loaded or unloaded, how many minutes remain until the current job is finished, the transport order number, what kind of activity it is performing, the approximate total duration of the job, and how many kilometers it has already driven in this job. If the robot has an error or alert, the message is also displayed here. For more detailed information, the user can click the button "Job details" and is redirected to the "Jobs" page. A further activity which can be performed by clicking on a robot is to take control over the robot and to drive it manually with a virtual joystick.

If a robot has a problem, it is indicated by an orange alert or a red error sign depending on the severity. If an error appears in combination with an obstacle, real-time scan data is displayed to "[…] get a first impression about the possible cause" (P5). To ensure that alerts and errors are not overlooked, errors are also highlighted directly above the map. Short error and alert messages are always provided. To help the user solve the problem, they can be clicked on and are then pre-written into the search field in the top navigation bar, where a solution is presented directly. This feature emerged out of the evaluation.

Next to the map view, two additional views are offered. Between those three views, the user can easily navigate without leaving the "Robots" page. The tile view shows the same information the user gets when clicking on a robot on the map, but as an overview of all robots at once. The list view is the result of a request from an evaluator and shows this information shortened in a listed form, to offer suitable views for different fleet sizes. When the user leaves the "Robots" page and returns afterwards, the view from which the page was left is displayed, to ensure that the preference is maintained and does not need to be re-selected by the user each time.

The information displayed on the "Jobs" page offers a structured overview of current and past jobs. It is similar to the tile view on the "Robots" page, but with more job details informing the user about the start and end station of the job and by whom and when the job was created (see Figure 10).

The "Analysis" page shows important KPIs in "an easy to understand graph format" (P6). Like the dashboard, it consists of several widgets that allow

4 https://www.adobe.com/products/illustrator.html
5 For an overview of the prototype see Appendix D
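The color-coded problem indication described above can be summarized in a small sketch. The type and function names are hypothetical; only the red-for-error/orange-for-alert convention and the condition for showing scan data come from the text:

```typescript
// Sketch of the indication logic described in the text: red sign for errors,
// orange sign for alerts, no sign otherwise; real-time scan data is shown
// only when an error coincides with a detected obstacle. Names are assumed.
type Severity = "none" | "alert" | "error";

interface RobotProblem {
  severity: Severity;
  obstacleDetected: boolean;
}

function indicatorColor(p: RobotProblem): "red" | "orange" | null {
  if (p.severity === "error") return "red";
  if (p.severity === "alert") return "orange";
  return null;
}

function showScanData(p: RobotProblem): boolean {
  return p.severity === "error" && p.obstacleDetected;
}
```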

Figure 10: Jobs page. A job summary is displayed at the top, followed by interrupted, current and upcoming jobs. Upcoming jobs can be deleted or moved in priority; deleted and completed jobs are not displayed by default.

customization. It is also possible to customize widgets individually and to download them. An overview of the robots, manufacturers, and jobs is shown, as well as several more detailed statistics. If the "Analysis" page is accessed through one of the "Robot analysis" buttons on the "Robots" or "Jobs" pages, the user is already one level deeper and the analysis relates to the robot for which the user wants to view the analysis.

The "Logs" page shows a list with all log types arranged chronologically. Active errors and alerts are highlighted. For every log, more details can be accessed, and logs can be downloaded or deleted. When deleted, a log is no longer displayed in the list, but can still be accessed through the filter function. The six different log types (error, alert, job, system, maintenance, and change log) can also be viewed individually. The analysis and log pages serve the purpose of easy root-cause analyses, which was identified by [27] as a crucial need and requested by the interviewees as well.

Phase 3: Evaluation
The participants testing the wireframes were provided with them and shared their screen while they carried out tasks given to them live by the researcher. The tasks were based on real-world usage, and the focus was on user experience, user flow, and information architecture. It was tested whether the defined use cases can be completed successfully, whether the participants' expectations can be met, and how comprehensible the structure, navigation, language, and icons are. The comments of the participants could be divided into general improvements, incomprehensibility, and inconsistencies (see Appendix C.2.1). General improvements were, for example, suggestions to change labels, to move or add functions and icons, and to change elements for clearer highlighting or clarification. Incomprehensibility arose especially with the function that allows pausing or stopping the robots and with the icons, which could be remedied through design and icon changes. Inconsistencies were noticed where different terms for the same purpose and inconsistent forms were used. After the test, the participants were asked to rate their satisfaction with the interface. The average score reached was 5.5 (1 = not at all; 7 = very). Positive aspects were the navigation, which "gives a quick overview over the whole application" (P9), and the icons: "I like the usage of icons and not only text. They are easy to understand" (P13). "The possibility to directly click on robots in the map is definitely the best feature for me" (P12). Also praised were the color highlights of the user's position and the customization and filtering options, which "[…] give the freedom to set up everything in the needed individual way" (P9). Lastly, the three different robot views, which "[…] give the choice to self-select the most suitable view" (P11), the possibility to zoom, the easy-to-understand analysis widgets, and in general the clear structure, the minimalistic design and the easy handling were mentioned positively. The following was mentioned negatively. Besides the points mentioned during the test, a summary of the robots and jobs was missed, and it was pointed out that attention should be paid to highlighting important information and to keeping the GUI minimalistic.

In the heuristic evaluation, violations of four heuristics could be found (see Appendix C.3). The addition of breadcrumbs, a personal dashboard, and icon hovering for explanations was proposed. Additionally, a re-naming of a description was suggested to meet expectations, and some icons were to be changed for better understandability.

The usability testing procedure for the Adobe XD prototype was the same as with the wireframes. There, the focus was on the user interface to test the visual
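The Logs-page behavior described above (chronological list, deleted entries hidden by default but reachable through the filter, and per-type views) can be sketched as a small filter function. The data shapes and names below are assumptions for illustration, not the thesis's implementation:

```typescript
// Hypothetical sketch of the log filtering described in the text.
type LogType = "error" | "alert" | "job" | "system" | "maintenance" | "change";

interface LogEntry {
  type: LogType;
  timestamp: number; // Unix epoch milliseconds, assumed representation
  deleted: boolean;
  message: string;
}

// Newest entries first; deleted logs appear only when the filter requests them;
// an optional type restricts the list to one of the six log types.
function visibleLogs(
  logs: LogEntry[],
  filter: { type?: LogType; includeDeleted?: boolean } = {}
): LogEntry[] {
  return logs
    .filter(l => (filter.includeDeleted ?? false) || !l.deleted)
    .filter(l => filter.type === undefined || l.type === filter.type)
    .sort((a, b) => b.timestamp - a.timestamp);
}
```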

design. The comments of the participants could be divided into the same three categories (see Appendix C.2.2). General improvements were, for example, to provide easier access to the joystick by adding an additional joystick next to the map "[…] so it doesn't overlay the map and the other robots" (P18), to visually separate the pause and stop icons in the top navigation bar, and to add a security warning the user needs to confirm when clicking them. Further suggestions concerned minor display errors, shadow reductions, and color changes. Incomprehensibilities included wording and robot color changes, not showing the deleted logs directly, and making it visually clear when it is possible to zoom the map and when not. Inconsistencies noticed were to add a hover for all the icons instead of only a few, to adjust some margins and heights, to move the dropdown on the Logs page into the filter, and to use an icon instead of the customize button. The average satisfaction score reached was 5.6. Positive aspects highlighted were the high usability, the usage of well-understandable icons, the widgets, the customizability, and the top navigation bar because of the good overview over all pages. "I really like how everything is arranged. It is very organized and clear which makes it easy to understand and suitable for the intended use" (P15). The only negative aspect besides the suggested improvements during testing was to add more colors.

DISCUSSION
The aim of this study, to design and develop a graphical user interface with high usability for the monitoring process of a cross-manufacturer AGV fleet, could be successfully realized. Out of the design process, main criteria could be derived, which is why the research question could be answered successfully. Table 1 summarizes what should be considered when designing such a GUI in this domain. The criteria are partially similar to the categories found by [34] but differ at the same time, as they are specifically tailored to the monitoring process. In addition to these, more general criteria such as intuitiveness, simplicity, consistency, productivity, attractiveness and informativeness must be considered as well.

Table 1: Design criteria

administrability: forming robot groups; assigning roles to users for different purposes; assigning robot groups to users
adaptiveness: filtering; customizing views; zooming the map to get different levels of detail; personal dashboard; different rights/functions/details for different user roles; information on demand; usable on different devices
observability: highly abstracted map with robots' live locations as central element; (color-coded) error and alert notifications always visible; logs where every incident is saved
analyzability: information from logs in graph form on a separate analysis page for straightforward interpretation; root-cause analyses
robot and job awareness: robot and job information directly on the map through a click on a robot; robot and job information all at once as a quick overview through additional views
intervention: pausing and stopping robots (all at once or individually); driving robots manually

In comparison to the recommendations identified through the literature research and to the other GUIs that were examined, this GUI resembles and differs from them in the following ways. Common features are that it is a responsive web application suitable for different devices, where robots can be assigned to users. The web application concept is also used by MiR and KUKA [25], [26]. Furthermore, it is possible to stop the robots from any page, as is also recommended by [34] and [37]. Notifications are centrally located so they are not overlooked, and additionally color-coded to indicate the importance of the message, as [34] and [35] found as well. [34], [35], and [37] recommend the usage of a map showing the live locations of the robots and their environment, which is the central element in this GUI as well. The movements, health, status, and further supplementary robot information can be easily identified and monitored there [34], [36], [37]. From the map, the robots can be controlled by the user, which is pointed out by [34], [35], and [37] as well. [34] and [35] point out the importance of robot health and status summaries to increase mission awareness, which is also offered in this GUI. The suggestions of simplicity, the possibility to disable features, and the use of a clean and straightforward design by [35] could be kept by using a central top navigation bar which gives a clear
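The role model behind the administrability criterion (admin, manager, worker, visualization) can be represented as a simple rights table. Only the admin role is fully specified in the thesis; the reduced rights given to the other roles below are illustrative assumptions, as are the names of the rights themselves:

```typescript
// Hypothetical rights table for the four roles named in the text.
// Admin has full rights (as in the prototype); the other rows are assumed
// reductions, since only the admin role was worked out in the thesis.
type Role = "admin" | "manager" | "worker" | "visualization";
type Right = "administer" | "intervene" | "monitor";

const rightsByRole: Record<Role, readonly Right[]> = {
  admin: ["administer", "intervene", "monitor"],
  manager: ["intervene", "monitor"],
  worker: ["intervene", "monitor"],
  visualization: ["monitor"],
};

function can(role: Role, right: Right): boolean {
  return rightsByRole[role].includes(right);
}
```

Hiding functions for roles without the corresponding right, rather than merely disabling them, matches the thesis's plan to keep the GUI "almost the same" across roles while reducing rights and partially hiding functions.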

overview, by providing users only with the rights and functions assigned to them, by the possibility to customize and filter, which allows the display of less or more information, and by using well-known icons. The filtering options also play a major role in the MiR and KUKA GUIs [25], [28]. Color codes and color changes are used to inform about robot states and to attract attention, as described by [35], [36], and [37], although intentionally very subtle with only red and orange. The dashboard with customizable widgets is also used in the GUIs from MiR and KUKA [25], [28]. In this GUI it does not show all significant KPIs, as these are included in the analysis, but serves the purpose of keeping an eye on several pages at once, depending on individual needs. Analyses are offered as well, but with a more extensive selection of statistics presented in an easy-to-understand way. Graphs are used for an easy-to-understand illustration [25]. Logs have also been integrated, but with clear statements for the standard user and the possibility to go into more detail for more advanced users [25], to satisfy the need for root-cause analyses stated by [27].

The GUI differs in the following points. The most significant difference is that it is designed for several robots from different manufacturers. This automatically leads to some differences, such as that information like the current status of one robot or the energy level is not placed in the navigation bar as it is done in the MiR GUI [25], but it was ensured that this can always be seen at a glance on the different pages. The use of alarm sounds as recommended by [35] and [37] was not implemented because of the industrial environment the GUI is used in, but it is designed in a way that this is not crucial for the user. The use of videos sent by the robots has also been neglected, although [34] suggested it, as it is a hardware matter to equip all of them with a camera. Instead, the GUI is characterized by the following facts. Every page can be accessed through the top navigation bar, which gives a quick overview of the whole application. The use of an abstract map gives a simple overview and prevents information overload; however, the option to zoom in and out offers the possibility for details on demand. The possibility to directly click on the robots on the map can be used, but does not have to be, as this information is also available at a glance in the other two views. The combination of the three views for the robots gives the user, on the one hand, more options according to his taste, and on the other hand, it has been taken into account to make the monitoring of different fleet sizes as comfortable as possible. This adaptivity is also reflected in the central filter element and the customization options, to provide the user only with the needed information. The personal dashboard offers the same information as the other pages, but with the advantage of monitoring different and also independent factors at once. Important information and statistics are summarized on the analysis page, which can be used for a quick and easy-to-understand overview, but also for root-cause analyses, for which eventually the more detailed logs are needed in addition.

Finally, the GUI has a clear structure, which makes the application easy to understand, supported by the fact that the user does not need to go deep into the pages, which prevents long click distances and a loss of overview and orientation. These factors result in the user being able to perform his tasks effectively, efficiently, and satisfactorily. In summary, elements from existing solutions could be partially reused, adapted, and combined, but also new elements could be invented to create an innovative, user-friendly application. Nevertheless, it must be taken into account that the developed GUI shows one way of designing and developing a GUI for this particular case, but not the only possible one.

Methodology and limitations
The chosen methods, interview, usability testing, and heuristic evaluation, entail the risk that the participants may not answer entirely truthfully. Especially in usability testing and heuristic evaluation, the participants may not want to criticize the researcher's work. In addition, there is a risk, particularly in interviews, that the information received may not be objectively recorded by the interviewer.

Due to the COVID-19 virus, the following limitations have arisen. It prevented further interviews with end-users at trade fairs. Furthermore, the evaluations had to be conducted remotely, which made them more impersonal, and the participants and their body language could not be observed, only their actions on the screen. It also made it impossible to include real end-users in the evaluations as intended in user-centered design. However, since the GUI should be designed in a way that anyone can use it, other people with AGV knowledge were consulted. Overall, based on the number of interviewees and evaluation participants, it must be considered that the results cannot be generalized.

Future research
If the work had been more extensive, the following other approaches would have been conceivable and could be considered for future work. A focus group could have been carried out in addition to the interviews, because other issues might have come up through the group setting. Also, before the interviews with the end-users, a quantitative survey

could have been carried out to obtain quantifiable data. Then the most frequently mentioned points could have been discussed in more detail in personal interviews with some end-users. Alternatively, on an even larger scale, qualitative interviews could have been conducted with end-users first in order to create more specific surveys based on the results, and to discuss the survey results afterwards in depth with end-users again in the form of interviews. Evaluations could have been performed more often, such as performing a usability test directly with the paper-and-pen prototype and testing the responsive versions of the Adobe XD prototype. In the future, when the GUI has been fully developed, a large-scale summative evaluation would be advisable before it is launched. It could also be evaluated whether a mobile application should be offered in addition to the web application, to make use of additional functions such as push notifications, which could be helpful if the robots one day operate autonomously 24 hours a day, 7 days a week. Then an even more sophisticated support and monitoring system will be necessary when no one is directly on site. In this case, such a feature could be helpful or even necessary.

CONCLUSION
The GUI developed and the design criteria derived from it pave the way for the factory of the future. Since the monitoring of vehicles is an essential part of the daily work of the end-users, the GUI makes their daily handling much more manageable and comfortable: on the one hand by combining several GUIs into one, and on the other hand by the optimal handling due to the high usability.

Especially in the field of intralogistics it could be shown that more and more mobile autonomous robots are used, and therefore such a GUI becomes more and more essential. The results contribute to such a realization and are also suitable as a starting point for the transfer to areas other than intralogistics or to other robot types.

REFERENCES
[1] WORLD ECONOMIC FORUM (2016). The Future Of Jobs: Employment, Skills and Workforce Strategy for the Fourth Industrial Revolution. Retrieved from http://www3.weforum.org/docs/WEF_Future_of_Jobs.pdf
[2] Murphy, A. (2017). AGV Deep Dive: How Amazon's 2012 Acquisition Sparked a $10B Market. Retrieved from https://loupventures.com/agv-deep-dive-how-amazons-2012-acquisition-sparked-a-10b-market/
[3] Fraunhofer Institute for Manufacturing Engineering and Automation IPA (2020a). Application & Use. Retrieved from https://www.node.fraunhofer.de/en/application.html
[4] Fraunhofer Institute for Manufacturing Engineering and Automation IPA (2019). 8. Technologieforum Fahrerlose Transportsysteme und mobile Roboter. Retrieved from https://www.ipa.fraunhofer.de/de/presse/presseinformationen/FTS-Forum-2019.html
[5] Fraunhofer Institute for Manufacturing Engineering and Automation IPA (2020b). Technology. Retrieved from https://www.node.fraunhofer.de/en/technology.html#tabpanel-1825435216
[6] Chova, L. G., Belenguer, D. M., & Martinez, A. L. (Eds.) (2011). The importance of graphic user interfaces: Analysis of graphical user interface design in the context of human-computer interaction. 3rd International Conference on Education and New Learning Technologies. Barcelona, Spain: International Association of Technology, Education and Development (IATED).
[7] Gorecky, D., Schmitt, M., Loskyll, M., & Zuhlke, D. (2014). Human-machine-interaction in the industry 4.0 era. In 2014 12th IEEE International Conference on Industrial Informatics (INDIN) (pp. 289–294). IEEE. https://doi.org/10.1109/INDIN.2014.6945523
[8] International Organization for Standardization (2018). ISO 9241-11:2018(en) Ergonomics of human-system interaction: Part 11: Usability: Definitions and concepts. Retrieved from https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-2:v1:en
[9] Interaction Design Foundation (n.d.b). What is User Experience (UX) Design? Retrieved from https://www.interaction-design.org/literature/topics/ux-design
[10] Saffer, D. (2010). Designing for interaction: Creating innovative applications and devices (2nd ed.). Voices that matter. Berkeley, CA: New Riders.
[11] Ullrich, G. (2015). Automated Guided Vehicle Systems: A Primer with Practical Applications. Berlin, Heidelberg: Springer-Verlag. Retrieved from http://dx.doi.org/10.1007/978-3-662-44814-4
[12] Schulze, L., & Wüllner, A. (2006). The Approach of Automated Guided Vehicle Systems. In 2006 IEEE International Conference on Service Operations and Logistics, and Informatics (pp. 522–527). IEEE. https://doi.org/10.1109/SOLI.2006.328941
[13] Fazlollahtabar, H., & Saidi-Mehrabad, M. (2015). Autonomous Guided Vehicles: Methods and Models for Optimal Path Planning (Vol. 20). Switzerland: Springer International Publishing.
[14] VDI Verein Deutscher Ingenieure e.V. (2013). Automated guided vehicle systems (AGVS): Safety of AGVS.
[15] Yao, X. (2018). Industry 4.0 in Logistics (Master's dissertation). Politecnico di Torino, Torino. Retrieved from https://pdfs.semanticscholar.org/32f7/c1f84d30c4fb15d3083e84fff2f236d3254a.pdf
[16] Clark, J. (2019). Automated Guided Vehicles (AGVs) vs. Autonomous Mobile Robots (AMRs): Debunking the Myths. Retrieved from https://www.dematic.com/en-us/downloads-and-resources/white-papers/agvs-vs-amrs/
[17] Koseoglu, M., Celik, O. M., & Pektas, O. (2017). Design of an autonomous mobile robot based on ROS. In IDAP'17, International Artificial Intelligence and Data Processing Symposium (pp. 1–5). Piscataway, NJ: IEEE. https://doi.org/10.1109/IDAP.2017.8090199
[18] Nehmzow, U. (2000). Mobile Robotics: A Practical Introduction. London: Springer London.
[19] Norman, D., & Nielsen, J. (n.d.). The Definition of User Experience (UX). Retrieved from https://www.nngroup.com/articles/definition-user-experience/
[20] Rogers, Y., Sharp, H., & Preece, J. (2011). Interaction design: Beyond human-computer interaction (3rd ed.). Safari Tech Books Online. Chichester: Wiley.
[21] Teo, & Yu Siang (2020). What is Interaction Design? Retrieved from https://www.interaction-design.org/literature/article/what-is-interaction-design
[22] International Organization for Standardization (n.d.). ISO 9241-11:2018(en), Ergonomics of human-system interaction — Part 11: Usability: Definitions and concepts. Retrieved from https://www.iso.org/obp/ui/#iso:std:iso:9241:-11:ed-2:v1:en
[23] Interaction Design Foundation (n.d.a). User Interface (UI) Design: What is User Interface (UI) Design? Retrieved from https://www.interaction-design.org/literature/topics/ui-design
[24] Razzaq, M. A., Qureshi, M. A., Memon, K. H., & Ullah, S. (2017). A Survey on User Interfaces for Interaction with Human and Machines. International Journal of Advanced Computer Science and Applications, 8(7). Retrieved from https://pdfs.semanticscholar.org/a5ab/e99363013237cbc0cc278aa3724fe2924c9d.pdf
[25] Mobile Industrial Robots (2018). MiR Robot Interface 2.0: Reference Guide. Retrieved from https://www.mobile-industrial-robots.com/media/2806/mir-robot-interface-20-reference-guide-v13-en.pdf
[26] KUKA AG (n.d.). KUKA Connect: Robot data transformed into actionable insights. Retrieved from https://connect.kuka.com/en/
[27] Kinbarovsky, J. (n.d.). KUKA Robotics. Retrieved from http://www.ilikejesse.com/kuka-robotics
[28] KUKA AG (n.d.). Asset Management. Retrieved from https://connect.kuka.com/HelpCenter/en/Asset_Management.html
[29] KUKA AG (n.d.). Messaging and Alarming. Retrieved from https://connect.kuka.com/HelpCenter/en/Messages_Alarming.html
[30] KUKA AG (n.d.). Routine Maintenance. Retrieved from https://connect.kuka.com/HelpCenter/en/Routine_Maintenance.html
[31] KUKA AG (n.d.). Condition Monitoring. Retrieved from https://connect.kuka.com/HelpCenter/en/Condition_Monitoring.html
[32] KUKA AG (n.d.). Change Log. Retrieved from https://connect.kuka.com/HelpCenter/en/Change_Log.html
[33] KUKA AG (n.d.). Automated Guided Vehicle (AGV) Monitoring. Retrieved from https://connect.kuka.com/HelpCenter/en/AGVmonitoring.html#AGVTab
[34] Steinfeld, A. (2004). Interface lessons for fully and semi-autonomous mobile robots. In IEEE International Conference on Robotics and Automation, 2004. Proceedings. ICRA '04 (Vol. 3). IEEE. https://doi.org/10.1109/ROBOT.2004.1307477
[35] Poli, L. (2013). User Interface for a Group of Mobile Robots. University of Western Australia, Australia. Retrieved from http://robotics.ee.uwa.edu.au/theses/2013-Robotics-Interface-Poli.pdf
[36] Nakamura, A., Kakita, S., Arai, T., Beltran-Escavy, J., & Ota, J. (1998, May). Multiple mobile robot operation by human. In 1998 IEEE International Conference on Robotics and Automation (pp. 2852–2857). Piscataway: IEEE. https://doi.org/10.1109/ROBOT.1998.680614
[37] Shenoy, S. A., Viswanadha, L. N., & Agah, A. (2002). Graphical user interfaces for mobile robots. Lawrence, Kansas. Retrieved from https://www.ittc.ku.edu/publications/documents/Shenoy2002_ITTC-FY2003-TR-27640-01_Report.pdf
[39] Shneiderman, B., Plaisant, C., Cohen, M., Jacobs, S., & Elmqvist, N. (2018). Designing the user interface: Strategies for effective human-computer interaction (6th ed.). Boston: Pearson.
[40] Adams, J. A. (2002). Critical Considerations for Human-Robot Interface Development. 2002 AAAI Fall Symposium: Human Robot Interaction Technical Report FS-02-03. Retrieved from https://www.cs.rit.edu/~jaa/Papers/AAAISymposium.pdf
[41] Interaction Design Foundation (n.d.). User Centered Design: UCD is an Iterative Process. Retrieved from https://www.interaction-design.org/literature/topics/user-centered-design
[42] Lazar, J., Hochheiser, H., & Feng, J. H. (2017). Research methods in human-computer interaction (2nd ed.). Cambridge, MA: Morgan Kaufmann.
[43] Denscombe, M. (2010). The good research guide: For small-scale social research projects (4th ed.). Open UP study skills. Maidenhead, England: McGraw-Hill/Open University Press.
[44] Bryman, A. (2012). Social research methods (4th ed.). New York: Oxford University Press.
[45] Johannesson, P., & Perjons, E. (2014). An introduction to design science (1st ed.). Cham: Springer.
[46] Braun, V., & Clarke, V. (2006). Using thematic analysis in psychology. Qualitative Research in Psychology, 3(2), 77–101. https://doi.org/10.1191/1478088706qp063oa
[47] Mortensen, D. (2020). How to Do a Thematic Analysis of User Interviews. Retrieved from https://www.interaction-design.org/literature/article/how-to-do-a-thematic-analysis-of-user-interviews
[48] Nielsen, J. (2003). Paper Prototyping: Getting User Data Before You Code. Retrieved from https://www.nngroup.com/articles/paper-prototyping/
[49] Baxter, K., Courage, C., & Caine, K. (2015). Understanding your Users: A Practical Guide to User Research Methods (2nd ed.). Elsevier.
[50] Cooper, A., Reimann, R., Cronin, D., & Noessel, C. (2014). About face: The essentials of interaction design (4th ed.). Indianapolis, IN: Wiley. Retrieved from https://ebookcentral.proquest.com/lib/subhh/detail.action?docID=1762072
[51] Nielsen, J. (2000). Why You Only Need to Test with 5 Users. Retrieved from https://www.nngroup.com/articles/why-you-only-need-to-test-with-5-users/
[52] Laubheimer, P. (2018). Beyond the NPS: Measuring Perceived Usability with the SUS, NASA-TLX, and the Single Ease Question After Tasks and Usability Tests. Retrieved from https://www.nngroup.com/articles/measuring-perceived-usability/
[53] Nielsen, J. (1994). How to Conduct a Heuristic Evaluation. Retrieved from https://www.nngroup.com/articles/how-to-conduct-a-heuristic-evaluation/
[54] Wong, E. (2020). Heuristic Evaluation: How to Conduct a Heuristic Evaluation. Retrieved from https://www.interaction-design.org/literature/article/heuristic-evaluation-how-to-conduct-a-heuristic-evaluation
[55] Social Research Association (2003). Ethical guidelines. Retrieved from https://the-sra.org.uk/common/Uploaded%20files/ethical%20guidelines%202003.pdf
[56] WHO Research Ethics Review Committee (n.d.). Templates for informed consent forms: Informed consent for qualitative studies. Retrieved from https://www.who.int/ethics/review-committee/informed_consent/en/
[57] Nielsen, J., & Molich, R. (1990). Heuristic evaluation of user interfaces. In J. Carrasco Chew & J. Whiteside (Eds.), Human factors in computing systems, Empowering people: CHI '90 conference proceedings (pp. 249–256). https://doi.org/10.1145/97243.97281
[58] Shneiderman, B. (1987). Designing the user interface: Strategies for effective human-computer interaction (Repr. with corr.). Reading, MA: Addison-Wesley.
[59] Shneiderman, B., & Plaisant, C. (2009). Designing the user interface: Strategies for effective human-computer interaction (5th ed.). Boston: Pearson Addison-Wesley.
[60] Department of Health and Human Services (2013). Use Cases. Retrieved from https://www.usability.gov/how-to-and-tools/methods/use-cases.html
[61] Pernice, K., & Budiu, R. (2016). Hamburger Menus and Hidden Navigation Hurt UX Metrics. Retrieved from https://www.nngroup.com/articles/hamburger-menus/
[62] Pernice, K., & Budiu, R. (2016). How to Make Navigation (Even a Hamburger) Discoverable on Mobile: Guidelines for Navigation on Mobile. Retrieved from https://www.nngroup.com/articles/find-navigation-mobile-even-hamburger/
[63] Budiu, R. (2015). Basic Patterns for Mobile Navigation. Retrieved from https://www.nngroup.com/articles/mobile-navigation-patterns/
[64] Hoober, S. (2013). How Do Users Really Hold Mobile Devices? Mobile matters. Retrieved from https://www.uxmatters.com/mt/archives/2013/02/how-do-users-really-hold-mobile-devices.php
[65] Moran, K. (2018). Helpful Filter Categories and Values for Better UX. Retrieved from https://www.nngroup.com/articles/filter-categories-values/

APPENDICES

APPENDIX A AGV GUIS

A.1 Mobile Industrial Robots (MiR)

MiR Analytics

MiR System log

MiR Error logs

MiR Hardware health

MiR Safety system

MiR Mission log

A.2 KUKA

KUKA Data Dashboard

KUKA Robot Filtering

KUKA Robot Details

KUKA Alerts

KUKA Condition monitoring

KUKA AGV Monitoring

APPENDIX B DATA COLLECTION

B.1 Interview guides

B.1.1 Developers

Participant: ______

1 Introduction

Welcome - Self-presentation (university, course of studies) - Reason for the focus group (Master Thesis, Robotics and Assistive Systems, Supervisor Fraunhofer: Falk Engmann) - The topic of the Thesis

Problem and aim of the Thesis - Problem, aim, background, results literature research - Achieving aim through: Interviews with developers (Fraunhofer IPA employees), interviews with end-users and usability tests

Rights explanation - Information sheet - Certificate of Consent

Introduction of the interviewees

- Education / Background - Career at the company (departments, positions)

Notes:

2 Main Part: Prerequisites

WHO? 1) Who are the users that deal with the monitoring process (background, degree of technical affinity, roles of the people)? 2) Who monitors which robot? Does one person monitor every robot, or are robots assigned to people?

Notes:

WHAT?

1) What needs to be monitored (also key figures)? From the perspective of the system administrator (which is what Fraunhofer also does) and from the perspective of end users. 2) What input device is suitable to do this?

3) What are the key figures by which the success is measured?

Notes:

WHEN? 1) When is monitoring carried out? 2) When do they need what information (navigation/information architecture)?

Notes:

WHERE? 1) Where do they do monitoring (environment, etc.)?

Notes:

HOW?

1) How do they identify the robots? By name, ID…? 2) How do they interact with the interface/control it? 3) How can manufacturer independence be guaranteed? 4) How often do they do monitoring?

5) What could a perfect monitoring GUI look like?

Notes:

3 End - Do they want to add anything else? - Thank you - For further information/results contact me

B.1.2 End-users

Company: ______

Participant: ______

Number: ______

1 Introduction

Welcome - Self-presentation (university, course of studies)

- Reason for the interview (Master Thesis, Robotics, and Assistive Systems, Supervisor Fraunhofer: Falk Engmann) - The topic of the Thesis

Problem and aim of the Thesis - Problem, aim, background, results literature research - Achieving aim through: Interviews with developers (Fraunhofer IPA employees), interviews with end-users and usability tests

Rights explanation - Information sheet - Certificate of Consent

Introduction of the interviewee

- Education - Career at the company (departments, positions) - Age

Notes:

1) Preliminary questions 1) Are you already using AGVs in your plant? If yes: for how long, what is your strategy, do you want to expand it, and are you using AGVs from different manufacturers? If no: have you planned to use them in the future? Notes:

2 Main Part

WHO? 1) Who are the people in your company that deal with the monitoring of AGVs/AMRs (background, degree of technical affinity, roles of the people)? 2) Are different roles needed for monitoring (administrator / user / etc.)? 3) Who monitors which robot? Does one person monitor every robot, or are robots assigned to people? 4) Who monitors which part? Does someone monitor everything, or is this separated?

Notes:

WHAT? 1) What is your task / what do you monitor?

2) What skills are needed to do this? 3) What functions do you need to be able to do this? 4) What input device is suitable for you to do this? Different devices for different tasks/people? 5) What AGV manufacturer(s) do you use in your logistics environment? 6) Do you trust such robots that move autonomously and use AI? If not, what information (or similar) do you need to increase that trust? 7) Do you have problems/frustrations with the current monitoring process/software? If any: Is it just inefficient, or also not effective? Can you accomplish all your goals, or is this not possible when manually integrating autonomous systems from different manufacturers? What needs to be improved to overcome these problems (wishes, expectations, needs)? 8) What do you like about the current situation / the current monitoring process/software?

Notes:

WHEN? 1) When is monitoring carried out? 2) When do you need what information (navigation / information architecture)?

Notes:

WHERE? 1) Where do you do monitoring (environment, etc.)?

Notes:

WHY? 1) Why is monitoring important for you / the company?

Notes:

HOW?

1) How do you identify the robots? By name, ID…? 2) How often do you do monitoring? (daily, weekly, etc.; is someone permanently working with the data all the time?) 3) How do you need the information? (over what time horizon do you look at the data?) 4) How can it be displayed in the best way? / What would your perfect monitoring GUI look like?

Notes:

3 End - Do you want to add anything else?

- Thank you - For further information you can always contact me

B.2 Usability Testing - Interview guide and task list

Participant: ______

Number: ______

1 Introduction

Welcome - Self-presentation (university, course of studies) - Reason for the usability testing (Master Thesis, Robotics, and Assistive Systems, Supervisor Fraunhofer: Falk Engmann) - The topic of the Thesis

Problem and aim of the Thesis - Problem, aim, background, results of literature research and data collection through interviews - Currently: development of prototypes. First, wireframes were developed and then a high-fidelity prototype with Adobe XD

Rights explanation - Information sheet

- Certificate of Consent

Introduction of the interviewee - Education - Career at the company (departments, positions) - Age

Notes:

1) Pre-test questions

1) Have you already participated in a usability test before? Balsamiq: Explanation of usability testing, highlighting that this is the first step of testing, using wireframes developed based on the data gathered so far. This testing is more informal, and the focus lies on the functions and on perception. Based on your feedback I will develop a more advanced prototype. Adobe XD: Explanation of usability testing, highlighting that this is the second step of testing. The first testing took place with wireframes developed based on the gathered data. This second testing takes place with a high-fidelity prototype developed with Adobe XD, a further developed version of the wireframes with the feedback incorporated. The goal of this second testing is to test the visual design.

2) Do you know what AGVs/AMRs are? If no, explanation of the terms

3) Have you dealt with them before? 4) Have you ever seen a GUI of a mobile robot?

Notes:

2 Main part – Task list Now I would like you to complete a number of tasks. Meanwhile, please think aloud the whole time. Tell me what you do, why you do it, why you can do something well, and why something is not so good or easy to complete. It is not you as a person who is being tested; you are testing the interface, and your feedback is very valuable to me. You are therefore encouraged both to criticize the interface and to point out its good aspects. 1) How would you describe what the start page (my dashboard) shows you? (without describing the widgets in detail) 2) Please switch to “Robots”. 3) What does the map show you? 4) Now you want to zoom in to view this map in more detail. 5) You want to zoom in even more. 6) Now you want to return to the default zoom setting (with one click). 7) You want to transform this view into full-screen.

8) You want to return to the initial screen. 9) Now you are interested in seeing only robots from a specific manufacturer. 10) Please close the filter. 11) Now you want to set all robots into pause mode (and start them again afterwards). 12) You now want to check which robot or robots have an error. Before you click anything: Which possibilities do you have? (Please choose the one with the notification.) 13) You want to investigate this error in more detail. (Closing it by clicking on the notification icon.) 14) Now you want to quickly check some information about a robot with the status “ok”. 15) You want to drive this robot around manually to overcome an obstacle. 16) Now you want to quickly check some information about a robot with the status “error”.

17) Can you see brief information about which error occurred? 18) Now you want to look at the robots in tile form instead of observing them on the map. 19) Now you want to look at the robots in list form. Please scroll to the right. 20) You want to change back to the tile view.

21) What information do you get about HULK? 22) You want to check its job in more detail. 23) What additional information / functions do you get? 24) Now you want to filter the jobs to search more specifically. 25) Now check some analyses about HULK itself. 26) Now you want to have a look at the logs. 27) What kinds of log types can you see here? 28) You want to go into more detail on a log.

29) What do the other icons indicate? 30) Now you want to explore the other log types. 31) Please have a look at the log you chose and explain what you see. 32) You want to explore even more log types. Please explain the chosen one to me.

33) Now you want to have a look at some analyses of the robots that are assigned to you. 34) You want to download the analyses. 35) You want to customize the widgets. 36) You want to check which filter is applied. 37) You want to know which robot has had the most errors so far.

38) You want to return to the Robots view.

Notes:

3 Post-test questions 1) How satisfied were you with this interface? 1 not at all ------7 very Supporting: ease of use, understanding of navigation / icons / buttons / language, consistency of design, necessity of functions, improvements

2) Why did you give a score of [X]?

Notes:

4 End - Do you want to add anything else? - Thank you

- For further information you can always contact me

B.3 Heuristic evaluation

Usability heuristics from Nielsen and Molich (1990) “1) Match between system and the real world. Designers should endeavor to mirror the language and concepts users would find in the real world based on who their target users are. Presenting information in logical order and piggybacking on user’s expectations derived from their real-world experiences will reduce cognitive strain and make systems easier to use. 2) User control and freedom. Offer users a digital space where backward steps are possible, including undoing and redoing previous actions. 3) Consistency and standards. Interface designers should ensure that both the graphic elements and terminology are maintained across similar platforms. For example, an icon that represents one category or concept should not represent a different concept when used on a different screen. 4) Recognition rather than recall. Minimize cognitive load by maintaining task-relevant information within the display while users explore the interface. Human attention is limited and we are only capable of maintaining around five items in our short-term memory at one time. Due to the limitations of short-term memory, designers should ensure users can simply employ recognition instead of recalling information across parts of the dialogue. Recognizing something is always easier than recall because recognition involves perceiving cues that help us reach into our vast memory and allowing relevant information to surface. […].

5) Flexibility and efficiency of use. With increased use comes the demand for less interactions that allow faster navigation. This can be achieved by using abbreviations, function keys, hidden commands and macro facilities. Users should be able to customize or tailor the interface to suit their needs so that frequent actions can be achieved through more convenient means. 6) Aesthetic and minimalist design. Keep clutter to a minimum. All unnecessary information competes for the user's limited attentional resources, which could inhibit user’s memory retrieval of relevant information. Therefore, the display must be reduced to only the necessary components for the current tasks, whilst providing clearly visible and unambiguous means of navigating to other content. 7) Help users recognize, diagnose and recover from errors. Designers should assume users are unable to understand technical terminology, therefore, error messages should almost always be expressed in plain language to ensure nothing gets lost in translation. 8) Help and documentation. Ideally, we want users to navigate the system without having to resort to documentation. However, depending on the type of solution, documentation may be necessary. When users require help, ensure it is easily located, specific to the task at hand and worded in a way that will guide them through the necessary steps towards a solution to the issue they are facing.” [57]

B.4 Overview participants

Interviews P1-P2: developer

P3-P8: end-user

Evaluation wireframes P9-P12: developer P13-P14: designer with focus on UX and UI

Evaluation prototype P15-P18: developer

APPENDIX C COLLECTED DATA

C.1 Requirements specification

Functional requirements

Pre-condition for all use cases: the GUI URL is accessed via a browser and the user is logged in to the web application.

Use case 1: Forming robot groups Actor(s): User with role admin Use case description: At system launch the admin creates groups of robots. Then the admin assigns robots to the defined groups. The system responds by showing the group names with the respective robots. Alternative 1: If a new robot is added to the system, the admin adds the robot to a defined group. Alternative 2: If a robot's group changes, the admin removes the robot from its group and adds it to its new group. Alternative 3: If a new robot group needs to be created, the admin creates a new robot group and assigns robots to it by changing the other groups or adding a new robot to the system.

Use case 2: Assigning user roles and robot groups Actor(s): User with role admin Use case description: The admin creates a new user and assigns a user role (visualization, worker, manager, admin), which entails different rights and different levels of detail to avoid overloading users with unnecessary functions. The system responds by adding this new user to a list of all users. Then the admin assigns robot groups to users. Alternative 1: If the user rights need to be changed, the admin changes the user role of a user. Alternative 2: If the robot responsibilities for a user need to be changed, the admin changes the robot groups assigned to this user.
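Use cases 1 and 2 can be sketched as a minimal grouping and role data model. This is only an illustration: all class and function names are hypothetical and not prescribed by the thesis; the robot names HULK and FRODO are taken from the usability tasks in Appendix B.

```python
# Minimal sketch of use cases 1 and 2: robot groups and user roles.
# All names (Fleet, Role, create_group, ...) are illustrative.
from dataclasses import dataclass, field
from enum import Enum


class Role(Enum):
    VISUALIZATION = "visualization"
    WORKER = "worker"
    MANAGER = "manager"
    ADMIN = "admin"


@dataclass
class Fleet:
    groups: dict = field(default_factory=dict)       # group name -> set of robot ids
    users: dict = field(default_factory=dict)        # user name  -> Role
    assignments: dict = field(default_factory=dict)  # user name  -> set of group names

    def create_group(self, name):
        self.groups.setdefault(name, set())

    def add_robot(self, group, robot_id):
        # Alternative 1: a newly added robot is placed into a defined group.
        self.create_group(group)
        self.groups[group].add(robot_id)

    def move_robot(self, robot_id, new_group):
        # Alternative 2: remove the robot from its old group, add it to the new one.
        for members in self.groups.values():
            members.discard(robot_id)
        self.add_robot(new_group, robot_id)

    def create_user(self, name, role):
        self.users[name] = role

    def assign_groups(self, user, *group_names):
        self.assignments[user] = set(group_names)


fleet = Fleet()
fleet.add_robot("assembly", "HULK")
fleet.add_robot("assembly", "FRODO")
fleet.move_robot("FRODO", "warehouse")
fleet.create_user("jane", Role.ADMIN)
fleet.assign_groups("jane", "assembly", "warehouse")
```

The role enum mirrors the four roles named in use case 2; how rights map to GUI functions is left open, as in the specification.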

Use case 3: Obtaining robot information Actor(s): User with role admin, manager, worker, visualization Use case description: The user checks the real-time location of the robot(s) by looking at a highly abstracted map, without any colors or street rules, representing the environment in which the robot moves. On this map the robots are visually displayed, and colors indicate the state of the robot(s) at a glance: ok, alert, error. If there is an error, real-time scan data is provided by the system. To obtain more information about a robot, the user clicks on the visually represented robot and the system provides information about: name, manufacturer, serial number, status of the robot (busy, available, unavailable, ok, error, alert), activity status (driving, standing, waiting, parking), current job information, availability of the robot (1 - (time in error / time in operation)), battery status, congruence between map and actual localization data in %, operating time, date of its commissioning, distance already driven in km over a given or self-specified time period, loading status, capacity utilization (1 - (waiting time / operating time)), velocity, overdue and upcoming maintenance, Wi-Fi connection strength, the last trip (time in seconds and date), time since last reset, a localize button (always visible), and an indication of the hardware health. Alternative 1: If the user wants to see only specific robot(s), the user applies a filter and the system adapts the view.

Pre-condition: The robots have undergone a mapping process and therefore know their environment.
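The availability and capacity-utilization figures listed in use case 3 are simple ratios; a minimal sketch, where the function names and the example numbers are illustrative, not from the thesis:

```python
# Sketch of the two key figures named in use case 3:
#   availability         = 1 - (time in error / time in operation)
#   capacity utilization = 1 - (waiting time  / operating time)
# Function names and example values are illustrative only.

def availability(time_in_error_h, time_in_operation_h):
    # Fraction of the operating time in which the robot was not in error.
    return 1.0 - time_in_error_h / time_in_operation_h

def capacity_utilization(waiting_time_h, operating_time_h):
    # Fraction of the operating time in which the robot was not waiting.
    return 1.0 - waiting_time_h / operating_time_h

# Example: 2 h in error and 8 h waiting over a 100 h operating period.
print(availability(2, 100))          # 0.98
print(capacity_utilization(8, 100))  # 0.92
```

Both figures fall in [0, 1] as long as the error and waiting times do not exceed the operating time, which makes them directly usable as percentages in the dashboard widgets.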

Use case 4: Changing the appearance of the map Actor(s): User with role admin, manager, worker, visualization Use case description: The user transforms the map showing the robots into full-screen in order to display it on a big monitor. The system then hides all other information except the map and the robots. To change the detail level of the map, the user zooms into or out of the map. The user sees more and more detail at higher zoom factors.

Use case 5: Obtaining job information Actor(s): User with role admin, manager, worker, visualization Use case description: The user switches the view to get the following information from the system about the jobs of the robots: status (interrupted, cancelled, in-progress, pending, completed), distance driven for a particular job, time spent on a particular job (throughput: x products/time), jobs of the present and past, which job is next, the processing status of the orders (start and finish), the transport order number, and the capacity of the robot. Alternative 1: If the user wants to see only a specific job, the user applies a filter and the system adapts the view. Alternative 2: If the user wants to delete a job, the user clicks the delete function next to the job and the system deletes it (only admin, manager, worker)

Alternative 3: If the user wants to prioritize a job, the user changes the order and the system moves the specific job in the queue upwards (only admin, manager, worker)
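Use case 5 and its three alternatives can be sketched as a small job-queue model. The class and function names and the order numbers are hypothetical; HULK and FRODO are robot names that appear elsewhere in this appendix.

```python
# Sketch of use case 5: job list with filter, delete, and prioritize.
# All identifiers and order numbers are illustrative.
from dataclasses import dataclass


@dataclass
class Job:
    order_no: str   # transport order number
    robot: str
    status: str     # interrupted / cancelled / in-progress / pending / completed


jobs = [
    Job("TO-101", "HULK", "in-progress"),
    Job("TO-102", "FRODO", "pending"),
    Job("TO-103", "HULK", "pending"),
]

def filter_jobs(jobs, status):
    # Alternative 1: the view adapts to show only jobs with the selected status.
    return [j for j in jobs if j.status == status]

def delete_job(jobs, order_no):
    # Alternative 2: admin, manager, or worker deletes a job (returns a new list).
    return [j for j in jobs if j.order_no != order_no]

def prioritize_job(jobs, order_no):
    # Alternative 3: move the job one position upwards in the queue.
    i = next(k for k, j in enumerate(jobs) if j.order_no == order_no)
    if i > 0:
        jobs[i - 1], jobs[i] = jobs[i], jobs[i - 1]
    return jobs


prioritize_job(jobs, "TO-103")  # TO-103 moves ahead of TO-102
```

A single upward swap per click matches the "moves the specific job in the queue upwards" wording; repeated clicks would move a job further up.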

Use case 6: Error and alert notification Actor(s): User with role admin, manager, worker, visualization

Use case description: The system receives an error or alert and immediately displays it. The user gets this information directly, no matter which action the user is performing. Alternative 1: If the user does not understand the error/alert message, the user accesses the user manual directly, no matter which action the user is performing.

Use case 7: Access historical data Actor(s): User with role admin, manager, worker, visualization Use case description: The user inspects graphs on a dashboard to easily analyze historical data for root-cause analysis, in order to be able to take protective measures at an early stage. Alternative 1: The user accesses the historical data in the form of logs (error log, alert log, job log, maintenance log, change log, and a system log with events logged by the operating system components) and investigates the actions in detail.
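Use case 7, alternative 1, can be illustrated with a minimal log model. The log types follow the list above; the entries, field names, and messages are invented purely for illustration.

```python
# Sketch of use case 7, alternative 1: historical data as typed log entries.
# Entries, field names, and messages are illustrative only.
from datetime import datetime

LOG_TYPES = {"error", "alert", "job", "maintenance", "change", "system"}

log = [
    {"time": datetime(2020, 5, 4, 9, 15), "type": "error", "robot": "HULK",
     "message": "Obstacle blocking path"},
    {"time": datetime(2020, 5, 4, 9, 20), "type": "job", "robot": "FRODO",
     "message": "Transport order completed"},
    {"time": datetime(2020, 5, 4, 9, 25), "type": "alert", "robot": "HULK",
     "message": "Battery below 20 %"},
]

def by_type(log, log_type):
    # Corresponds to selecting one log type in the dropdown instead of
    # the combined "all log types" view.
    assert log_type in LOG_TYPES
    return [entry for entry in log if entry["type"] == log_type]

errors = by_type(log, "error")
```

Keeping one combined, time-ordered list and filtering it per type mirrors the GUI's "all log types" view with a per-type dropdown.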

Use case 8: Setting robots in pause/stop

Actor(s): User with role admin, manager, worker

Use case description: The user clicks on a pause/stop button and the system sets all robots from a driving state into a pause or stop state. Use case 9: Manual takeover

Actor(s): User with role admin, manager, worker Use case description: The user takes over the control of the robot and drives it around manually.
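Use cases 8 and 9 amount to simple state transitions on the fleet. A minimal sketch; the state names follow the requirements, while the function names and the example robots are illustrative:

```python
# Sketch of use cases 8 and 9: fleet-wide pause and manual takeover.
# Function names and the example robot states are illustrative.

robots = {"HULK": "driving", "FRODO": "driving", "YODA": "error"}

def pause_all(robots):
    # Use case 8: every robot in a driving state is set to pause;
    # robots in other states (e.g. error) are left untouched.
    for name, state in robots.items():
        if state == "driving":
            robots[name] = "paused"

def take_over(robots, name):
    # Use case 9: the user takes over control and drives one robot manually.
    robots[name] = "manual"

pause_all(robots)
take_over(robots, "HULK")
```

Leaving non-driving robots untouched is one reading of the specification; whether a robot in an error state may be paused or taken over would need to be clarified with the end-users.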

Non-functional requirements

The GUI shall…

Structural requirements - be clean, simple, concise: no redundant components, using the minimum of necessary functions, use of standards - be coherent: the different parts of the GUI are related (logically, orderly, consistently) - be configurable and modular: the GUI is divided into several components where possible to make it self-assembling for the different needs of the users - give the opportunity to access the different pages at all times - give a constant overview but also intuitive access to individual vehicles - offer informative and continuous feedback so that the user feels confirmed and understood in his actions - offer easy reversal of actions

- reduce people's short-term memory load - offer recognition rather than recall - be easy to access/use/learn/remember how to use - be self-explanatory, or offer additional information so that users are not forced to look up unclear points in the manual - introduce users as intuitively as possible to the use of the software

- offer the needed functions and information to the user at the right time

Environmental requirements - be accessible via computer, laptop and mobile phone - be accessible from the browsers Chrome, Firefox, Internet Explorer and Safari - be accessible from every operating system - be usable in landscape and portrait view on mobile devices

Usage qualities:

- be customizable - be filterable

- be intuitive - be understandable (no need to look up the user manual) - be suitable - be accessible 99.9% of the time - be elegant (aesthetics) - be fun and attractive to use - be enjoyable to use - use well-known icons

- use colors - be zoomable via mouse or fingers (depending on the device) - use non-technical language that does not need to be looked up - make use of clearly understandable (error) messages

- allow the achievement of the user goals

Management qualities: - be accountable - be maintainable

- be flexible

Generic environmental qualities: - be effective - be efficient

- be complete

C.2 Usability Testing

C.2.1 Wireframes Balsamiq

1. General improvements

1.1 Top navigation bar - rename “Home” to “Robots”, as all the information there is robot information

1.2 Robot map/tile/list view - move the three icons indicating map/tile/list view vertically below the filter, because otherwise they could appear to belong to the filter (visual separation) - add a notification box directly in the different views that indicates when an error occurs, because the “red 1” icon next to the notification icon does not highlight the error enough; especially in the tile and list views an error is not directly visible without scrolling

1.3 Robot map view - make sure in the next, more developed prototype that it is also possible to zoom with the mouse wheel and with fingers on the touchpad - include an icon that returns the map to the initial zoom setting with one click

- make the circle that indicates the state of the robot more prominent, because it is very small, thin and almost hidden, so it seems to be unimportant - change the visual appearance of the robots (same form for robots from the same manufacturer) - add all robots to the map and highlight the ones the user is responsible for - move the information box that appears when clicking on a robot to another part of the map (e.g. to the right) instead of opening it directly below the robot, because otherwise the map is partly hidden, but it should always be visible to the user - add the name and short job information directly on the robot so that this is visible without clicking on it

1.4 Robot tile view - provide a solution with different levels of detail by adding a list view (similar to the logs) with less information, because it could be a problem to get a good overview without a lot of scrolling if there are 20 robots or more - joystick: redirect the user back to the map view when clicking on the joystick in the tile view, because otherwise the tile next to it needs to be shifted to make room for it

1.5 Robot information - joystick: include an auto and a manual button so that the user can switch to manual mode and indicate that the robot is now being taken over manually

- joystick: the arrow is good, as otherwise it might not be clear that it can be clicked, but move it below the joystick icon instead of placing it at the edge of the rectangle - don't visually separate the information about in how many minutes the job will be completed and its current status from the other job information - in addition to the error message, add information that tells the user how to solve the error

1.6 Jobs - separate the jobs belonging to the same heading more clearly, for example through a line or box - make the different parts of the summary at the top of the page clickable so that the user can jump directly to the job status of interest. This entails that it needs to be clear that this information is clickable and not only text

1.7 Analysis

- add a general download button next to the customize button to make it possible to download everything at once instead of only single widgets

1.8 Logs - add an indication to the error and alert logs of whether the error/alert is still present or already resolved - highlight current errors/alerts in “all log types”

2. Incomprehensibility 2.1 Top navigation bar - change the play button into a pause button in order to show the action that can be performed and not the current state (adapting this to standard use, as is well known from music players, for example). This entails that the small dropdown is no longer suitable and there need to be two icons (pause and stop) to indicate that these two options are available - add a description to the stop icon, as otherwise it is not clear how the pause and stop icons differ in meaning - use a charging or home icon instead of the stop icon and write “back to charger” or “back home” instead of “back to station”, as otherwise it is unclear where the robot goes

2.2 Robot map/tile/list view - change the list icon, as it was misunderstood as a control through which more information can be shown or hidden - change the map icon, as it could not be identified easily because it looks more like a letter

- remove the opening arrow from the filter icon or directly write “Filter”

2.3 Robot information - add a description to the percentage figure as it is unclear what this is meant to say

- make the joystick icon itself also clickable, not only the arrow - make it clearer that the information shown (e.g. completed in x minutes) relates only to the job the robot is currently performing, not to all of its jobs

2.4 Jobs - change the arrow icon, as it was misunderstood and interpreted as downloading instead of prioritizing - add the possibility to click directly on a row in order to get more information about the log - add “all log types” to the dropdown, as otherwise it is not clear to the user how to return to this view

2.5 Logs - change the meaning of the eye icon from “details” to observing/watching because it could be interesting to put logs on a watchlist to keep an eye on them (adding a “details” button for the details instead)

3. Inconsistencies 3.1 Top navigation bar - unify the buttons so that “more” and “details” are not both used for the same purpose - rename “missions” to “jobs” in the top navigation bar, buttons and headings, as these two words are both used for the same purpose

3.2 Jobs - replace the arrow between the two stations indicating the start and end point of the job with another arrow or line, because the same symbol is already used for the play mode

- add a second box on top of the existing one in the list of completed jobs with information about the robot which has executed the job to be consistent with the design of the current jobs

3.3 Logs - all log types: to be consistent, use dashes for empty entries instead of just leaving the entry empty, as is already done for log entries that contain only one specific log type - for all logs: change Page 1 to Page 1 of X so that the user has an idea how many pages there are - change the eye icon for more information into a “detail” button, as otherwise it is inconsistent how to get more detailed information

Average overall score: 5.6 out of 7 for the following reasons Positive aspects - the use of icons instead of lots of text - icons are easy to understand and well-known

- the top navigation bar is clearly described - the highlighted texts show clearly where the user is located - compared to other interfaces it is intuitive and easy to navigate - the possibility to customize the analysis is very good so that everyone can create a personal dashboard with the needed widgets - generally easy to use - the overall flow - always having the possibility to filter

- clear structure - the expectations could be fulfilled - error messages are clearly worded and understandable - the possibility to get information about a robot by clicking on it in the map view - the offering of three different views (map/tiles/list) - the possibility to zoom in the map view to change the level of detail - the possibility to move flexibly and quickly - that you never really have to go deep into the pages - tidy, orderly, minimalist

- analysis

Negative aspects - the suggested improvements already mentioned during the testing - highlight information that is more important than other information - “Jobs” and “Robots”: a summary is missing to get information directly in one view with no need to scroll - be careful with the final design: keep it as minimalistic as possible (also with the functions)

C.2.2 Prototype Adobe XD

1. General improvements 1.1 Top navigation bar

- reduce the shadow - visually separate the pause and stop icons from the other icons in the top navigation bar, as the pause and stop icons relate to the robots while the other icons relate to the user and settings - add a confirmation warning to the pause and stop icons that the user needs to confirm before the action can be performed

1.2 My Dashboard - make the filter icons smaller

1.3 My Dashboard and robot map view - error message: move the arrow and the circle up a bit so that they are in line with the rectangle

1.4 Robot map view - when clicking on FRODO, some information from HULK below shines through - it is not possible to close the joystick - make a border around the map when no zoom is applied, to make it clear that the complete map is displayed and the user is not only looking at a part of it - add a timer function to the full-screen mode so that the hover information disappears when the mouse has not been moved for a short time

1.5 Robot list view

- prevent horizontal scrolling (idea: make two different lists, one containing job information and one containing robot information, although then this information cannot be displayed together; or write the information that does not fit in the line below it, although the line height must then be increased)

1.6 Jobs

- move interrupted jobs up so that they come first, before current jobs
- reduce the shadow of the separation lines

1.7 Analysis

- manufacturer widget: make it clearer which data the lines belong to
- the use of the color blue for the customizations within a widget highlights those settings too much and distracts from the important graphs

1.8 Different views

- change the horizontal scrolling from "click" to "swipe" if possible
- open filter: move the search icon from the left to the right
- change the information icon behind the error messages to a search icon in order to establish a direct connection to the search, in which the user gets suggestions on how to solve the error
- joystick: to make the joystick accessible with fewer clicks and also usable remotely, add it to the right side; otherwise a huge box opens below the robot in the map view, especially when driving the robot manually

2. Incomprehensibility

2.1 Top navigation bar

- change the text from "Details" to "Error log" in the opened notification

2.2 Robot map view

- make it clearer where the front of the robot is (even though this might be clear once the robots are actually moving on the map)
- zoom: change the color of the robot; it currently has the same color as the street, so the colored path could be mistaken for the path of the particular robot driving there rather than a general one-way street
- zoom: change the plus and minus icons to grey when they cannot be clicked (the plus icon when the highest zoom factor is already applied, and the minus icon when the user is looking at the map without any zoom, which cannot be zoomed out further)

2.3 Jobs

- make the separation between the different sections clearer (idea: draw a box around each section instead of only using the separation line)

2.4 Logs

- remove the deleted logs from the list and put them either at the bottom or in an archive, for example, but do not show them as deleted in the log itself anymore

2.5 Different views

- change the icon indicating that the robot is loaded (make it look more like it is transporting a package; otherwise it looks like a normal robot)

3. Inconsistencies

3.1 Top navigation bar

- add a hover for the pause icon (same as for the home icon)

3.2 Robot map view

- make the home positions the same size; they currently have different heights

3.3 Analysis

- adjust the margins between the first two widgets to the same size as between the other widgets (they are currently bigger)

3.4 Logs

- put the content of the dropdown (the log type selection) in the filter to be consistent with the other pages
- keep the log type column on every log page instead of only in the "all log types" log, so that it is even clearer which log type the user is currently looking at

3.5 My Dashboard and Analysis

- change the customize button into an icon (e.g. a pencil), because icons are used for everything else

3.6 Different views

- make the error message uniform: it can sometimes be folded out and is sometimes directly visible

Average overall score: 5.6 out of 7, for the following reasons

Positive aspects
- very high usability
- clear / clearly arranged
- the top navigation bar gives a good overview
- the icons are easily understandable
- the user gets all the information they are looking for
- good idea with the widgets and the customizability on the dashboard and analysis views

- design is just right: orderly and organized

Negative aspects
- the improvements already suggested during the testing
- add more colors

C.3 Heuristic Evaluation

Heuristics violated: 1, 2, 3, 4

Heuristic 1)
- rename "Robot details" to "Robot analysis", as the name "Robot details" creates a different expectation

Heuristic 2)
- add breadcrumbs to make it easier to step backward
- add a new page called "My Dashboard" that can be customized in the same way as "Analysis", to give the user the possibility to compose a personalized view from different components of the whole interface
- add back/forward buttons in the interface instead of only giving the user the possibility to use those of the browser

Heuristic 3)
- change the icon for resetting the zoom into the one currently used for full screen, as the chosen icon is commonly used to express updating/refreshing
- change the icon for full screen into another one that is commonly used for this purpose
- change the eye icon slightly, because while it can be used for the purpose of observing, it is more commonly used to show/hide things
- add breadcrumbs, as this is a standard way to make the current position clearer to the user

Heuristic 4)
- offer the user information about the meaning of the icons etc. when hovering over them

APPENDIX D DEVELOPED GUI PROTOTYPE

Prototype - My Dashboard

Prototype - Robots map view

Prototype - Robots map view (zoom example)

Prototype - Robots tile view

Prototype - Robots list view

Prototype - Jobs

Prototype - Analysis

Prototype - Logs

TRITA-EECS-EX-2020:506

www.kth.se