Visualization and Validation of Safety Parameters for Industrial Robots Using the HoloLens 2
Linköping University | Design Master thesis, 120 credits | Spring 2020
ISRN number: LiU-ITN-TEK-A--20/014--SE
Stefan Nikolov
Supervisor: Ahmet Börütecene
Company Supervisor: Elina Lindqvist
Examiner: Jonas Löwgren

Visualization and Validation of Safety Parameters for Industrial Robots Using the HoloLens 2

Abstract

This thesis is a technical and conceptual exploration of interactions that can help engineers configure sensitive, precise data for industrial ABB robots using holographic technology. The purpose is to improve the precision of the process and to speed up the commissioning of robots for clients. The author presents an analysis of various approaches to using holograms for industrial purposes, trialed and tested on a holographic display headset, the HoloLens 2. Through a workflow of sketching, concept rendering, remote meetings with employees of the robotics company ABB, and technical implementation of all ideas in the Unity 3D engine, the author delivered and presented a complete demo of a HoloLens 2 application that allows an engineer to superimpose a robot hologram on top of a real-world robot of the same model and verify that the robot passes all safety requirements necessary for it to be commissioned for shipping. This demo was presented to the Design Master class of 2019 at Linköping University, to colleagues at ABB who have no experience with verifying safe-zone parameters, to software engineers who actively work with robot safety parameters, and to designers who create the user interface for updated iterations of the programs currently used to verify and commission robots.

Table of Contents

1. Introduction ..... 5
1.1 General information on communicating with ABB-developed robots ..... 5
1.2 A primer on safety zones ..... 5
1.3 Quick review of how robots are (presently) verified and commissioned ..... 7
1.4 Brief introduction to the HoloLens 2 platform ..... 8
1.5 Unique features of the HoloLens 2 ..... 10
1.6 Effects of the global coronavirus pandemic on the project ..... 11
2. Research Question ..... 12
2.1 Company objective ..... 12
2.2 Determining the academic contribution of this project ..... 13
2.3 Understanding what a HoloLens 2 “assistant” to RobotStudio should be capable of ..... 15
2.4 Delimitations (or lack thereof) ..... 16
3. Background Research ..... 17
3.1 Advantages and disadvantages of mixed reality over standard methods ..... 17
3.2 Image tracking in AR – research and conclusions ..... 17
3.3 Visualizing safety zones around a robot ..... 19
3.4 Using the HoloLens 2 to enhance worker safety on-site ..... 20
3.5 Exploring tactile communication beyond the HoloLens ..... 21
4. Workflow and Project Focus ..... 23
4.1 General outline of the work process ..... 23
4.2 Ideation process ..... 24
4.3 Preliminary questions ..... 26
4.4 Collecting feedback and narrowing the direction of the prototype ..... 27
4.5 Confirming the viability of a prototype using think-aloud and cognitive walkthrough methods ..... 28
4.6 Merits of coding and testing all concept interactions ..... 29
5. Setup and Interaction Methods ..... 31
5.1 Technical landscape ..... 31
5.2 Technical setup and early demo features ..... 32
5.3 Hand tracking, raycasting from the palm and hand-attached interface ..... 33
5.4 Eye gaze tracking ..... 34
5.5 Eye gaze selection ..... 35
5.6 Speech recognition ..... 35
5.7 Spatial awareness ..... 36
6. Prototyping ..... 37
6.1 User alias, creating, saving and loading holographic environments ..... 37
6.2 "Jogging" a robotic hologram using hand ray-casting ..... 37
6.3 Interacting with menus and their location in 3D space ..... 39
6.4 Palm-activated menus and hand-attached gizmos ..... 40
6.5 Near-menus, follow distance & orientation of UI elements that track the user ..... 41
6.6 Horizontal vs vertical spacing of UI elements ..... 42
6.7 Visual cues to verify safety zones ..... 43
6.8 Collision logs and comments ..... 44
6.9 Functionality of the end product ..... 45
7. Validation – Menus ..... 47
7.1 Field of view real estate – opportunities and limitations ..... 47
7.2 Raycast limitations and adaptations to collision size ..... 48
7.3 Summoning, interacting with menus and attaching them to points in the environment ..... 49
7.4 Constellation-type menus ..... 51
7.5 Multi-layered hand-menu panels ..... 51
7.6 Setup menus ..... 53
8. Validation – Robot Interactions ..... 55
8.1 Manipulating robot components ..... 55
8.2 Selecting joint properties ..... 56
8.3 Sliders ..... 57
8.4 Modifying imported safety zones ..... 58
8.5 Selecting specific points along a safety zone and attaching comments ..... 59
8.6 Detecting and logging speed and boundary violations ..... 60
8.7 UI with a follow behavior ..... 61
8.8 Tooltips and the “help me” button ..... 62
8.9 Simulating RobotStudio paths ..... 62
9. Conclusion ..... 63
9.1 A user-based perspective on the delivered prototype ..... 63
9.2 Lessons learned .....