End-To-End and Direct Human-Flying Robot Interaction
by

Valiallah (Mani) Monajjemi

M.Sc., Mechatronics Engineering, Amirkabir University of Technology, 2011
B.Sc., Electrical Engineering, Amirkabir University of Technology, 2008

Dissertation Submitted in Partial Fulfillment of the Requirements for the Degree of
Doctor of Philosophy
in the
School of Computing Science
Faculty of Applied Sciences

© Valiallah (Mani) Monajjemi 2016
SIMON FRASER UNIVERSITY
Fall 2016

All rights reserved. However, in accordance with the Copyright Act of Canada, this work may be reproduced without authorization under the conditions for “Fair Dealing.” Therefore, limited reproduction of this work for the purposes of private study, research, education, satire, parody, criticism, review and news reporting is likely to be in accordance with the law, particularly if cited appropriately.

Approval

Name: Valiallah (Mani) Monajjemi
Degree: Doctor of Philosophy (Computing Science)
Title: End-To-End and Direct Human-Flying Robot Interaction

Examining Committee:
  Chair: Dr. Brian Funt, Professor
  Dr. Richard Vaughan, Senior Supervisor, Associate Professor
  Dr. Greg Mori, Supervisor, Professor
  Dr. Arrvindh Shriraman, Co-Supervisor, Assistant Professor
  Dr. Nick Sumner, Internal Examiner, Assistant Professor
  Dr. Steven Waslander, External Examiner, Associate Professor, Department of Mechanical and Mechatronics Engineering, University of Waterloo

Date Defended: August 19, 2016

Abstract

As the application domain of Unmanned Aerial Vehicles (UAVs) expands to the consumer market, and with recent advances in robot autonomy and ubiquitous computing, a new paradigm for Human-UAV Interaction has started to form. In this new paradigm, humans and UAV(s) are co-located (situated) and use natural and embodied interfaces to share autonomy and communicate.
This is in contrast to the traditional paradigm in Human-UAV Interaction, in which the focus is on designing control interfaces for remotely operated UAVs and on sharing autonomy among human-UAV teams. Motivated by application domains such as wilderness search and rescue and personal filming, we define the required components of end-to-end interaction between a human and a flying robot as (i) interaction initiation, (ii) approach and re-positioning to facilitate the interaction, and (iii) communication of intent and commands from the human to the UAV and vice versa.

In this thesis we introduce the components we designed for creating an end-to-end Human-Flying Robot Interaction system, namely: (i) a fast monocular computer vision pipeline for localizing stationary periodic motions in the field of view of a moving camera; (ii) a cascade approach controller that combines appearance-based tracking and visual servo control to approach a human using a forward-facing monocular camera; (iii) a close-range gaze- and gesture-based interaction system for communicating commands from a human to multiple flying UAVs using their on-board monocular cameras; and (iv) a light-based feedback system for continuous communication of intent from a flying robot to its interaction partner.

We provide experimental results for the performance of each individual component, as well as for the final integrated system in real-world Human-UAV Interaction tests. Our interaction system, which integrates all these components, is the first realized end-to-end Human-Flying Robot Interaction system whereby an uninstrumented user can attract the attention of a distant (20 to 30 m) autonomous outdoor flying robot. Once interaction is initiated, the robot approaches the user to close range (≈ 2 m), hovers facing the user, then responds appropriately to a small vocabulary of hand gestures, while constantly communicating its state to the user through its embodied feedback system.
All the software produced for this thesis is Open Source.

Keywords: Human-Robot Interaction, Unmanned Aerial Vehicles, Flying Robots

Dedication

To my lovely parents. For their love, sacrifices and continuous support.

Acknowledgements

The process of earning a doctorate and writing a dissertation is a long journey, one full of adventures, challenges and unique experiences. I was truly blessed to share every moment of this quest with Shokoofeh Pourmehr, whose love and unconditional support helped me immensely to overcome all obstacles and meet my academic goals. Shokoofeh made this a journey to remember and I am deeply grateful for that.

I would like to sincerely thank my senior supervisor, Professor Richard Vaughan, for his continuous support, encouragement, advice, mentorship and, above all, his patience. I appreciate his vast knowledge and domain expertise, as well as his help in writing manuscripts, designing software systems and crafting new ideas. Without his guidance this research and dissertation would not have been possible.

I would also like to thank my supervisor, Professor Greg Mori, for supporting my research and providing me with thoughtful feedback at different stages of the project. He taught me how to be a critical thinker and encouraged me to always aim for the best. In addition, I would like to thank my PhD committee member, Professor Arrvindh Shriraman, for his support and valuable feedback.

Lastly, I would like to thank all my colleagues and friends at the SFU Autonomy Lab who shared the graduate life experience with me and contributed to my research through their criticism, support and encouragement: Jens Wawerla, Abbas Sadat, Jake Bruce, Jacob Perron, Jack Thomas, Lingkang Zhang and Sepehr Mohaimenianpour.

Table of Contents

Approval
Abstract
Dedication
Acknowledgements
Table of Contents
List of Tables
List of Figures

1 Introduction
  1.1 List of Publications

2 State of The Art in Human-Flying Robot Interaction
  2.1 Human Factors and Remote Interaction with UAVs
    2.1.1 Human Supervisory Control of UAVs
    2.1.2 Interfaces for Remote Interaction with UAVs
  2.2 Situated Interaction With UAVs
    2.2.1 Interaction Initiation
    2.2.2 Communication of Intent and Commands to UAVs
    2.2.3 Communication of Intent from UAVs to a Human
  2.3 Approach and Repositioning in Human-UAV Interaction
  2.4 Conclusion

3 Close-range Situated Interaction with a Group of Flying Robots through Face Engagement and Hand Gestures
  3.1 Introduction
  3.2 Background
    3.2.1 Gaze and Gesture in Human Robot Interaction
    3.2.2 Robot Selection And Task Delegation
    3.2.3 Human Robot Interaction with UAVs
  3.3 Method
    3.3.1 Position Estimate and Control
    3.3.2 Face Detection and Tracking
    3.3.3 Motion Cancellation and Gesture Recognition
    3.3.4 Commanding the Vehicle
  3.4 Demonstration
    3.4.1 Three-Robot Experiment with Marker-Based Localization
    3.4.2 Two-Robot Experiment with Feature-Based Localization
    3.4.3 Integrating Multi-Modal Interfaces to Command UAVs
    3.4.4 Discussion
  3.5 Conclusion

4 Explicit Interaction Initiation with a Flying Robot Using Stationary Periodic Motions
  4.1 Introduction
  4.2 Background
  4.3 Method
    4.3.1 Camera Motion Estimation and Ego Motion Cancellation
    4.3.2 Salient Object Detection and Tracking
    4.3.3 Periodicity Detection
    4.3.4 Tuning the Parameters
    4.3.5 Platform and Implementation
  4.4 Experiments
    4.4.1 Datasets
    4.4.2 Closed-Loop Experiments with an Outdoor UAV
    4.4.3 Runtime Performance
  4.5 Conclusion

5 An End-To-End, Direct Human-Flying Robot Interaction System
  5.1 Introduction
  5.2 Background
  5.3 Method
    5.3.1 Hardware Platform
    5.3.2 Interaction Initiation using Periodic Gestures
    5.3.3 Visual Tracker
    5.3.4 Cascade Controller for Approaching the User
    5.3.5 Close-range Interaction
    5.3.6 Communication of Intents from the UAV to the User
  5.4 Experiments
    5.4.1 Approach Controller
    5.4.2 Outdoor Experiments
  5.5 Conclusion

6 Conclusion and Future Work
  6.1 Future Work

Bibliography

Appendix A Supplementary Source Code and Media
  A.1 Open Source Software
  A.2 Media

List of Tables

Table 2.1 Parameters of the Laban Effort System proposed by Sharma et al. [163] for composing communicative flight trajectories (© 2013, IEEE)
Table 2.2 Comparison of the state of the art in situated and end-to-end interaction with flying robots with the goals of this thesis
Table 3.1 Result summary for the three-robot experiment. Si, Di and Ci mean issuing the Select, Deselect and Command gesture to the ith robot. Unintended outcomes are marked by overstrikes.
Table 3.2 Result summary for the two-robot experiment. Si, Di and Ci mean issuing the Select, Deselect and Command gesture to the ith robot. Unintended outcomes are marked by overstrikes.
Table 4.1 Properties of video streams, parameters used for each experiment, and average execution time per frame (Section 4.4.3)
Table 4.2 The accuracy of hand-waving detection for each experiment (DT: Detection Rate, FDR: False Detection Rate, MR: Miss Rate)
Table 4.3 False Detection Rate (FDR) for walking and running actions (UCF-ARG and KTH datasets)
Table 4.4 Outcome of all trials. C1: Location 1, Late Afternoon, Overcast; C2: Location 1, Noon, Overcast; C3: Location 2, Noon, Overcast; C4: Location 2, Late Afternoon, Sunny; (r): running, (w): walking
Table 4.5 Mean per-frame execution time breakdown for each component of the pipeline (in milliseconds)
Table 5.1 Animations used for providing light-based feedback to the user, their corresponding states and metaphors
Table 5.2 Summary of failures in the end-to-end outdoor experiments

List of Figures

Figure 2.1 Hierarchical model for human supervisory control of UAVs [67] (© 2015, Springer Science)