
TECHNICAL UNIVERSITY OF CATALONIA (BARCELONATECH)

Doctoral Programme: AUTOMATIC CONTROL, ROBOTICS AND COMPUTER VISION

PhD Dissertation

THE ROBOT NULL SPACE: NEW USES FOR NEW ROBOTIC SYSTEMS

Josep-Arnau Claret i Robert

Supervisor: Luis Basañez Villaluenga

May 2019

To my parents.

Abstract

With the new advances in the field of robotics, the kinematic complexity of robotic systems has dramatically increased, whether by the addition of more joints to standard robots or by the appearance of new types of robots, like humanoid robots and Unmanned Air Vehicles (UAVs), and of new scenarios, like the cooperative exploration of environments by groups of robots or the interaction between robots and humans, among others. This is especially sensitive in tasks shared between a user and a machine, as in a teleoperation scenario. In these situations, a good compromise between the user input and the degrees of freedom managed by the system is critical, and a failure to properly adjust the system autonomy can lead to under-performance and hazardous situations.

This dissertation proposes new approaches to address the autonomy of a robotic system through the robot redundancy, that is, the degrees of freedom left over once the main task is solved. These approaches make it possible to cope with the increasing complexity of robotic systems and to exploit that complexity to simultaneously solve a hierarchy of multiple tasks with different levels of priority. In particular, in this work the redundancy is addressed using the robot null space for two different robotic systems and goals.

Envisioning a world where users and humanoid robots increasingly share space and tasks, in the first scenario the redundancy of a humanoid robot is exploited to execute a novel task with low priority: the conveyance of emotional states to humans. A model is developed that transforms emotions, represented as points in a three-dimensional space, into kinematic features of the robot, and the proposed approach is implemented on a Pepper robot. Further, the results of a user study to assess its validity are presented, together with its conclusions and future directions.

In the second scenario, a novel robotic system for teleoperation is presented. This system results from the integration of a mobile manipulator, a UAV and a haptic device. The user commands the end effector of the mobile manipulator with the haptic device while receiving real-time feedback from a camera mounted on the UAV, which takes the role of a free-flying camera. A description of this robotic system and its task hierarchy is given in the second part of the thesis, along with algorithms to coordinate the camera and the mobile manipulator in such a way that the command of the camera is transparent to the operator. Moreover, the null space of the robot is exploited to improve the teleoperation experience in two directions: first, by proposing two enhancements of the continuous inverse; second, by using the null space of the mobile manipulator to prevent its TCP from being occluded from the camera by the robot itself, which, to the author's knowledge, is a novel use of the robot null space. Finally, the proposed algorithms are implemented on the Barcelona Mobile Manipulator and a Parrot AR.Drone.

Overall, new contributions to the management of robot redundancy are presented in this dissertation, along with their implementation and validation, which expand the knowledge of this particular sub-field of robotics.
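As background for the null-space based approaches summarized above, and which are developed formally later in the thesis, the following is a minimal sketch of the classical first-order redundancy resolution scheme; the notation ($J_1$, $\dot{x}_1$, $\dot{q}_0$) is generic and not necessarily the one adopted in the text:

\[
    \dot{q} \,=\, J_1^{+}\,\dot{x}_1 \,+\, \left(I - J_1^{+} J_1\right)\dot{q}_0 ,
\]

where $J_1^{+}$ denotes the pseudoinverse of the main-task Jacobian $J_1$, $\dot{x}_1$ is the desired main-task velocity, and $\dot{q}_0$ is an arbitrary joint velocity implementing a secondary objective (for instance, conveying an emotion or keeping the TCP visible to the camera). The projector $I - J_1^{+} J_1$ maps $\dot{q}_0$ into the null space of $J_1$, so the secondary motion does not perturb the main task.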
Acknowledgments

I thank Professor Luis Basañez for the opportunity to delve into the realm of robotics and accomplish this dissertation. This thesis would not have been possible without his knowledge, guidance, words of encouragement at difficult times, and amusing conversations.

I would also like to thank Raúl Suárez and Jan Rosell. The first, for opening the doors of the IOC to me, many years ago, when I was still an undergraduate student, full of futuristic dreams of robots and spaceships. The second, for his invaluable support during all these years, for the trip to Lausanne, and for giving me the opportunity to contribute to the Kautham Project.

I thank Gentiane Venture for the warm welcome during my four-month internship in her lab, which resulted in the first part of the thesis. Also, the members of the GVLab, for the translations into Japanese and for attending to the local participants during the user study. I cannot stress enough the impact those months in Tokyo had on me, among the temples and giant buildings of this amazing city, memories I will treasure for the rest of my life.

This dissertation would not have been possible without the technical discussions, chats and laughs at the IOC. I want to thank Leo for always being ready to set up the hardware and software that are at the base of this work. And, also, Orestes, Carlos Aldana, Isiah, Marcos, Andrés, Henry, Gema, Néstor, Diana, José, Nacho, Abiud, Niliana, Carlos Rosales, Sergi, Emmanuel, Fernando, Carlos Rodríguez, Alexander, Paolo, Ali, Muhayyuddin, Diab, and many others I have met during these exciting years.

Finally, thanks to my friends and relatives, who will be pleased to know that I have successfully closed this period of my life, and who are eager to see, as I am, which one will come next. And, last but not least, to my parents, for their patience, faith, and unconditional love. Among so many other things, I owe them my interest in books since I was a child, which is at the root of my admiration and awe for scientists, science and its endless possibilities.

Josep-Arnau Claret i Robert. Barcelona, Spain. May 2019.

This work has been partially supported by the Spanish MINECO projects DPI2013-40882-P, DPI2014-57757-R and DPI2016-80077-R, the Spanish predoctoral grant BES-2012-054899, and the Japanese challenging exploratory research grant 15K12124.

Contents

Abstract
Acknowledgments
Notation and Acronyms

1 Introduction
1.1 Context and motivation
1.2 Objectives
1.3 Outline of the Thesis

2 State of the Art
2.1 Robotic Systems
2.1.1 Teleoperation
2.2 Exploiting the Redundancy
2.3 Using a Robot to Convey Emotions
2.3.1 Voice Intonation
2.3.2 Head Motion and Facial Expression
2.3.3 Expressing Emotions through Body Motions

3 Modeling
3.1 The Jacobian Null Space and The Task Priority Formulations
3.2 Modeling Emotions
3.2.1 Measuring Emotions: The SAM Scale
3.3 The Continuous Inverse
3.4 The Pinhole Camera Model

I Conveying Emotions Using the Null Space

4 Emotion Mapping
4.1 Introduction
4.2 The Pepper Robot
4.3 Emotion Conveyance
4.3.1 Proposed approach
4.3.2 From PAD to motion features: f_JVG
4.3.3 From JVG to the emotional configuration: f_m
4.4 The Multi-Priority Inverse Kinematic Algorithm
4.5 Implementation
4.6 Chapter Contributions

5 The User Study
5.1 Description
5.2 Results
5.3 Discussion
5.4 Chapter Contributions
II Teleoperating a Mobile Manipulator with a UAV as visual feedback

6 Workspace Mapping
6.1 Introduction
6.2 The Teleoperation System
6.2.1 The frames
6.2.2 Inverse kinematics
6.2.3 The Position Mapping
6.2.4 The Orientation Mapping
6.3 The User Study
6.3.1 Results
6.4 Chapter Contributions

7 Enhancing the Continuous Inverse
7.1 Introduction
7.2 Limitations of the Continuous Inverse
7.2.1 On the boundaries of the null space matrix
7.2.2 On the lower priority levels
7.2.3 Example
7.3 Enhancements
7.3.1 On the boundaries of the null space matrix
7.3.2 On the lower priority levels
7.4 Chapter Contributions

8 The Object Best View Function
8.1 Introduction
8.2 Proposed Solution
8.3 Simulation Results
8.4 Chapter Contributions

9 The Drone Track Function
9.1 Introduction
9.2 Proposed Solution
9.2.1 Requirements
9.2.2 Kinematic model
9.2.3 Subtasks
9.2.4 Algorithm Development
9.2.5 Summary
9.3 Simulations
9.4 Chapter Contributions

10 Experimental Validation
10.1 Introduction
10.2 The robots
10.2.1 The Barcelona Mobile Manipulator I
10.2.2 The Parrot AR.Drone
10.2.3 The Phantom Omni
10.3 Integration
10.4 Simulation environment: The Kautham Project
10.5 Experimentation
10.5.1 BMM-I Inverse Kinematics Test
10.5.2 Object Best View Function Experiment 1
10.5.3 Object Best View Function Experiment 2
10.5.4 Drone Track Experiment

11 Conclusions
11.1 Contributions