
Contents

1. Introduction  19
   1.1. Overview of general robotics context: my point of view of robotics in the 21st century  20
        1.1.1. Motivations  20
        1.1.2. Ethical analysis  21
   1.2. Air-Cobot project  25
   1.3. Aims and contributions of this thesis  27

2. Navigation Strategy  29
   2.1. Definition of the navigation processes and related works  32
        2.1.1. Perception  32
        2.1.2. Modeling  33
        2.1.3. Localization  33
        2.1.4. Planning  34
        2.1.5. Action  35
        2.1.6. Decision  35
   2.2. Description of navigation processes in the Air-Cobot project  36
        2.2.1. Air-Cobot platform  36
        2.2.2. The Robot Operating System  37
        2.2.3. Perception: "exteroceptive" (cameras and lasers) and "proprioceptive" (odometer)  39
        2.2.4. Modeling: both topological and metric maps  43
        2.2.5. Absolute and relative localization  45
        2.2.6. Planning  55
        2.2.7. The action process  55
        2.2.8. Decision  63
   2.3. Instantiation of the navigation processes in the Air-Cobot project  66
        2.3.1. Autonomous approach mode  66
        2.3.2. Autonomous inspection mode  67
        2.3.3. Collaborative inspection mode  68
   2.4. General overview of the navigation mode management and modularity of the framework  68
        2.4.1. Overview of the management of the navigation modes  69
        2.4.2. Versatility of the navigation framework  70
   2.5. Conclusion  72

3. Multi Visual Servoing  75
   3.1. Introduction  76
        3.1.1. State of the Art of Visual Servoing (VS)  76
        3.1.2. Contributions  78
        3.1.3. Outline of the chapter  79
   3.2. Prerequisites of Visual Servoing  79
        3.2.1. Robot presentation  79
        3.2.2. Robot coordinate system  80
        3.2.3. Robot Jacobian and Velocity Twist Matrix  81
   3.3. Image-Based Visual Servoing  82
        3.3.1. Method  82
        3.3.2. Simulation  85
   3.4. Position-Based Visual Servoing  92
        3.4.1. Method  92
        3.4.2. Simulation  94
   3.5. Comparison of IBVS and PBVS regarding robustness towards sensor noise on the basis of simulations  101
        3.5.1. Simulation conditions  101
        3.5.2. The obtained results  102
   3.6. The Air-Cobot visual servoing strategy  103
        3.6.1. Handling the visual signal losses  105
        3.6.2. Coupling the augmented image with the visual servoing  106
        3.6.3. PBVS and IBVS switching  106
   3.7. Experiments on Air-Cobot  107
        3.7.1. Test environment  107
        3.7.2. Applying ViSP in the Air-Cobot context  109
        3.7.3. Experimental tests combining IBVS with PBVS  109
   3.8. Conclusion  117

4. Obstacle avoidance in dynamic environments using a reactive spiral approach  121
   4.1. Introduction  122
        4.1.1. Air-Cobot navigation environment  122
        4.1.2. State of the art of obstacle avoidance  123
        4.1.3. Introduction to spiral obstacle avoidance  126
        4.1.4. Structure of the chapter  128
   4.2. Spiral obstacle avoidance  128
        4.2.1. Spiral conventions and definition  129
        4.2.2. Definition and choice of spiral parameters for the obstacle avoidance  130
        4.2.3. Robot control laws  136
   4.3. Simulation and experimental results  142
        4.3.1. Simulation with static obstacles  142
        4.3.2. Experiments in dynamic environment  146
   4.4. Conclusion  147

5. Conclusion  154
   5.1. Resume of contributions and project summary  155
   5.2. Future work and prospects  156
        5.2.1. Addition of a graphics card  157
        5.2.2. Using the robot as a platform for autonomous drones  157
        5.2.3. Rethink the robot drive system  157
        5.2.4. Incorporation of a HUD on a tablet for the operator  158
        5.2.5. Real-time capabilities of the platform  159

6. Abstract  162

A. Robot Transformations and Jacobian Computation  166
   A.1. Transformations concerning the stereo camera systems  167
        A.1.1. Transformation to the front cameras system  167
        A.1.2. Transformation to the rear cameras  168
        A.1.3. Additional transformations  169
   A.2. Kinematic Screw  171
   A.3. Velocity screw  175
        A.3.1. Solution "1" of deriving the velocity screw matrix  175
        A.3.2. a  175
        A.3.3. Solution "2" of deriving the velocity screw matrix  175
        A.3.4. a  176
        A.3.5. Applying solution "2" to our problem  176
   A.4. Interaction matrix  178
   A.5. Jacobian and Velocity Twist Matrix of Air-Cobot  178
        A.5.1. extended  179
   A.6. Air-Cobot Jacobian and velocity twist matrix  179
        A.6.1. Jacobian - eJe  179
        A.6.2. Jacobian - eJe reduced  180
        A.6.3. Velocity twist matrix - cVe  180
   A.7. Analysis of the Robot's Skid Steering Drive on a flat Surface with high Traction  180
   A.8. ROS-based software architecture at the end of the project  182

List of Figures

1.1.  Extract of article [Frey and Osborne, 2013]; likelihood of computerization of several jobs  22
1.2.  Extract of [Bernstein and Raman, 2015]; The Great Decoupling explained  24
1.3.  Air-Cobot platform near the aircraft  26
2.1.  4MOB platform and fully integrated Air-Cobot  37
2.2.  Remote of the 4MOB platform and tablet operation of Air-Cobot  38
2.3.  Obstacle detection step by step  42
2.4.  Example of a navigation plan around the aircraft  44
2.5.  Example of a metric map  45
2.6.  Example of the absolute robot localization  46
2.7.  Example of the VS environment experiments were conducted in  48
2.8.  Example of the relative robot localization  49
2.9.  3D model of an Airbus A320 that was used for matching scans acquired for pose estimation in the aircraft reference frame (red = model points / blue = scanned points)  51
2.10. Example of two scans that were acquired with an Airbus A320 in a hangar environment  52
2.11. Transformations during ORB-SLAM initialization  53
2.12. Simulator that helped in the evaluation of new, rapidly prototyped ideas throughout the Air-Cobot project  57
2.13. Basics of the mobile base angular velocity ω calculation  58
2.14. Adjustment to the standard go-to-goal behavior  58
2.15. Aligning first with a target "in front" and "behind" the robot  59
2.16. Modification of the align-first behavior to decrease tire wear  61
2.17. Correct-Camera-Angle controller  62
2.18. Picture of Air-Cobot in default configuration and with the pantograph-3D-scanner unit extended for inspection  64
2.20. Schematic of the state machine which defines the decision process  64
2.19. Collection of elements that are inspected in a pre-flight check  65
2.21. A general overview of the high-level state machine of Air-Cobot  69
2.22. Building costmaps with the help of the ROS navigation stack  71
2.23. Costmaps built during the experiment  72
3.1.  Air-Cobot and the associated frames [Demissy et al., 2016]  80
3.2.  Robot coordinate system for the front cameras [Demissy et al., 2016]  80
3.3.  Projection in the image-plane frame  83
3.4.  Control loop of an Image-Based Visual Servoing example  86
3.5.  Target chosen for simulations  86
3.6.  Evolution of features in an IBVS simulation  89
3.7.  Robot path, orientation and camera state in an IBVS simulation  90
3.8.  Evolution of the visual error in an IBVS simulation  91
3.9.  Evolution of velocities in an IBVS simulation  92
3.10. Block diagram for the PBVS approach  94
3.11. Evolution of the point C (camera origin) in a PBVS simulation  96
3.12. Evolution of features in a PBVS simulation  97
3.13. Robot path, orientation and camera state in a PBVS simulation  97
3.14. Evolution of error in a PBVS simulation  98
3.15. Evolution of velocities in a PBVS simulation  99
3.16. Evolution of the point C (camera origin) in a noisy simulation  102
3.17. Evolution of features in a noisy simulation  103
3.18. Robot path, orientation and camera state in a noisy simulation  104
3.19. Evolution of error in a noisy simulation  104
3.20. Evolution of velocities in a noisy simulation  105
3.21. Example of the lab environment where the visual servoing experiments were conducted  108
3.22. Extract of video sequence showing an outside view onto the experiment  110
3.23. Extract of video sequence showing the system handling switching between IBVS and PBVS  111
3.24. Position and velocity (linear and angular) plots obtained during the IBVS-PBVS combination experiment  112
3.25. Orientation and velocity plots obtained during the feature loss experiment  113
3.26. Extract of video sequence showing the system's capability to handle feature loss during execution of a visual servoing-based navigation mission  115
3.27. Outline of the Gerald Bauzil room in which the experiment takes place with expected path for the robot  116
3.28. Extract of video sequence showing the system's capability to perform a combination of IBVS-PBVS-metric-based navigation  118
4.1.  Dealing with airport forbidden zones  123
4.2.  Moth flight path being "redirected" by an artificial light source