Contribution to the Study of Projection-Based Systems for Industrial Applications in Mixed Reality
Guillaume Cortes
To cite this version: Guillaume Cortes. Contribution to the study of projection-based systems for industrial applications in mixed reality. Graphics [cs.GR]. Université Rennes 1, 2018. English. NNT: 2018REN1S044.

HAL Id: tel-02000387
https://tel.archives-ouvertes.fr/tel-02000387
Submitted on 1 Feb 2019

HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.

Doctoral Thesis of the Université de Rennes 1
COMUE Université Bretagne Loire
Doctoral School No. 601: Mathématiques et Sciences et Technologies de l'Information et de la Communication
Specialty: Computer Science

By Guillaume CORTES

Contribution to the Study of Projection-based Systems for Industrial Applications in Mixed Reality

Thesis to be presented and defended in Rennes on 24 October 2018
Research unit: IRISA – UMR 6074

Thesis committee:
Reviewers: Marie-Odile Berger, Inria Research Director, Inria Nancy; Martin Hachet, Inria Research Director, Inria Bordeaux
Examiners: Sabine Coquillart, Inria Research Director, Inria Grenoble; Guillaume Moreau, Professor, École Centrale de Nantes
Thesis co-advisors: Anatole Lécuyer, Inria Research Director, Inria Rennes; Eric Marchand, Professor, Université de Rennes 1

Acknowledgements

First of all, I would like to sincerely thank my two advisors, Anatole Lécuyer and Eric Marchand, for supervising this work and for their unrelenting support through every difficulty encountered during this journey. This manuscript and the PhD would not exist without Jérôme Ardouin and Guillaume Brincin, so I would especially like to thank them for creating this adventure. I would also like to thank the members of the PhD committee for taking the time to read the manuscript and for traveling to Rennes to participate in the defense. Thank you to the many great minds I met at Inria and Realyz. Special thanks to my Inria office mates, Hakim and Yoren: thank you for your good mood and the many laughs we shared. To conclude, I thank my friends and family for their support. Special thanks to the whole Rennunion group for the great lunches, afterworks, and weekends we spent together. Of course, I especially thank my parents for coming to the defense from many miles away, as well as for hosting one of the best defense buffets ever! Finally, thank you, Laurie, for being there all along.

Contents

List of Acronyms
Notations

1 Introduction

2 Related work
2.1 Visual displays for mixed reality
  2.1.1 Non-projection-based visual displays
  2.1.2 Projection-based displays
2.2 Tracking systems for mixed reality
  2.2.1 Non-optical tracking systems
  2.2.2 Optical tracking systems
  2.2.3 Discussion on optical tracking systems for projection-based displays
2.3 Industrial applications of mixed reality
  2.3.1 Training applications
  2.3.2 Assistance applications
  2.3.3 Design applications
  2.3.4 Planning and validation applications
2.4 Conclusion

3 Pilot study: Analysis of user motion in a specific CAVE-based industrial application
3.1 Analysis of user motion in a CAVE-based industrial application
  3.1.1 Selection of the industrial application
  3.1.2 Participants
  3.1.3 Procedure
  3.1.4 Collected data
3.2 Results
  3.2.1 Cyclop (Head) and Wand (Hand) 3D positions
  3.2.2 Cyclop and Wand 3D orientations
  3.2.3 Cyclop and Wand speeds
3.3 Discussion
3.4 Main outcomes and guidelines
3.5 Conclusion

4 Increasing optical tracking workspace for mixed reality applications
4.1 General approach: Increasing the optical tracking workspace of mixed reality applications
4.2 Fundamentals of optical tracking
  4.2.1 Perspective camera model
  4.2.2 Epipolar geometry
4.3 MonSterTrack: Increasing optical tracking workspace with hybrid stereo/monocular tracking
  4.3.1 Off-line system calibration
  4.3.2 On-line real-time stereo tracking
  4.3.3 Monocular tracking mode
4.4 CoCaTrack: Increasing optical tracking workspace with controlled cameras
  4.4.1 Off-line controlled camera calibration
  4.4.2 Registration
  4.4.3 Controlling camera displacements: visual servoing
4.5 Proofs of concept
4.6 Performance
  4.6.1 Main results of our global approach
  4.6.2 Comparison with Vicon's optical tracking
4.7 Conclusion

5 Mobile spatial augmented reality for 3D interaction with tangible objects
5.1 The MoSART approach: Mobile spatial augmented reality on tangible objects
5.2 Proof of concept
  5.2.1 Optical tracking
  5.2.2 Projection mapping
  5.2.3 Interaction tools
  5.2.4 Adding collaboration
  5.2.5 Characteristics and performances
5.3 Use cases
  5.3.1 Virtual prototyping
  5.3.2 Medical visualization
5.4 Discussion
5.5 Conclusion

6 Introducing user's virtual shadow in projection-based systems
6.1 Related work on virtual shadows
6.2 Studying the use of virtual shadows in projection-based systems
  6.2.1 Objective
  6.2.2 Apparatus
  6.2.3 Participants
  6.2.4 Experimental task
  6.2.5 Experimental protocol
6.3 Results
  6.3.1 Performance measurements
  6.3.2 User experience questionnaires
6.4 Discussion
  6.4.1 Virtual shadows and virtual embodiment
  6.4.2 Virtual shadows and spatial perception
6.5 Conclusion
7 Conclusion

Author's publications

A Appendix: Résumé long en français (extended summary in French)

List of Figures

Bibliography

List of Acronyms

AR Augmented Reality
AV Augmented Virtuality
CAD Computer-Aided Design
CAVE Cave Automatic Virtual Environment
CoCaTrack Controlled Camera Tracking
DoF Degrees of Freedom
EKF Extended Kalman Filter
FoR Field-of-Regard
FoV Field-of-View
fps frames per second
HMD Head-Mounted Display
HMPD Head-Mounted Projection-based Display
Hz Hertz
IMU Inertial Measurement Unit
IPS Immersive Projection-based Systems
MEMS MicroElectroMechanical Systems
MonSterTrack Monocular and Stereo Tracking
MoSART Mobile Spatial Augmented Reality on Tangible objects
MR Mixed Reality
OST Optical See-Through
P3P Perspective from 3 Points
PBS Projection-Based Systems
PnP Perspective from n Points
SAR Spatial Augmented Reality
SMB Small and Medium Businesses
SSD Surround-Screen Displays
SVD Singular Value Decomposition
UAV Unmanned Aerial Vehicle
VE Virtual Environment
VR Virtual Reality