
Real-time Simulation of Time-of-Flight Sensors and Accumulation of Range Camera Data

Echtzeit Simulation von Time-of-Flight Sensoren und Akkumulation von Tiefendaten

Dissertation accepted by the Naturwissenschaftlich-Technische Fakultät of the Universität Siegen for the degree of Doktor der Ingenieurwissenschaften (Dr.-Ing.)

by Maik Keller

17 June 2015

1. Referee: Prof. Dr. Andreas Kolb, Universität Siegen
2. Referee: Prof. Dr. Reinhard Koch, Christian-Albrechts-Universität zu Kiel

Oral examination: 13 November 2015

Printed on non-aging, wood- and acid-free paper.

Acknowledgments

I would like to thank all those who contributed to the success of this work. First and foremost, I thank my supervisor Prof. Dr. Andreas Kolb for his great support and for the opportunity to work in his research group. He gave me the freedom to explore my own ideas and was always open to discussing and supervising my work. I thank Prof. Dr. Reinhard Koch for agreeing to act as the second referee. I gratefully acknowledge the financial support of the Federal Ministry of Education and Research within the Lynkeus project.

Sincere thanks go to all my colleagues from the Computer Graphics Group in Siegen; I enjoyed working with them very much. Special thanks go to Prof. Dr. Christof Rezk-Salama, who introduced me to the world of computer graphics. I am deeply grateful for the collaboration with my friend Dr. Jens Orthmann. He is a great person. I thank all co-authors of my publications, in particular Dr. Shahram Izadi from Microsoft Research in Cambridge and Prof. Dr. Tim Weyrich from University College London. I would also like to thank Dr. Martin Lambers, Dr. Nicolas Cuntz, and Dr. Marvin Lindner for valuable discussions and their scientific contributions to my papers. Particular mention goes to Damien Lefloch and Thomas Hoegg for their inspiring discussions and their work in the field of Time-of-Flight simulation and 3D reconstruction. I thank Andrea Anders and Willi Gräfrath from the Computer Graphics Group for handling all administrative matters.

I am much obliged to my supervisors and colleagues from pmd technologies: Dr. Bernd Buxbaum, Dr. Robert Lange, and Jochen Penne, for their support and for giving me the freedom to concentrate on finishing my research. Furthermore, I would like to thank Dr. Lutz Heyne, Stefan Hoberg, Martin Profittlich, Alexander Strauch, Michelle Schneider, and Daniel Queisser for valuable discussions and their help in the context of this thesis. I would also like to thank Maria Knaier for proof-reading this thesis.

Finally, and above all, I would like to express my deep gratitude to my family: my parents Renate and Jürgen Keller, my sisters Tina and Daniela, and my parents-in-law Brigitte and Reiner Ohrndorf. They always believed in me, which I appreciate very much. Last, but not least, I would like to thank my wife Kathrin, the love of my life, and our sweet children Momo and Kasimir. My wife sacrificed a lot in order to simplify things for me. Her patience and her support enabled me to finish this work, for which I am grateful beyond words.
– Maik Keller, June 2015

Abstract

The present thesis addresses the topic of 3D range imaging in a twofold way: it proposes new methods in the field of range data simulation as well as for the accumulation of range images into a consistent data representation, namely 3D environment models, for high-quality 3D object reconstruction.

Within the last years, inexpensive Time-of-Flight (ToF) range imaging devices have become an alternative to traditional depth measuring approaches. ToF cameras measure full-range distance information by actively illuminating a scene and measuring the time until the back-scattered light is detected. The final distance information is computed from multiple raw images. This thesis proposes a method for simulating the ToF principle in real-time along with the major sensor characteristics. The approach is motivated by physically-based illumination models and applied to the simulation of Photonic Mixing Devices, a specific type of ToF sensor.

Furthermore, this thesis presents new methods for accumulating range data in real-time. While the hierarchical volumetric approach supports merging and subtraction of sub-volumes with arbitrary resolutions, the point-based fusion method accounts for the spatial limitations of previous approaches and addresses high-quality 3D reconstruction at extended scales. Additionally, dynamically changing scenes are supported, which results in improved camera pose estimation as well as reduced drift errors. The algorithms are evaluated using simulated data as well as real camera data from structured light and ToF devices.

The algorithms presented in this thesis feature an extensive data-parallel implementation on current graphics processing units in order to ensure the online capability of the methods, but without restricting the algorithms to hardware-specific features.
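The computation of the final distance from multiple raw images mentioned above is commonly realized as a four-phase reconstruction. The following C++ sketch illustrates this general principle, assuming four correlation samples taken at phase offsets of 0°, 90°, 180°, and 270°; the function and variable names (reconstructDepth, a0 to a3, modulationFreq) are illustrative and not taken from the thesis.

    #include <cmath>

    // Minimal sketch (not the thesis's implementation): recovering distance,
    // amplitude and offset from four ToF correlation samples ("raw images")
    // taken at phase offsets of 0, 90, 180 and 270 degrees.
    struct ToFMeasurement {
        float distance;   // metres
        float amplitude;  // strength of the modulated signal
        float offset;     // constant background intensity
    };

    ToFMeasurement reconstructDepth(float a0, float a1, float a2, float a3,
                                    float modulationFreq) // in Hz, e.g. 20e6
    {
        const float kPi = 3.14159265358979f;
        const float c   = 299792458.0f;              // speed of light in m/s

        // Phase shift between the emitted and the received signal.
        float phase = std::atan2(a3 - a1, a0 - a2);
        if (phase < 0.0f) phase += 2.0f * kPi;

        ToFMeasurement m;
        // Distance is proportional to the phase shift;
        // the unambiguous range is c / (2 * modulationFreq).
        m.distance  = (c * phase) / (4.0f * kPi * modulationFreq);
        m.amplitude = 0.5f * std::sqrt((a3 - a1) * (a3 - a1) + (a0 - a2) * (a0 - a2));
        m.offset    = 0.25f * (a0 + a1 + a2 + a3);
        return m;
    }

In practice each correlation sample is affected by effects such as flying pixels, motion blur, wiggling, and sensor noise, which are among the sensor characteristics addressed by the simulation approach in Chapter 3.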
Zusammenfassung

In the field of 3D range imaging, the present thesis deals with two different topics: on the one hand, new approaches to the simulation of range image data; on the other hand, the accumulation of range images into a consistent data representation, also called 3D environment models, for high-quality 3D object reconstruction.

Over the last years, inexpensive 3D Time-of-Flight (ToF) camera systems have established themselves as a serious alternative to traditional distance measuring techniques. ToF cameras capture range images by actively illuminating the scene and determining the travel time of the reflected light. The depth data is reconstructed from several raw images. This thesis presents an approach for the real-time simulation of the ToF principle which also covers the simulation of relevant sensor characteristics. The approach is physically motivated and is applied to the simulation of Photonic Mixing Devices, which are based on the time-of-flight principle.

Furthermore, this thesis presents new methods for the accumulation of range images. The hierarchical volumetric approach supports adding and removing sub-volumes of different resolutions. In contrast to previous approaches, the point-based method ensures high-quality reconstruction of large scenes. In addition, dynamic scenes are supported, which improves camera pose estimation and reduces drift during data fusion. The application of the algorithms is demonstrated using simulation data and real camera data (based on both the structured light approach and the ToF principle).

All presented algorithms feature an extensive data-parallel implementation on currently available graphics cards in order to ensure real-time processing. The algorithms are, however, not restricted to hardware-specific aspects.

Contents

Abstract
1 Introduction
  1.1 Context
  1.2 Overall Goals
  1.3 Contributions
  1.4 Outline
2 Fundamentals
  2.1 Current Range Imaging Techniques
  2.2 Photo Mixing Device Technology
    2.2.1 Signal Theory
    2.2.2 Pixel Design
    2.2.3 Sensor Effects
  2.3 Environment Modeling
    2.3.1 Types of Representation
    2.3.2 Registration
    2.3.3 Merging
    2.3.4 Rendering
    2.3.5 Discussion
  2.4 Graphics Hardware
    2.4.1 Graphics Pipeline
    2.4.2 General Purpose Computation
    2.4.3 Discussion
3 Time-of-Flight Sensor Simulation
  3.1 Motivation
  3.2 Related Work
  3.3 Physically-based Sensor Model
    3.3.1 Illumination and Radiosity
    3.3.2 Simple Model with Point Light Illumination
    3.3.3 Enhanced Model including Area Light Illumination
  3.4 Simulation Framework
  3.5 Implementation
    3.5.1 Phase Image Calculation
    3.5.2 Flying Pixels
    3.5.3 Motion Blur
    3.5.4 Correlation Function
    3.5.5 Integration of Sensor Noise
  3.6 Results and Analysis
    3.6.1 Evaluation: Comparison to Real Sensor Data
    3.6.2 Wiggling Evaluation
    3.6.3 Motion Evaluation
    3.6.4 Illumination: Point Light vs. Area Light
    3.6.5 Experimental Scenes
  3.7 Discussion
4 Hierarchical Volumetric Data Accumulation
  4.1 Motivation
  4.2 Related Work
  4.3 Overview of Concept
    4.3.1 Tree Topology
    4.3.2 Tree Manipulation
  4.4 Implementation
    4.4.1 Storage of Dynamic Volume Trees
    4.4.2 Modification of Dynamic Volume Trees
    4.4.3 Redundancy Optimization
    4.4.4 Generation of