Bayesian Orientation Estimation and Local Surface Informativeness for Active Object Pose Estimation

Bayessche Rotationsschätzung und lokale Oberflächenbedeutsamkeit für aktive Posenschätzung von Objekten

Master's Thesis in Informatics
Department of Informatics, Technische Universität München

Author: Sebastian Riedel
Supervisor: Prof. Dr.-Ing. Darius Burschka
Advisors: Dipl.-Ing. Simon Kriegel, Dr.-Inf. Zoltan-Csaba Marton
Date: November 15, 2014

I confirm that this master's thesis is my own work and I have documented all sources and material used.

Munich, November 15, 2014
Sebastian Riedel

Acknowledgments

The successful completion of this thesis would not have been possible without the helpful suggestions, the critical review and the fruitful discussions with my advisors Simon Kriegel and Zoltan-Csaba Marton, and my supervisor Prof. Darius Burschka. In addition, I want to thank Manuel Brucker for helping me with the camera calibration necessary for the acquisition of real test data. I am very thankful for what I have learned throughout this work and enjoyed working within this team and environment very much.

This thesis is dedicated to my family, first and foremost my parents Elfriede and Kurt, who supported me in the best way I can imagine. Furthermore, I would like to thank Irene and Eberhard, dear friends of my mother, who supported me financially throughout my whole studies.

Abstract

This thesis considers the problem of active multi-view pose estimation of known objects from 3d range data and, within it, two main aspects: 1) the fusion of orientation measurements in order to sequentially estimate an object's rotation from multiple views, and 2) the determination of informative object parts and viewing directions in order to facilitate the planning of view sequences that lead to accurate and fast-converging orientation estimates.

Addressing the first aspect, the Bingham probability distribution over 3d rotations, a parametric probability density function defined on the unit quaternion sphere, is investigated in a black-box fusion task based on real data. The experiment shows that the resulting rotation errors match those of fusion approaches based on pose clustering, a particle filter and a histogram filter, while offering the advantage of a continuous and parametric probabilistic representation.
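As background for the fusion approach summarized above, the standard form of the Bingham density over unit quaternions from the literature is sketched below; the exact parameterization used in this thesis is introduced in Chapter 3 and may differ in notation.

\[
p(q \mid M, Z) = \frac{1}{F(Z)} \exp\!\left(q^{\top} M Z M^{\top} q\right)
              = \frac{1}{F(Z)} \exp\!\left(\sum_{i=1}^{3} z_i \,(v_i^{\top} q)^2\right),
\qquad q \in S^3 \subset \mathbb{R}^4,
\]

where the columns \(v_i\) of the orthogonal 4 x 4 matrix \(M\) are direction vectors, \(Z = \operatorname{diag}(0, z_1, z_2, z_3)\) with \(z_i \le 0\) collects the concentration parameters, and \(F(Z)\) is a normalization constant. Because \(p(q) = p(-q)\), the density is consistent with the two-to-one covering of 3d rotations by unit quaternions; the glossary symbols V, K and κ below presumably refer to the per-component direction vectors and concentration parameters of such densities within a mixture.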
To evaluate the informativeness of surface parts and viewing directions of an object with respect to orientation estimation, we present a conceptually simple approach based on the classification of locally computed 3d shape features to the viewing directions they could be observed from during a training phase. First, the applicability of the viewing direction classification to object orientation estimation is investigated. Second, the trained classification pipeline is used to determine informative viewing directions and discriminative local surface parts by analyzing the discrepancy between predicted and correct classifications on training data, using the Kullback-Leibler divergence as an information-theoretic measure of dissimilarity.

Experiments on simulated and real data revealed that the accuracy of the orientation estimation using the proposed method is not yet comparable to state-of-the-art algorithms in the general case of unrestricted viewing directions. The problem was identified as the non-robustness of the classification to deviations from the discrete set of training view directions. The evaluation of view and surface part informativeness, however, gives plausible and promising results for building effective view planning criteria.

Glossary

C: inverse regularization strength for logistic regression training, C ∈ R+; the smaller, the more regularization
K: number of clusters for K-means clustering
M: number of samples used for the sequential Monte Carlo update method
Nfeat: number of features per view used for object rotation estimation
Nsets: number of training point clouds per training pose
Nviews: number of different training poses for viewing direction based rotation estimation
V: list of component-wise direction vector parameters for a Bingham mixture model
α: list of component weights for a Bingham mixture model
K: list of component-wise concentration parameters for a Bingham mixture model
κ: single concentration parameter of a Bingham distribution
BMM: Bingham mixture model/distribution
Nmax: maximum number of components a Bingham mixture model has after applying mixture reduction
^oT_{c_m}: m-th training pose, transformation from camera frame to object frame
rf: radius for feature estimation on a point cloud
rn: radius for normal estimation on a point cloud
BoW: Bag-of-Words
FPFH: Fast Point Feature Histogram
ICP: iterative closest point
KL: Kullback-Leibler
LR: logistic regression
M+R: multiply & reduce
MAP: maximum a posteriori
MI: mutual information
OvR: one vs. the rest
PCL: Point Cloud Library
POMDP: partially observable Markov decision process
SMC: sequential Monte Carlo
SVM: support vector machine
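The surface-informativeness analysis summarized in the abstract compares predicted and correct viewing-direction classifications with the Kullback-Leibler divergence (KL above). The following minimal sketch shows that kind of comparison on made-up class distributions; the numbers and function names are hypothetical and not taken from the thesis.

```python
import numpy as np

def kl_divergence(p, q, eps=1e-12):
    """D_KL(p || q) between two discrete distributions over the same classes."""
    p = np.asarray(p, dtype=float) + eps
    q = np.asarray(q, dtype=float) + eps
    p /= p.sum()
    q /= q.sum()
    return float(np.sum(p * np.log(p / q)))

# Hypothetical example with 4 discrete viewing-direction classes:
# 'correct' is the ground-truth (one-hot) label distribution for a local
# surface part, 'predicted' is the classifier's averaged output for it.
correct = [1.0, 0.0, 0.0, 0.0]
predicted = [0.55, 0.25, 0.15, 0.05]

# A small divergence means the predictions concentrate on the correct view,
# i.e. the surface part is informative about the viewing direction.
print(kl_divergence(correct, predicted))
```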
Contents

Acknowledgments
Abstract
Glossary
1. Introduction
   1.1. Overview
   1.2. Conceptual Motivation
   1.3. Thesis Outline
2. Generic Passive and Active Multi-View Pose Estimation
   2.1. Components of a Multi-View Pose Estimation System
   2.2. Related Work: View Planning
   2.3. Related Work: Feature Selection
   2.4. Summary
3. Rotation Estimation using the Bingham Distribution
   3.1. 3d Rotations
   3.2. Quaternions
   3.3. Bingham Distribution
   3.4. Bingham Mixture Models
   3.5. Projected Gaussians as Probabilities over Rotations
   3.6. State Fusion using Bingham Mixture Models
        3.6.1. Algebraic Fusion
        3.6.2. Monte Carlo Estimation
   3.7. Evaluation
        3.7.1. Evaluation for Gaussian Measurement Model
        3.7.2. Evaluation for Multimodal Measurement Model
   3.8. Summary
4. Viewing Direction Classification: Application to Orientation Estimation
   4.1. Method Overview
   4.2. Viewing Direction Classification
        4.2.1. Training Setup and Choice of Classifier
        4.2.2. Choice of Feature
        4.2.3. Example
   4.3. Bingham Mixture Measurement Model
   4.4. Algorithmic Details for Sequential Estimation
   4.5. Evaluation
        4.5.1. Parameter Space and Parameter Selection using Simulated Data
        4.5.2. Evaluation using Real Data
   4.6. Summary
5. Viewing Direction Classification: Application to View Planning
   5.1. View Informativeness
   5.2. Model Surface Informativeness
   5.3. Proof-of-Concept Evaluation of Informativeness Values
   5.4. Outlook: View Planning Approaches
   5.5. Summary
6. Conclusion
Appendix
A. Complete Rankings for Rotation Fusion Evaluation
B. All Sequence Plots for Occlusion Experiment
Bibliography

1. Introduction

In this chapter, the general motivation and scope of the presented work are introduced. Relevant basic terminology is explained, and a short outline of the thesis is given.

1.1. Overview

Recognition and pose estimation of known objects is necessary for many tasks, including monitoring and tracking as well as robotic manipulation of objects. Whereas recognition is the task of deciding which object is present, pose estimation refers to estimating an object's position and orientation in up to three dimensions. If the objects are known in advance, analyzed by the estimation algorithm in an offline training phase, and the online application is limited to these a priori known objects, one speaks of model-based object recognition and pose estimation. Robotic part handling in industrial applications is a prominent example and commercial use case for such algorithms, because industrial manipulation is most often limited to a fixed set of parts known in advance.

Model-based recognition and pose estimation have been subject to extensive research since the early 70s (Chin and Dyer [5]) and generally work in two steps. In the offline phase, a representation of the object is built using features derived from training data. In the online phase, incoming sensor data is matched to this representation.
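The two-step approach just described (an offline phase that derives a feature-based representation from training data, and an online phase that matches incoming sensor data against it) is also the structure of the viewing-direction pipeline summarized in the abstract. The sketch below illustrates only this structure: the feature computation is stubbed out with random vectors, and all names and values are hypothetical rather than the thesis's actual implementation.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

def local_features(cloud):
    # Placeholder for a real local 3d shape descriptor such as FPFH
    # (33 histogram bins per point); here every point just gets a
    # random vector so the sketch stays self-contained.
    rng = np.random.default_rng()
    return rng.normal(size=(len(cloud), 33))

# --- Offline phase: build a representation from training views. ----------
# training_views: list of (point_cloud, viewing_direction_label) pairs,
# here filled with dummy clouds for 8 hypothetical viewing directions.
training_views = [(np.zeros((100, 3)), label) for label in range(8)]

X = np.vstack([local_features(cloud) for cloud, _ in training_views])
y = np.concatenate([[label] * 100 for _, label in training_views])

# C is the inverse regularization strength listed in the glossary.
classifier = LogisticRegression(C=1.0, max_iter=1000).fit(X, y)

# --- Online phase: match incoming sensor data to the trained model. ------
observed_cloud = np.zeros((50, 3))  # stand-in for a real depth scan
class_probs = classifier.predict_proba(local_features(observed_cloud))

# Per-view probabilities averaged over the observed local features; in the
# thesis, such classification results feed the orientation estimate.
print(class_probs.mean(axis=0))
```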
