Geometry and Learning Co-Supported Normal Estimation for Unstructured Point Cloud

Haoran Zhou1,2* Honghua Chen1* Yidan Feng1,2 Qiong Wang3 Jing Qin4 Haoran Xie5 Fu Lee Wang6 Mingqiang Wei1,2† Jun Wang1†
1Nanjing University of Aeronautics and Astronautics  2MIIT Key Laboratory of Pattern Analysis and Machine Intelligence  3Shenzhen Institutes of Advanced Technology  4The Hong Kong Polytechnic University  5Lingnan University  6The Open University of Hong Kong
*Co-first authors  †Co-corresponding authors (mqwei/[email protected])

Abstract

In this paper, we propose a normal estimation method for unstructured point clouds. We observe that geometric estimators commonly focus on feature preservation but are hard to tune and sensitive to noise, while learning-based approaches pursue overall estimation accuracy but cannot handle challenging regions such as surface edges well. This paper presents a novel normal estimation method under the co-support of a geometric estimator and deep learning. To lower the learning difficulty, we first propose to compute a suboptimal initial normal at each point by searching for a best fitting patch. Based on the computed normal field, we design a normal-based height-map network (NH-Net) to fine-tune the suboptimal normals. Qualitative and quantitative evaluations demonstrate clear improvements of our results over both traditional and learning-based methods, in terms of estimation accuracy and feature recovery.

1. Introduction

Various kinds of 3D laser scanners and depth cameras have emerged in recent decades, bringing point clouds into the focus of many practical applications, such as robotic grasping [20], 3D reconstruction [11], and autonomous driving [24]. Commonly, scanned point clouds contain only the points' spatial locations, degraded by noise, incompleteness, and sampling irregularity, while lacking local surface geometry properties such as point normals. Quality normals can facilitate a huge number of downstream tasks, for example point cloud consolidation [10], surface reconstruction [13], and model segmentation [8]. Hence, estimating normals is an inevitable and crucial task for unstructured point clouds.

The problem of normal estimation has been extensively researched, yet is not well solved. Existing normal estimation techniques can be roughly divided into two categories: traditional methods and learning-based methods. Traditional methods usually utilize several elaborately designed regularities to preserve/recover sharp features, while learning-based methods aim to learn a general mapping from noisy inputs to ground truths. However, no existing algorithm serves as a normal estimation panacea: 1) traditional methods heavily rely on parameter tuning, such as the neighborhood scale for plane fitting [31, 32]; 2) learning-based techniques, whether built on convolutional neural network (CNN) architectures [6, 4] or the PointNet architecture [15, 33], are limited by their ability of feature representation. It is therefore hard to learn a straightforward mapping from severely degraded inputs to ground-truth normals, especially in sharp feature regions.

Motivated by these challenges, in this work we propose a two-stage normal estimation method for unstructured 3D point clouds. The key idea of our approach is to solve the ill-posed normal estimation problem via two sub-steps: 1) computing a suboptimal intermediate normal field with features preserved as well as possible, by a geometric estimator; 2) formulating the final normal recovery procedure as a regression function that maps the intermediate normal results to their ground truths. In detail, to lower the difficulty of normal learning in challenging regions, we present a multi-scale fitting patch selection (MFPS) scheme to help estimate a suboptimal normal field, which contributes mainly to feature preservation. Since the initial normals remain imperfect in some challenging regions, where parameters are hard to tune, we then design a normal-based height-map network (NH-Net), which utilizes both the estimated normals and the local surface information to obtain the final optimal normals. We experimentally show that the combination of the geometric estimator scheme and the learning-based recovery scheme outperforms either of them alone.

Figure 1. The pipeline of our normal estimation method. We first compute the suboptimal normal at each point via multi-scale fitting patch selection (MFPS). Then, a multi-scale point descriptor is constructed based on bilateral normal filters (BNF) and local height-map patches (HMPs). Our NH-Net, consisting of an HMP-based module and a gathering module, receives MPDs and produces the final normal.

Our main contributions are three-fold:
• We design a two-stage normal estimation method that collaborates a geometric estimator with a deep neural network, showing clear improvements over the state of the art.
• We propose a multi-scale fitting patch selection scheme, which produces a feature-preserving initial normal field as the input of the subsequent recovery network.
• We propose a normal-refining network (NH-Net), which compensates for the imperfections of the suboptimal normals computed in the first step.

2. Related work

Normal estimation for point clouds is a long-standing problem in academia. We review previous research from traditional normal estimators to the recent prevalent learning-based techniques.

2.1. Traditional normal estimator

The simplest and best-known method for normal estimation is based on Principal Component Analysis (PCA) [17]: it analyzes the covariance of a local structure around a point and defines the normal as the eigenvector corresponding to the smallest eigenvalue. Following this work, many variants have been proposed [23, 7, 14]. In particular, Mitra et al. [23] analyzed the effects of neighborhood size, curvature, sampling density, and noise on normal estimation. Another kind of normal estimation method is based on Voronoi cells [2, 12, 1, 22]. However, these methods cannot reliably estimate the normals of points near or on sharp features. Based on the observation that neighbors belonging to different surface patches should be discarded, recent works have been dedicated to selecting a plane that approximates only the neighbors from the same surface patch [19, 32, 30, 31]. Under the assumption that surfaces are commonly composed of piecewise flat patches, sparsity-based methods [3, 28, 9] show impressive results, especially in sharp feature preservation. Other methods, such as the Hough transform [5], also yield pleasing results.

2.2. Learning-based normal estimator

Recently, learning-based methods have gradually shown their power for normal estimation. Boulch et al. [6] proposed to project a discretized Hough space representing normal directions onto a structure amenable to CNN-based deep learning. Roveri et al. [26] defined a grid-like regular input to a CNN to learn ideal normal results. Ben-Shabat et al. [4] presented a method that approximates the local normal vector using a point-wise, multi-scale 3D modified Fisher Vector representation, which serves as the input to a deep 3D CNN architecture; in addition, they learn the neighborhood size that minimizes the normal estimation error using a mixture of experts. The key of these three methods lies in parameterizing the unstructured point cloud into a regular domain so that a CNN architecture can be applied directly. Another point cloud learning framework, PointNet [25], has become very popular in the 3D domain, since it learns features directly from point data. Inspired by it, Guerrero et al. [15] proposed a unified method for estimating normals and principal curvature values in noisy point clouds; this approach is based on a modification of the PointNet architecture, in which they place special emphasis on extracting local properties of a patch around a given central point. By leveraging both PointNet and a 3D CNN, Hashimoto et al. [16] proposed a joint network that can accurately infer normal vectors from a point cloud. Based on the PointNet architecture, Zhou et al. [33] introduced an extra feature constraint mechanism and a multi-scale neighborhood selection strategy to estimate normals for 3D point clouds.

…points to construct a set of candidate planes and pick the one that best describes the underlying surface patch. Typical methods are RNE [19] and PCV [31], which introduce a residual bandwidth to evaluate the proximity of the candidate planes to the underlying surface. However, on noisy inputs, the neighborhood of the target point used for detecting a candidate plane is often corrupted by points from the other side of the intersection. This causes the selected plane to deviate from the true surface (Fig. 2(b)). PCV is better but still imperfect (Fig. 2(c)).

Figure 2. Comparison of the planes selected by different methods. From the left column to the rightmost: the noisy input, and the plane-fitting results of RNE, PCV, and our method. Red points represent the noisy input, and the blue star in the first column is the target point. The green line marks the selected plane, and the blue points are the neighbors used to fit this plane in the corresponding method. We can easily observe that our method selects the most suitable plane, with the help of better neighboring information.

4.1. Fitting patch selection

Considering the problems stated above, we propose to select a more consistent and flexible neighborhood that provides better local structure information for robust normal estimation. For each candidate point pi, our method tries to find the best fitting patch that contains pi. Please note that the normals of smooth points are already computed by PCA.

First, for each point pj in a point cloud, we define a local patch Qj (the K-nearest neighboring points of pj). Then, the fitting plane θ*_{Q_j} of each Qj is determined by the following objective function:

E_{Q_j}(\theta) = \frac{1}{|Q_j|} \sum_{p_k \in Q_j} W_{\sigma_j}(p_k, \theta),   (1)
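The PCA baseline reviewed in Sec. 2.1, which also provides the normals of smooth points in our pipeline, fits in a few lines. The sketch below is illustrative only (the function name and numpy usage are ours, not the authors' implementation): it forms the covariance of a neighborhood and returns the eigenvector of the smallest eigenvalue.

```python
import numpy as np

def pca_normal(neighbors):
    """PCA normal estimation [17]: the normal is the eigenvector of the
    neighborhood covariance corresponding to the smallest eigenvalue."""
    centered = neighbors - neighbors.mean(axis=0)
    cov = centered.T @ centered / len(neighbors)  # 3x3 covariance matrix
    eigvals, eigvecs = np.linalg.eigh(cov)        # eigenvalues in ascending order
    return eigvecs[:, 0]                          # direction of least variance
```

On a noise-free planar neighborhood this recovers the plane normal exactly (up to sign); near sharp edges, as the paper notes, the covariance mixes points from both sides and the estimate degrades.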
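Eq. (1) scores a candidate patch Qj by the average residual loss of its points under a fitted plane θ, and the best fitting patch is the one with the lowest score. The form of W_{σ_j} is not given in this excerpt, so the sketch below assumes a Gaussian-saturated squared point-to-plane residual with bandwidth σ purely for illustration; all function names are hypothetical, and this is not the authors' implementation.

```python
import numpy as np

def plane_from_pca(patch):
    # Least-squares plane through a patch: centroid plus the direction
    # of smallest variance (last right-singular vector).
    c = patch.mean(axis=0)
    _, _, vt = np.linalg.svd(patch - c)
    return c, vt[-1]  # (point on plane, unit normal)

def patch_energy(patch, plane, sigma):
    """E_Qj(theta) of Eq. (1): mean residual loss over the patch.
    W_sigma is ASSUMED here to be a saturated squared residual."""
    c, n = plane
    r = (patch - c) @ n                  # signed point-to-plane residuals
    w = 1.0 - np.exp(-(r / sigma) ** 2)  # assumed robust loss, in [0, 1)
    return w.mean()

def best_fitting_patch(patches, sigma):
    # Fit a plane to each candidate patch and keep the patch whose
    # plane explains it best (lowest energy).
    planes = [plane_from_pca(q) for q in patches]
    scores = [patch_energy(q, p, sigma) for q, p in zip(patches, planes)]
    i = int(np.argmin(scores))
    return i, planes[i]
```

Under this reading, a patch lying entirely on one side of a sharp edge yields near-zero energy, while a patch straddling the edge is penalized, which matches the motivation in Fig. 2 for discarding neighbors from the other surface.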
