

Article

Estimating Tree Position, Diameter at Breast Height, and Tree Height in Real-Time Using a Mobile Phone with RGB-D SLAM

Yongxiang Fan 1, Zhongke Feng 1,*, Abdul Mannan 1, Tauheed Ullah Khan 2, Chaoyong Shen 1 and Sajjad Saeed 3

1 Precision Forestry Key Laboratory of Beijing, Beijing Forestry University, Beijing 100083, China; [email protected] (Y.F.); [email protected] (A.M.); [email protected] (C.S.)
2 School of Nature Conservation, Beijing Forestry University, Beijing 100083, China; [email protected]
3 College of Forestry, Beijing Forestry University, Beijing 100083, China; [email protected]
* Correspondence: [email protected]

 Received: 8 October 2018; Accepted: 19 November 2018; Published: 21 November 2018 

Abstract: Accurate estimation of tree position, diameter at breast height (DBH), and tree height is an important task in forest inventory. Mobile Laser Scanning (MLS) is an important solution for this task. However, the poor global navigation satellite system (GNSS) coverage under the canopy makes the MLS system unable to provide globally-consistent point cloud data, and thus, it cannot accurately estimate forest attributes. Simultaneous localization and mapping (SLAM) could be an alternative to solutions dependent on GNSS. In this paper, a mobile phone with RGB-D SLAM was used to estimate tree position, DBH, and tree height in real-time. The main aims of this paper include (1) designing an algorithm to estimate the DBH and position of a tree using the point cloud from the time-of-flight (TOF) camera and the camera pose; (2) designing an algorithm to measure tree height using the perspective projection principle of a camera and the camera pose; and (3) showing the measurement results to the observer using augmented reality (AR) technology to allow the observer to intuitively judge the accuracy of the measurement results and re-estimate them if needed. The device was tested in nine square plots with 12 m sides. The tree position estimations were unbiased and had a root mean square error (RMSE) of 0.12 m in both the x-axis and y-axis directions; the DBH estimations had a 0.33 cm (1.78%) BIAS and a 1.26 cm (6.39%) RMSE; the tree height estimations had a 0.15 m (1.08%) BIAS and a 1.11 m (7.43%) RMSE. The results showed that the mobile phone with RGB-D SLAM is a potential tool for obtaining accurate measurements of tree position, DBH, and tree height.

Keywords: RGB-D SLAM; smart phone; real-time estimation; augmented reality; forest inventory; point cloud

1. Introduction

Forest ecosystems are considered important for the survival of both animals and human beings because of their environmental and socio-economic benefits. They not only provide services such as soil and water conservation, carbon storage, climate regulation, and biodiversity, but also provide food, wood, and energy [1–3]. Forest resource information is an important basis for making decisions at different levels according to various needs, such as timber harvesting, biomass, species diversity, watershed protection, and climate change impact evaluation [4]. Forest inventory primarily involves the collection of forest resource information, which aims to provide accurate estimates of forest characteristics including the wood volume, biomass, or species diversity within the region of interest [5]. These attributes are precisely estimated by models constructed using tree species,

diameter at breast height (DBH), and tree height [6]. The conventional instruments used to measure these properties are diameter tapes and clinometers for the DBH and tree heights, respectively [6,7]. Forest inventory has gradually improved with the advancement of remote sensing sensor technology, computing capabilities, and the Internet of Things (IoT) [8–11]. New technologies with high levels of precision and accuracy in the field of forestry have been introduced in recent years [12].

Terrestrial laser scanning (TLS), also known as ground-based Light Detection and Ranging (LiDAR), provides a solution for efficiently collecting the reference data. This technology has been used to extract various forestry attributes [6,13–16]. However, TLS has some operational and performance limitations, especially when it is used in large forests, as it is time consuming, laborious, and intensive to carry and mount [17]; moreover, its data processing is also difficult [15]. Mobile Laser Scanning (MLS) is a vehicle-borne system with an inertial navigation system (INS) and a laser scanner, which makes it possible to move and measure the 6 degrees-of-freedom (DOF) pose of the laser scanner in forests [17–21]. However, the poor GNSS coverage under the canopy could hinder the positioning accuracy, hence making it difficult to achieve a globally-consistent point cloud [22].

Simultaneous localization and mapping (SLAM) is a process that can simultaneously generate a map of unfamiliar environmental conditions and locate the mobile platform [23,24]. This technology is a potential solution that can use sensors such as cameras and lasers to perform relative positioning without GNSS signals through SLAM algorithms in real-time. Some previous studies initially applied SLAM to the forestry inventory process [22,25–27]. However, the MLS system is relatively complex, heavy, and expensive.

The Time of Flight (TOF) camera is an alternative to LiDAR, and its basic principle is consistent with that of LiDAR. The difference is that TOF cameras generally use infrared as the light source, and do not use point-by-point scanning, but rather, an integrated matrix of multiple TOF sensors to measure multiple distances simultaneously [28]. TOF cameras are power efficient and smaller than LiDARs, and can even be integrated into a mobile phone [29]. The combination of a TOF camera and an RGB camera forms an RGB-D camera that can acquire the texture and depth information of the surrounding environment, and it is often used as the input to a SLAM system, which is then known as an RGB-D SLAM system [30]. This sensor overcomes the inability of a monocular SLAM system to obtain scale, as well as the correspondence problem of stereo SLAM [30]. With the improvement of SLAM algorithms and the advancement of chip computing capabilities, an RGB-D SLAM system can even run on a mobile phone [29]. Compared to the traditional MLS system, the mobile phone has the advantages of being portable, inexpensive, and highly integrated.

Previously, researchers mainly used the SLAM algorithm to improve the pose estimation accuracy of the MLS system offline [25–27,31]. Some other researchers also used point clouds from RGB-D SLAM to obtain tree positions and DBHs offline instead of emphasizing real-time access to forest attributes [32,33]. This paper aims to estimate forest attributes in real-time in a forest inventory using a mobile phone with RGB-D SLAM.
We design algorithms for the mobile phone with RGB-D SLAM to evaluate tree position, DBH, and tree height in forests online. In addition, AR technology is used to display the estimated results on the screen of the mobile phone, enabling the observer to visually evaluate their accuracy. This paper is organized as follows. The SLAM algorithms and the technology of the portable SLAM device used in this paper are described in Section 2. Section 3 describes the study material, the designed system flow, and the algorithms involved in the estimation process. Section 4 provides the experimental results for nine field plots. Section 5 discusses the results through comparison with previous studies. Section 6 presents the conclusions.

2. Theory and Technology

2.1. SLAM

SLAM is the process by which a mobile platform builds a map of the surrounding environment and finds its location using the map in real-time, as shown in Figure 1.

Figure 1. The simultaneous localization and mapping (SLAM) problem. Here, x_k is the state vector describing the pose of the mobile platform at time k; u_k is the motion vector describing the movement of the platform from time k − 1 to time k; m_j is the vector describing the position of the jth landmark.

A mobile platform moves in an unknown area and, at the same time, separately observes the surrounding unknown landmarks and records their relative motion through the mounted visual sensors and motion sensors. Then, the relative positions of the landmarks and the pose of the mobile platform are estimated in real time. In probabilistic form, data from the motion sensors u_k are described as a motion model (a probability distribution):

$P(x_k \mid x_{k-1}, u_k)$. (1)

Here, x_k, x_{k−1} are the platform poses at times k and k − 1, respectively. Data from the vision sensors z_k are described as an observation model (a probability distribution):

$P(z_k \mid x_k, m)$. (2)

Here, m is the set of all landmarks. Generally, the SLAM problem is best described as a probabilistic Markov chain. That is, the joint posterior probability at time k, $P(x_k, m \mid Z_{0:k}, U_{0:k}, x_0)$, is solved using the joint posterior probability at time k − 1, $P(x_{k-1}, m \mid Z_{0:k-1}, U_{0:k-1}, x_0)$, and some other conditions. In the SLAM algorithm, this is implemented using recursive steps, namely, predictive update and observation update forms:

Predictive update

$P(x_k, m \mid Z_{0:k-1}, U_{0:k}, x_0) = \int P(x_k \mid x_{k-1}, u_k) \, P(x_{k-1}, m \mid Z_{0:k-1}, U_{0:k-1}, x_0) \, dx_{k-1}$ (3)

Observation update

$P(x_k, m \mid Z_{0:k}, U_{0:k}, x_0) = \dfrac{P(z_k \mid x_k, m) \, P(x_k, m \mid Z_{0:k-1}, U_{0:k}, x_0)}{P(z_k \mid Z_{0:k-1}, U_{0:k})}$ (4)
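To make the recursion in Equations (3) and (4) concrete, the following sketch runs one predictive update and one observation update for a toy 1D localization problem on a discretized state grid. The map (a single landmark) is assumed known, so only the pose belief is tracked; the grid size, noise parameters, and measurements are illustrative assumptions, not values from the paper.

```python
import numpy as np

N = 100                             # discretized 1D positions (grid cells)
belief = np.full(N, 1.0 / N)        # uniform prior P(x_0)
landmark = 70                       # known landmark position on the grid

def predictive_update(belief, u_k, sigma=2.0):
    # Equation (3): integrate the motion model P(x_k | x_{k-1}, u_k) against
    # the previous belief; on a grid this is a convolution with a motion kernel.
    offsets = np.arange(-15, 16)
    kernel = np.exp(-0.5 * ((offsets - u_k) / sigma) ** 2)
    kernel /= kernel.sum()
    return np.convolve(belief, kernel, mode="same")

def observation_update(belief, z_k, sigma=3.0):
    # Equation (4): multiply the predicted belief by the observation likelihood
    # P(z_k | x_k, m) and renormalize (the denominator is the evidence term).
    expected = np.abs(landmark - np.arange(N))      # expected range per cell
    likelihood = np.exp(-0.5 * ((z_k - expected) / sigma) ** 2)
    posterior = likelihood * belief
    return posterior / posterior.sum()

# One filter step: move about 10 cells, then observe a range of 40 to the landmark.
belief = predictive_update(belief, u_k=10)
belief = observation_update(belief, z_k=40)
print("most probable position:", np.argmax(belief))   # near cell 30
```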


Obviously, the key to solving the SLAM problem is to give a reasonable expression of the motion model and the observation model, and then the predictive update and observation update are used to solve the target probability distribution. Three commonly-used methods for solving SLAM problems are extended Kalman filtering (EKF), sparse nonlinear optimization, and particle filtering.

2.2. The Technology of a Portable Graph-SLAM Device

SLAM algorithms have been implemented to track device positions in real-time to build maps of the surrounding three-dimensional (3D) world. Though most SLAM software today works only on high-powered computers, Project Tango [29] enables this technology to run on a portable mobile platform (smartphone). Project Tango created a technology that enables mobile devices to acquire their poses in the three-dimensional world in real time using highly customized hardware and software. Three-dimensional maps can also be constructed by combining pose data with texture and depth information.

Figure 2 shows the smartphone with Google Tango sensors (Lenovo Phab 2 Pro [31]) which was used in this paper. The mobile phone contains a combination of an RGB camera, a time-of-flight camera, and a motion-tracking camera, called a vision sensor. The sensor is used to acquire texture and depth information of the 3D world so that the phone has a similar perspective of the real world. A 9-axis acceleration/gyroscope/compass sensor combined with the vision sensor can be used to implement a Visual-Inertial Odometer system using SLAM algorithms. Some special hardware, such as a computer-vision processor, is used to help speed up data processing so that the pose of the device can be acquired and adjusted in real-time.

Figure 2. Structure of a smart phone (Lenovo Phab 2 Pro) with RGB-D SLAM.

The device uses a Visual-Inertial Odometer system for the front end, and back-end optimization algorithms implement adjustments to account for odometry drift. These adjustments are mainly done through pose graph non-linear optimization and loop closure optimization. These optimizations use visual features to identify previously-visited regions and then adjust the corresponding camera pose drift during this process.

3. Materials and Methods

3.1. Study Area

This study was conducted in a managed forest located in the suburbs of Beijing, China (N39°59′, E116°11′; Figure 3). We took nine square plots (12 m × 12 m) based on a wide range of DBH and tree height values. The selected plots had fewer shrubs and were accessible to locals. Table 1 presents a summarized basic description of the plots.



Figure 3. Location of study area.

Table 1. Summary statistics of plot attributes.

| Plot | Tree Number | Dominant Species | DBH Mean (cm) | DBH SD (cm) | DBH Min (cm) | DBH Max (cm) | Height Mean (m) | Height SD (m) | Height Min (m) | Height Max (m) |
|---|---|---|---|---|---|---|---|---|---|---|
| 1 | 20 | Metasequoia glyptostroboides | 14.1 | 2.2 | 8.5 | 17.5 | 11.1 | 3.1 | 3.8 | 17.8 |
| 2 | 26 | Ulmus spp. | 16.3 | 6.7 | 6.4 | 31.3 | 12.2 | 4.3 | 4.9 | 21.7 |
| 3 | 22 | Fraxinus chinensis & Ulmus spp. | 20.7 | 6.9 | 12.4 | 34.5 | 9.5 | 3.4 | 5.4 | 15.9 |
| 4 | 28 | Ginkgo biloba | 16.5 | 1.6 | 13.1 | 19.6 | 11.3 | 1.0 | 9.6 | 13.9 |
| 5 | 19 | Populus spp. | 26.1 | 2.3 | 22.0 | 30.2 | 25.1 | 4.5 | 14.9 | 37.3 |
| 6 | 17 | Populus spp. | 27.8 | 2.6 | 23.2 | 32.0 | 25.4 | 2.6 | 19.1 | 29.5 |
| 7 | 21 | Styphnolobium japonicum | 12.9 | 4.2 | 6.1 | 21.5 | 11.7 | 3.1 | 4.6 | 18.1 |
| 8 | 20 | Fraxinus chinensis | 16.1 | 4.0 | 10.2 | 23.5 | 10.8 | 3.7 | 1.2 | 16.4 |
| 9 | 20 | Ginkgo biloba | 19.6 | 2.4 | 15.6 | 23.7 | 12.8 | 1.8 | 10.4 | 17.5 |

3.2. Methods

3.2.1. The System Workflow

An application was developed to enable the SLAM mobile phone to be used for forestry inventory. Figure 4 shows the workflow of our hardware and software system. The SLAM system uses an RGB-D camera and an inertial measurement unit (IMU) as inputs and produces RGB images, point clouds, poses, and time stamps. Our forest inventory system uses these data, and then interacts with the Android system to show the results on a screen or accept instructions from users. The forest inventory system includes mapping the plot ground and tree-by-tree estimation, as shown in Figure 5. The mapping process should provide a globally consistent map and the plot coordinate system for the tree-by-tree estimation.

Figure 4. The workflow of our system.



Figure 5. The workflow of the forest inventory system for data acquirement.

3.2.2. Mapping of the Plot Ground

The plot ground was mapped before observing the trees. That mapping was used to correct the poses of the mobile phone when observing the trees, through loop-closure detection and pose graph nonlinear optimization. To obtain a globally consistent plot ground map, the scan path was designed as shown in Figure 6a. The plot center was used as the starting point of the mapping process towards the north of the plot, while the end was near to the north point. Figure 6b shows a typical instance of the mapping path.

Figure 6. The mapping path of the plot ground. (a) Pre-survey designed scan path map for collecting data with the mobile phone; (b) Post-survey data collection and actual path map.

Building the Plot Coordinate System

Before measuring the tree attributes, we built a new coordinate system during the mapping process to describe the position of each tree in the plot. This coordinate system, known as the Plot Coordinate System (PCS), has the center of the plot as its origin, and the horizontal eastward/horizontal northward/vertical upward directions were defined as the x/y/z-axis directions. However, the mobile phone defined a coordinate system known as the Initial Coordinate System (ICS), in which the origin is the position of the device when it is initiated, the x-axis is towards the right side of the phone screen, the y-axis is directed vertically upward, and the z-axis is towards the mobile phone screen. Conversion of the coordinate system was needed after the center and the north corner of the plot had been tapped on the screen during the mapping process, as shown in Figure 7a–c. The center of the plot was tapped at the beginning of the mapping process, and the north corner was tapped at the end of the mapping process. After that, the transformation matrix between the PCS and the ICS was obtained. Then, the PCS was transformed into the OpenGL coordinate system and displayed as shown in Figure 7d.



Figure 7. Different states of the SLAM device during the observation process. (a) Waiting to tap the plot center on the screen; (b) waiting to move towards the north corner of the plot; (c) waiting to tap the north corner on the screen; (d) completion of the PCS building; (e) waiting to tap a bottom point of a tree; (f) waiting to scan the breast height section of the tree; (g) completion of fitting the circle at breast height; (h) waiting to tap the position of the treetop on the screen; (i) completion of the estimation of the tree height; (j) finished estimating the attributes of the tree.


3.2.3. Estimation of the Stem Position, DBH, and Tree Height

Estimation of the stem position and DBH

After mapping the plot ground, we were able to observe each tree individually. While observing a standing tree, a point near the bottom of the tree was acquired to determine the breast height and the approximate stem position (see Figure 7e,f). After the bottom point had been taken, the breast height was displayed on the screen, as shown in Figure 7f. Then, the position of the stem center and the DBH were calculated using the current camera pose and the point cloud (Figure 8).

Figure 8. The RGB image and depth image used to calculate the stem position and DBH: (a) the RGB image; (b) the depth image.

For convenience of operation, an auxiliary coordinate system (Auxiliary Camera Coordinate System, ACCS) was established with the same origin as the camera coordinate system (CCS); its y-axis direction was the same as the y-axis direction of the PCS (vertically upward), and its z-axis was in the plane formed by the new y-axis and the z-axis of the CCS. The point cloud data were constructed and transformed into the ACCS. The points belonging to the tree were filtered according to the bottom point (x_ac_bottom). Figure 9 shows the scatter plot of these points on the plane O_ac−x_ac−z_ac, while the filter conditions were set to

$x_{ac\_bottom} - 1 < x_{ac} < x_{ac\_bottom} + 1$
$z_{ac\_bottom} - 1 < z_{ac} < z_{ac\_bottom} + 1$ (5)
$y_{ac\_bottom} - 1.25 < y_{ac} < y_{ac\_bottom} + 1.25$

Figure 9. The scatter plot of all filtered points on the plane O_ac−x_ac−z_ac.

The marginal distribution of all filtered points in the x-axis direction was approximately uniform (Figure 10). When the mean (μ_x) and standard deviation (σ_x) of the points were calculated, the range of the stem was (μ_x − √3σ_x, μ_x + √3σ_x) in the x-axis direction. RANSAC (Random Sample Consensus) was used to improve the robustness of the interval in this process (Figure 11).
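As a concrete illustration, the following sketch implements the filter in Equation (5) and the uniform-distribution boundary estimate; for a uniform distribution on (a, b) the standard deviation is (b − a)/(2√3), so the interval is recovered as μ ∓ √3σ. A simple subsampling loop stands in for the RANSAC procedure; the array layout, sample sizes, and names are illustrative assumptions.

```python
import numpy as np

def filter_stem_points(points_acc, bottom):
    # Equation (5): keep points within +/-1 m of the tapped bottom point in x
    # and z, and within +/-1.25 m in y, so the slab spans breast height.
    # `points_acc` is an (N, 3) array of (x, y, z) coordinates in the ACCS.
    x, y, z = points_acc.T
    bx, by, bz = bottom
    keep = ((np.abs(x - bx) < 1.0) &
            (np.abs(z - bz) < 1.0) &
            (np.abs(y - by) < 1.25))
    return points_acc[keep]

def stem_interval_x(points_acc, n_iter=100, sample_frac=0.5, seed=0):
    # The x-marginal of the stem points is treated as approximately uniform,
    # so the stem spans (mu - sqrt(3)*sigma, mu + sqrt(3)*sigma). A RANSAC-style
    # loop over random subsamples makes the interval robust to stray points.
    rng = np.random.default_rng(seed)
    x = points_acc[:, 0]
    lo, hi = [], []
    for _ in range(n_iter):
        s = rng.choice(x, size=max(2, int(sample_frac * len(x))), replace=False)
        mu, sigma = s.mean(), s.std()
        lo.append(mu - np.sqrt(3) * sigma)
        hi.append(mu + np.sqrt(3) * sigma)
    return np.median(lo), np.median(hi)
```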


Figure 10. The edge distribution of all filtered points in the x-axis direction.

Figure 11. The stem boundary detection result by a uniform distribution: (a) the boundaries in the RGB image; (b) the boundaries on the plane O_ac−x_ac−z_ac.

The two edges of the stem were underestimated due to the assumption of a uniform distribution. The stem edges were therefore adjusted by linearly fitting the points near the stem edges, and the point on each fitted line that was farthest from the previous estimation interval was used as the boundary point (Figure 12); RANSAC was used in this process too.
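A minimal sketch of this refinement step is given below; the 0.05 m neighborhood width around each provisional edge is an illustrative assumption, and RANSAC is again omitted for brevity.

```python
import numpy as np

def refine_edge(points_xz, x_edge, side, width=0.05):
    # Fit a line x = a*z + b through the points near a provisional edge, then
    # take the fitted point farthest outside the previous interval as the new
    # boundary. `points_xz` is an (N, 2) array of (x, z) stem points.
    near = points_xz[np.abs(points_xz[:, 0] - x_edge) < width]
    if len(near) < 2:
        return x_edge                      # too few points: keep the old edge
    a, b = np.polyfit(near[:, 1], near[:, 0], deg=1)
    x_fit = a * near[:, 1] + b
    return x_fit.min() if side == "left" else x_fit.max()
```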

Figure 12. The stem boundary detection result after linear fitting: (a) the boundaries in the RGB image; (b) the boundaries on the plane O_ac−x_ac−z_ac.

The TOF camera uses the principle of perspective projection to obtain images. So, when the stem edge points were defined as T1 and T2, both were the tangent points of the stem circle and the lines T1−O_ac and T2−O_ac. The stem circle needed to meet the following conditions: ① the lines T1−O_ac and T2−O_ac had to be the tangents of the circle and points T1, T2 had to be the tangent points; and ② the points between point T1 and point T2 had to be in the circle. Condition ① could be expressed as the cost functions:

$r = w \cdot [\, z_{T_1}(z_c - z_{T_1}) + x_{T_1}(x_c - x_{T_1}) \,]$ (6)

$r = w \cdot [\, z_{T_2}(z_c - z_{T_2}) + x_{T_2}(x_c - x_{T_2}) \,]$ (7)


$r = w \cdot [\, d^2/4 - (x_c - x_{T_1})^2 - (z_c - z_{T_1})^2 \,]$ (8)

$r = w \cdot [\, d^2/4 - (x_c - x_{T_2})^2 - (z_c - z_{T_2})^2 \,]$ (9)

Here, r is the residual to be optimized; w is the weight of the cost function; X_T1 = (x_T1, z_T1) and X_T2 = (x_T2, z_T2) are the coordinates of points T1 and T2 in the plane O_ac−x_ac−z_ac; X_c = (x_c, z_c) is the stem center coordinate; and d is the DBH of the tree. Each point in Condition ② can be constructed as the cost function

$r = d^2/4 - (x_c - x_i)^2 - (z_c - z_i)^2$. (10)

Here, X_i = (x_i, z_i) is one of the points between points T1 and T2 in the circle. Because Condition ① is more important for the circle fitting than Condition ②, the weight w should have a large value. In this paper, it was determined according to Condition ②: if the number of points in the optimization process in Condition ② was defined as N, the weight w was equal to N/4. After the cost functions had been constructed, the Levenberg-Marquardt algorithm was used to fit the stem circle. To increase the robustness of the optimization result, RANSAC was used in the fitting process. Figure 13 shows the fitted circle. Then, the stem position coordinate was converted into the PCS and projected into the OpenGL coordinate system, so the result could be viewed on the display (Figure 7g).
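A compact sketch of this fit, using SciPy's Levenberg-Marquardt solver over the residuals in Equations (6)–(10), is shown below. The RANSAC wrapper is omitted, the initial diameter guess is an illustrative assumption, and the inputs are (x, z) pairs in the O_ac−x_ac−z_ac plane.

```python
import numpy as np
from scipy.optimize import least_squares

def fit_stem_circle(T1, T2, pts, d0=0.2):
    # T1, T2: detected boundary (tangent) points; pts: interior stem points.
    N = len(pts)
    w = N / 4.0                          # condition-(1) weight, as in the text

    def residuals(params):
        xc, zc, d = params
        r = []
        for (xt, zt) in (T1, T2):
            # Equations (6)-(7): the radius at each tangent point must be
            # orthogonal to the viewing ray from the camera origin O_ac.
            r.append(w * (zt * (zc - zt) + xt * (xc - xt)))
            # Equations (8)-(9): the tangent points lie on the circle.
            r.append(w * (d**2 / 4 - (xc - xt)**2 - (zc - zt)**2))
        for (xi, zi) in pts:
            # Equation (10): points between T1 and T2 lie on the circle.
            r.append(d**2 / 4 - (xc - xi)**2 - (zc - zi)**2)
        return np.asarray(r)

    x0 = np.array([*np.mean([T1, T2], axis=0), d0])  # start between T1 and T2
    sol = least_squares(residuals, x0, method="lm")  # Levenberg-Marquardt
    xc, zc, d = sol.x
    return (xc, zc), abs(d)                          # stem center and DBH
```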

Figure 13. The fitted stem circle at breast height.

Estimation of the tree height

To estimate the height of the tree, the device was kept at a position where the treetop could be observed (Figure 7h). After tapping the treetop position on the screen, the tree height was calculated in real time. As the treetop was on the connection line between the optical center of the camera and the tapped pixel (Figure 14), the coordinates of the treetop point were expressed as

$x_{p\_top} = t_{p\_c} + k R_{p\_c} K^{-1} u$. (11)

Here, x_p_top = (x_p_top, y_p_top, z_p_top)^T is the coordinate of the treetop point in the PCS; t_p_c is the translation vector of the camera in the PCS; R_p_c is the rotation matrix of the camera in the PCS; K is the intrinsic matrix of the camera; u = (u, v, 1)^T is the coordinate of the treetop pixel in the pixel coordinate system; k is an unknown factor. However, additional conditions were needed to determine k. If the standing tree was assumed to be vertical, the treetop would be on the vertical line passing through the center of the stem; but since a natural stem is not exactly vertical and the tapped treetop pixel showed deviation, the two lines were difficult or even impossible to intersect. To solve this problem, we assumed that the treetop was on the plane through the stem center, whose normal vector is denoted by


$n = \begin{bmatrix} 1 & 0 & 0 \\ 0 & 0 & 0 \\ 0 & 0 & 1 \end{bmatrix} (t_{p\_c} - x_{p\_center})$. (12)

Here, x_p_center = (x_p_center, y_p_center, z_p_center)^T is the central coordinate of the stem. In this way, the treetop coordinate satisfied

$(x_{p\_top} - x_{p\_center}) \cdot n = 0$. (13)

Then, after being calculated from (11) and (13), the treetop coordinate x_p_top was

$x_{p\_top} = t_{p\_c} + \dfrac{(x_{p\_center} - t_{p\_c}) \cdot n}{(R_{p\_c} K^{-1} u) \cdot n} \, (R_{p\_c} K^{-1} u)$. (14)

Naturally, the tree height h equaled

$h = y_{p\_top} - y_{p\_center} + 1.3$. (15)

As shown in Figure 7i,j, the treetop point x_p_top was projected into the OpenGL coordinate system and displayed on the screen.
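The following sketch chains Equations (11)–(15) together, assuming the camera pose (R_p_c, t_p_c), the intrinsic matrix K, the tapped pixel, and the stem center from the circle fit are already available. Note that, following Equations (12) and (15), the vertical direction is taken as the y-axis here; all inputs are placeholders.

```python
import numpy as np

def tree_height(R_pc, t_pc, K, pixel_uv, x_center):
    # Viewing ray through the tapped treetop pixel, rotated into the PCS.
    u = np.array([pixel_uv[0], pixel_uv[1], 1.0])
    ray = R_pc @ np.linalg.inv(K) @ u                 # R_pc K^-1 u in Eq. (11)

    # Equation (12): horizontal normal of the vertical plane through the stem
    # center (the vertical y-component of t_pc - x_center is zeroed out).
    n = np.diag([1.0, 0.0, 1.0]) @ (t_pc - x_center)

    # Equation (14): the scale k that puts the ray point on that plane.
    k = np.dot(x_center - t_pc, n) / np.dot(ray, n)
    x_top = t_pc + k * ray                            # Equation (11)

    # Equation (15): height above the stem center plus 1.3 m breast height.
    return x_top[1] - x_center[1] + 1.3
```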

Figure 14. The geometric relationship between the points in the tree height solution. O is the center of the stem at breast height; C is the optical center of the camera; T is the treetop; U is the treetop pixel on the image; n is the horizontal component vector of the OC vector; and P is the plane whose normal vector is n and on which O is a point.

3.2.4. Evaluation of the Accuracy of the Stem Position, DBH and Tree Height Measurements

Stem position references were determined from total station (Leica Flexline TS06plus Total Station) measurement data and DBH measurements. When measuring the position of a tree, the total station was installed in the center of the plot. The angle between the stem and north and the horizontal distance from the plot center to the trunk were measured by the total station and recorded. The true position of the stem was calculated from the distance and angle, with the center point of the stem being used for the true position calculation. The stem position errors were described by a mean vector μ and a two-dimensional covariance matrix Σ, as follows:

$\mu = \begin{bmatrix} \mu_x \\ \mu_y \end{bmatrix} = \begin{bmatrix} \frac{1}{n}\sum_{i=1}^{n}(x_i - x_{ir}) \\ \frac{1}{n}\sum_{i=1}^{n}(y_i - y_{ir}) \end{bmatrix}$ (16)

$\Sigma = \frac{1}{n}\sum_{i=1}^{n} \begin{bmatrix} x_i - x_{ir} - \mu_x \\ y_i - y_{ir} - \mu_y \end{bmatrix} \begin{bmatrix} x_i - x_{ir} - \mu_x \\ y_i - y_{ir} - \mu_y \end{bmatrix}^T$ (17)

Here, (x_i, y_i) is the ith position measurement, (x_ir, y_ir) is the ith reference, and n is the number of estimations. The direction of the eigenvector corresponding to the largest eigenvalue of the covariance matrix is the direction with the greatest variability. The standard deviation of this direction is equal to the square root of the largest eigenvalue, which can describe the variability of the trunk position, and can be expressed as

$\sigma_{max} = \sqrt{\max(\mathrm{eigenvalues}(\Sigma))}$. (18)

The references of the DBH were measured using a diameter tape, and the references of the tree height were measured using a total station. The accuracy of the DBH and tree height measurements was evaluated using BIAS, root mean square error (RMSE), relative BIAS, and relative RMSE, as defined in the following equations:

$\mathrm{BIAS} = \frac{\sum_{i=1}^{n}(x_i - x_{ir})}{n}$ (19)

$\mathrm{relBIAS} = \frac{\sum_{i=1}^{n}(x_i / x_{ir} - 1)}{n} \times 100\%$ (20)

$\mathrm{RMSE} = \sqrt{\frac{\sum_{i=1}^{n}(x_i - x_{ir})^2}{n}}$ (21)

$\mathrm{relRMSE} = \sqrt{\frac{\sum_{i=1}^{n}(x_i / x_{ir} - 1)^2}{n}} \times 100\%$ (22)

Here, x_i is the ith measurement, x_ir is the ith reference, and n is the number of estimations. The RMSE was also used to describe the accuracy of the tree positions by computing its value in the x-axis and y-axis directions, respectively.
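A short sketch of these accuracy measures, under the assumption that estimates and references are paired NumPy arrays, is:

```python
import numpy as np

def position_stats(est_xy, ref_xy):
    # est_xy, ref_xy: (n, 2) arrays of estimated and reference (x, y) positions.
    err = est_xy - ref_xy
    mu = err.mean(axis=0)                                  # Equation (16)
    cov = np.cov(err, rowvar=False, bias=True)             # Equation (17), 1/n form
    sigma_max = np.sqrt(np.linalg.eigvalsh(cov).max())     # Equation (18)
    rmse_xy = np.sqrt((err ** 2).mean(axis=0))             # per-axis RMSE
    return mu, cov, sigma_max, rmse_xy

def scalar_stats(est, ref):
    # est, ref: paired 1D arrays of DBH or tree height values.
    bias = np.mean(est - ref)                                # Equation (19)
    rel_bias = np.mean(est / ref - 1) * 100                  # Equation (20), in %
    rmse = np.sqrt(np.mean((est - ref) ** 2))                # Equation (21)
    rel_rmse = np.sqrt(np.mean((est / ref - 1) ** 2)) * 100  # Equation (22), in %
    return bias, rel_bias, rmse, rel_rmse
```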

4. Results

4.1. Evaluation of Tree Position

The estimated stem positions are given in Figure 15. The accuracy of the stem positions was checked against a total station instrument. Our results showed that the range of the average error was approximately −0.12 m to 0.13 m in both the x-axis and y-axis directions, as shown in Table 2. In addition, no significant correlation (approximately −0.08 to 0.35) was found between the errors in the x-axis and y-axis directions, so the standard deviations of the maximum variability direction (0.09~0.16 m) were close to those of the x-axis (0.05~0.16 m) and y-axis (0.05~0.13 m). The RMSEs in the x-axis (0.09~0.17 m) and y-axis (0.07~0.17 m) directions were also close. As shown in Figure 16, although the tree positions estimated for each plot had systematic errors, the overall bias was weak, which shows that the systematic errors were random across the different sample plots. The overall standard deviation in the maximum variability direction was small (0.10 m) across all plots.

Table 2. Accuracy of the stem position estimations using the SLAM smartphone.

| Plot | µx (m) | µy (m) | σx (m) | σy (m) | ρxy | σmax (m) | RMSEx (m) | RMSEy (m) |
|---|---|---|---|---|---|---|---|---|
| 1 | 0.08 | −0.12 | 0.08 | 0.13 | 0.20 | 0.13 | 0.11 | 0.17 |
| 2 | 0.00 | 0.08 | 0.09 | 0.05 | 0.09 | 0.09 | 0.09 | 0.10 |
| 3 | 0.05 | −0.01 | 0.09 | 0.10 | 0.12 | 0.10 | 0.11 | 0.10 |
| 4 | −0.08 | −0.06 | 0.07 | 0.11 | 0.05 | 0.10 | 0.11 | 0.12 |
| 5 | 0.08 | −0.04 | 0.06 | 0.10 | −0.04 | 0.10 | 0.10 | 0.11 |
| 6 | −0.10 | −0.08 | 0.05 | 0.08 | −0.08 | 0.08 | 0.12 | 0.12 |
| 7 | 0.00 | −0.01 | 0.12 | 0.07 | 0.12 | 0.12 | 0.12 | 0.07 |
| 8 | 0.13 | 0.06 | 0.11 | 0.11 | 0.01 | 0.11 | 0.17 | 0.13 |
| 9 | −0.01 | 0.05 | 0.16 | 0.08 | 0.35 | 0.16 | 0.16 | 0.09 |
| Total | 0.01 | −0.01 | 0.10 | 0.09 | 0.10 | 0.10 | 0.12 | 0.12 |

ρxy means correlation coefficient.

Figure 15. Estimated and reference stem positions: the red circles indicate the stem references; the blue crosses indicate the estimations.

Figure 16. The position errors of all trees in the plots.


4.2. Evaluation of DBH

The DBH values measured by the smartphone with the RGB-D sensor were compared with field data that were measured using a diameter tape. Our results showed that our estimated values of DBH were similar to the reference data obtained in the field (Figure 17). Table 3 describes the statistical results of the DBHs in the different plots. A 1.26 cm (6.39%) RMSE and a 0.33 cm (1.78%) BIAS were obtained over all plots. Figure 18 shows that for smaller DBH values, our estimated results were larger than the reference values, while the case was reversed for larger DBH values. It can also be seen from the figure that the dispersion of the observations was relatively stable.

Figure 17. Scatter plot of estimated DBH values.

Figure 18. The errors of DBH observations for different DBH values.

Table 3. Accuracy of the DBH estimations using the SLAM smartphone.

| Plot | RMSE (cm) | relRMSE (%) | BIAS (cm) | relBIAS (%) |
|---|---|---|---|---|
| 1 | 0.73 | 5.21% | 0.41 | 2.91% |
| 2 | 1.80 | 11.09% | 1.26 | 7.73% |
| 3 | 1.50 | 7.27% | 0.75 | 3.63% |
| 4 | 1.05 | 6.40% | 0.82 | 4.99% |
| 5 | 2.22 | 8.39% | −1.64 | −6.21% |
| 6 | 0.89 | 3.19% | −0.32 | −1.16% |
| 7 | 0.51 | 3.97% | 0.40 | 3.06% |
| 8 | 0.79 | 4.93% | 0.49 | 3.04% |
| 9 | 0.39 | 1.99% | −0.15 | −0.74% |
| Total | 1.26 | 6.39% | 0.33 | 1.78% |


4.3. Evaluation of Tree Height

Our estimated results for the tree heights were similar to the reference field data measured by the total station (Figure 19). The statistical results are summarized in Table 4. Except for plots 5 and 6, the estimations generally had small BIAS (0.07 m, 0.54%) and RMSE (0.55 m, 3.71%) values. The dispersion of the observed values increased as the tree height increased, as shown in Figure 20.

Figure 19. Scatter plot of estimated tree heights.

Table 4. Accuracy of the tree height estimations using the SLAM smartphone.

| Plot | RMSE (m) | relRMSE (%) | BIAS (m) | relBIAS (%) |
|---|---|---|---|---|
| 1 | 0.54 | 4.91% | −0.08 | −0.72% |
| 2 | 0.90 | 7.35% | 0.12 | 1.01% |
| 3 | 0.54 | 5.70% | −0.13 | −1.34% |
| 4 | 0.56 | 4.94% | 0.05 | 0.44% |
| 5 | 1.88 | 7.47% | −0.83 | −3.31% |
| 6 | 2.44 | 9.58% | 2.08 | 8.18% |
| 7 | 0.76 | 6.53% | 0.41 | 3.50% |
| 8 | 0.46 | 4.22% | −0.18 | −1.67% |
| 9 | 0.75 | 5.90% | −0.16 | −1.25% |
| Total | 1.11 | 7.43% | 0.15 | 1.08% |

Figure 20. The errors in the tree height observations for different tree heights.



5. Discussion

SLAM technology provides a solution for positioning without GNSS signals in places like forests. In this study, we measured the positions, DBHs, and heights of trees in real-time using the poses and dense point cloud data provided by a phone with RGB-D SLAM. The tree positions were obtained from the device pose and a circle fitted to the points around breast height on each tree trunk. The drift of the phone pose mainly affected the accuracy of the tree positions. In this paper, a globally consistent map of the plot ground was pre-built; it can be used to reduce the drift through loop-closure detection and pose graph nonlinear optimization. Our results showed an error of approximately −0.12 to 0.13 m on both the x-axis and the y-axis and a 0.1~0.16 m standard deviation in the maximum variability direction; the RMSEs were 0.09~0.17 m and 0.07~0.17 m in the x-axis and the y-axis, respectively. Many studies have examined the extraction of tree positions, but few of them have reported positional accuracy. Reference [22] used an unmanned ground vehicle with 3D LiDAR and graph-SLAM to scan the study area, and the error of the relative distance estimation between trees was 0.0476 m. Reference [25] used a small-footprint mobile LiDAR to scan the study area, resulting in RMSEs of the motion trajectories in two different areas of 0.32 m and 0.29 m, respectively, but no detail of the positional accuracy of the trees was provided. However, those studies were conducted over relatively larger study areas than ours; therefore, our method of using the mobile phone with RGB-D SLAM still needs to be tested for accuracy over large areas. Reference [33] used Tango-based point clouds to extract tree positions offline, showing an RMSE value of 0.105 m for distances from the central point in the better scanning pattern, which is similar to our results.

We attempted to attain the DBH values from point clouds. This method has been widely studied. Reference [34] extracted DBH values by fitting circles to the point cloud at breast height using three different algorithms (Lemen, Pratt, and Taubin), and the results showed that the BIAS was approximately −0.12 to 0.07 cm and the RMSE was 1.47~2.43 cm in single-scan mode, while in merged-scan mode, the BIAS was approximately −0.32 to 0 cm and the RMSE was 0.66~1.21 cm. Similarly, Reference [15] extracted DBH values by fitting cylinders to point cloud data from TLS, showing a BIAS of approximately −0.18 to 0.76 cm and an RMSE of 0.74~2.41 cm when using the single-scan method, whereas the BIAS was 0.11~0.77 cm and the RMSE was 0.90~1.90 cm when using the multi-single-scan method. Reference [35] extracted DBH values using the Hough transformation and the RANSAC algorithm; their results showed a BIAS of approximately −0.6 to 1.3 cm and an RMSE of 2.6~5.9 cm. Reference [33] obtained DBHs from Tango-based point clouds, and the RMSEs were in the range of 1.61~2.10 cm. This paper showed a 1.26 cm (6.39%) RMSE and a 0.33 cm (1.78%) BIAS. In our study, the point clouds for detecting DBH originated from a TOF camera instead of a LiDAR; therefore, our work added tangential and tangent-point constraints by detecting the trunk boundaries and using the perspective projection principle of the TOF camera. This could be a possible reason why higher accuracies were obtained with a low-quality point cloud compared to LiDAR.
In addition, the AR technology in this paper showed the fitted circle at breast height on the screen in real-time, making it possible for the observer to judge whether it needed to be re-fitted or not, thus artificially reducing the interference caused by noise points.

The method used to measure tree height in this paper was similar to that of the traditional altimeter, which calculates the height from the distance between the observation position and the tree and the inclinations of the tree bottom and the treetop; the underlying geometry is sketched below. The results showed a BIAS of approximately −0.83 to 2.08 m and an RMSE of 0.46~2.44 m. As shown in Figure 20, the measurements had high precision when the tree was not higher than 20 m. AR technology enabled the observer to judge whether the displayed treetop position was appropriate and to adjust it if needed; nevertheless, the results may still be influenced by occlusion and by the subjective decisions of the observers. Reference [36] evaluated the Laser-relascope, a classical instrument, and reported a −0.016 m BIAS and a 0.190 m standard deviation. The limitation of the Laser-relascope is that it needs to be mounted on a fixed site, whereas our device does not. Reference [15] used the height difference between the ground level and the highest point of the point cloud around the tree model, and obtained results with a BIAS of approximately −1.30 to 1.50 m and an RMSE of 1.36~4.29 m when using the single-scan method, while the BIAS was approximately −0.34 to 2.11 m and the RMSE was 2.04~6.53 m when using the multi-single-scan method. Reference [7] examined the consistency of the point cloud at the treetops when exclusively using TLS data; compared to the previous method, the BIAS was approximately −0.3 to 0.11 m and the RMSE was 0.31~0.8 m. The point cloud at the treetop is difficult to obtain due to occlusion, so such estimates were not accurate. The device used in this article can acquire point clouds using the TOF camera, but its measuring range is only 0.4~3.0 m, which makes it hard to obtain the point cloud at the treetop.
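The altimeter geometry mentioned above reduces to a single line of trigonometry. In the actual system, the horizontal distance and the two inclinations are derived from the SLAM camera pose and the selected tree bottom and treetop rather than read from separate instruments; the sketch below, with illustrative names, shows only the underlying relation.

import math

def tree_height(distance_m, top_angle_deg, bottom_angle_deg):
    """Height from the horizontal distance to the stem and the
    inclinations of the treetop and the tree bottom (angles above
    the horizontal are positive, below it negative)."""
    return distance_m * (math.tan(math.radians(top_angle_deg))
                         - math.tan(math.radians(bottom_angle_deg)))

# Example: 15 m from the stem, treetop at +40 deg, tree bottom at -5 deg.
print(f"{tree_height(15.0, 40.0, -5.0):.1f} m")  # about 13.9 m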
In the SLAM algorithm, corners or blobs are often used as visual feature points. A good feature should have localization accuracy (in both position and scale), repeatability, robustness, etc. However, it is difficult to find good corners or blobs, especially in forests with complex ground conditions, such as shrubs. The plot data used in this article were taken from human-accessible areas with few shrubs on the ground; most forests will not meet these special conditions. In addition, the RGB camera needs to search for feature points in an environment with sufficient illumination, while the TOF camera is susceptible to strong illumination because it uses infrared light as its light source. Therefore, the device is only suitable for use under the canopy during the day.

Although the device is a possible option for forestry surveys, it still has some limitations. For example, (1) although the device can estimate forest inventory parameters in real-time, it is less efficient than TLS, MLS, or the previous uses of SLAM because of the need to visit each tree individually; (2) the accuracy of our tree height estimates for taller trees was somewhat compromised, perhaps due to drift; and (3) the tree position coordinates obtained by the device do not use a geodetic coordinate system, but rather a plot coordinate system with its origin at the center of the plot (a possible post-processing remedy is sketched below). In addition, the device uses a manual selection method to locate the bottom of the tree. Unfortunately, the Google Tango Project has been terminated, i.e., it is no longer being developed or supported. However, ARKit [37], a monocular SLAM system, was released by Apple, and Google has introduced a similar solution, ARCore [38]. These two technologies allow simultaneous localization and mapping on ordinary phones (without a TOF camera). Because these technologies are new, many aspects, such as how to obtain dense point clouds in real-time, still need to be investigated in the future.
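Regarding limitation (3), the plot-local positions could be georeferenced in post-processing if the map coordinates of the plot center and the azimuth of the plot y-axis were surveyed separately, for example with GNSS in a canopy gap. This step is not part of the system described here; the sketch below is a plain 2D rotation and translation under those assumptions, with illustrative names.

import math

def plot_to_map(x_local, y_local, center_easting, center_northing, azimuth_deg):
    """Transform plot-local coordinates to map coordinates, given the
    surveyed map coordinates of the plot center and the clockwise
    azimuth from grid north to the plot's local y-axis."""
    a = math.radians(azimuth_deg)
    easting = center_easting + x_local * math.cos(a) + y_local * math.sin(a)
    northing = center_northing - x_local * math.sin(a) + y_local * math.cos(a)
    return easting, northing

# Example: a tree 1.2 m east and 0.8 m south of the plot center.
print(plot_to_map(1.2, -0.8, 450123.0, 4567890.0, 30.0))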

6. Conclusions

This paper provided a solution for estimating tree position, DBH, and tree height under the canopy in real-time, without GNSS signals, using a mobile phone with RGB-D SLAM. The tree position and DBH were estimated by fitting circles to the point cloud data, with added tangent-point and tangent-line constraints. For tree height, the difference from the traditional altimeter is that this article used the pose data provided by the SLAM system in real-time instead of measured angles and distances. The results showed that the tree height estimations were accurate, especially when the tree was not too tall. The experimental results also showed that this method can potentially be used to accurately estimate the tree position and DBH. It is recommended that future research studies be carried out to test the device under complex forest conditions, such as in areas with more shrubs, better or poorer light, different tree species, and different-aged forests. Future studies should also focus on extracting other forest inventory attributes, such as stem curves and crown diameters.

Author Contributions: Methodology, Y.F. and Z.F.; Software, Y.F.; Data processing, A.M.; Compilation and formal analysis, A.M. and T.U.K.; Writing–original draft, Y.F.; Writing–review & editing, T.U.K. and C.S.; Proof reading, T.U.K. and S.S.

Funding: This work was supported by the National Natural Science Foundation of China (Grant number U1710123), the Medium-to-long-term project of young teachers’ scientific research in Beijing Forestry University (Grant number 2015ZCQ-LX-01), and the Beijing Municipal Natural Science Foundation (Grant number 6161001).

Acknowledgments: The authors wish to thank E.M., S.C., W.M., Y.T., and Y.L. for their help in collecting the data.

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Trumbore, S.; Brando, P.; Hartmann, H. Forest health and global change. Science 2015, 349, 814–818. [CrossRef] [PubMed]
2. FAO (Food and Agriculture Organization of the United Nations). Global Forest Resources Assessment 2010; Main report, FAO Forestry Paper; FAO: Rome, Italy, 2010; Volume 163.
3. Tubiello, F.N.; Salvatore, M.; Ferrara, A.F.; House, J.; Federici, S.; Rossi, S.; Biancalani, R.; Condor Golec, R.D.; Jacobs, H.; Flammini, A.; et al. The contribution of agriculture, forestry and other land use activities to global warming, 1990–2012. Glob. Chang. Biol. 2015, 21, 2655–2660. [CrossRef] [PubMed]
4. MacDicken, K.G. Global forest resources assessment 2015: What, why and how? For. Ecol. Manag. 2015, 352, 3–8. [CrossRef]
5. Reutebuch, S.E.; Andersen, H.-E.; McGaughey, R.J. Light detection and ranging (LIDAR): An emerging tool for multiple resource inventory. J. For. 2005, 103, 286–292.
6. Liang, X.; Kankare, V.; Hyyppä, J.; Wang, Y.; Kukko, A.; Haggrén, H.; Yu, X.; Kaartinen, H.; Jaakkola, A.; Guan, F.; et al. Terrestrial laser scanning in forest inventories. ISPRS J. Photogramm. Remote Sens. 2016, 115, 63–77. [CrossRef]
7. Cabo, C.; Ordóñez, C.; López-Sánchez, C.A.; Armesto, J. Automatic dendrometry: Tree detection, tree height and diameter estimation using terrestrial laser scanning. Int. J. Appl. Earth Obs. Geoinf. 2018, 69, 164–174. [CrossRef]
8. Barrett, F.; McRoberts, R.E.; Tomppo, E.; Cienciala, E.; Waser, L.T. A questionnaire-based review of the operational use of remotely sensed data by national forest inventories. Remote Sens. Environ. 2016, 174, 279–289. [CrossRef]
9. Suciu, G.; Ciuciuc, R.; Pasat, A.; Scheianu, A. Remote Sensing for Forest Environment Preservation. In Proceedings of the 2017 World Conference on Information Systems and Technologies, Madeira, Portugal, 11–13 April 2017; pp. 211–220.
10. Gougherty, A.V.; Keller, S.R.; Kruger, A.; Stylinski, C.D.; Elmore, A.J.; Fitzpatrick, M.C. Estimating tree phenology from high frequency tree movement data. Agric. For. Meteorol. 2018, 263, 217–224. [CrossRef]
11. Alcarria, R.; Bordel, B.; Manso, M.Á.; Iturrioz, T.; Pérez, M. Analyzing UAV-based remote sensing and WSN support for data fusion. In Proceedings of the 2018 International Conference on Information Technology & Systems, Libertad City, Ecuador, 10–12 January 2018; pp. 756–766.
12. Lim, K.; Treitz, P.; Wulder, M.; St-Onge, B.; Flood, M. LiDAR remote sensing of forest structure. Prog. Phys. Geogr. 2003, 27, 88–106. [CrossRef]
13. Liang, X.; Litkey, P.; Hyyppa, J.; Kaartinen, H.; Vastaranta, M.; Holopainen, M. Automatic stem mapping using single-scan terrestrial laser scanning. IEEE Trans. Geosci. Remote Sens. 2012, 50, 661–670. [CrossRef]
14. Béland, M.; Widlowski, J.-L.; Fournier, R.A.; Côté, J.-F.; Verstraete, M.M. Estimating leaf area distribution in savanna trees from terrestrial LiDAR measurements. Agric. For. Meteorol. 2011, 151, 1252–1266. [CrossRef]
15. Liang, X.; Hyyppä, J. Automatic stem mapping by merging several terrestrial laser scans at the feature and decision levels. Sensors 2013, 13, 1614–1634. [CrossRef] [PubMed]
16. Srinivasan, S.; Popescu, S.C.; Eriksson, M.; Sheridan, R.D.; Ku, N.-W. Terrestrial laser scanning as an effective tool to retrieve tree level height, crown width, and stem diameter. Remote Sens. 2015, 7, 1877–1896. [CrossRef]
17. Liang, X.; Hyyppä, J.; Kukko, A.; Kaartinen, H.; Jaakkola, A.; Yu, X. The use of a mobile laser scanning system for mapping large forest plots. IEEE Geosci. Remote Sens. Lett. 2014, 11, 1504–1508. [CrossRef]
18. Lin, Y.; Hyyppa, J. Multiecho-recording mobile laser scanning for enhancing individual tree crown reconstruction. IEEE Trans. Geosci. Remote Sens. 2012, 50, 4323–4332. [CrossRef]
19. Ryding, J.; Williams, E.; Smith, M.J.; Eichhorn, M.P. Assessing handheld mobile laser scanners for forest surveys. Remote Sens. 2015, 7, 1095–1111. [CrossRef]
20. Bauwens, S.; Bartholomeus, H.; Calders, K.; Lejeune, P. Forest inventory with terrestrial LiDAR: A comparison of static and hand-held mobile laser scanning. Forests 2016, 7, 127. [CrossRef]
21. Forsman, M.; Holmgren, J.; Olofsson, K. Tree stem diameter estimation from mobile laser scanning using line-wise intensity-based clustering. Forests 2016, 7, 206. [CrossRef]
22. Pierzchała, M.; Giguère, P.; Astrup, R. Mapping forests using an unmanned ground vehicle with 3D LiDAR and graph-SLAM. Comput. Electron. Agric. 2018, 145, 217–225. [CrossRef]
23. Durrant-Whyte, H.; Bailey, T. Simultaneous localization and mapping: Part I. IEEE Robot. Autom. Mag. 2006, 13, 99–110. [CrossRef]
24. Bailey, T.; Durrant-Whyte, H. Simultaneous localization and mapping (SLAM): Part II. IEEE Robot. Autom. Mag. 2006, 13, 108–117. [CrossRef]
25. Tang, J.; Chen, Y.; Kukko, A.; Kaartinen, H.; Jaakkola, A.; Khoramshahi, E.; Hakala, T.; Hyyppä, J.; Holopainen, M.; Hyyppä, H. SLAM-aided stem mapping for forest inventory with small-footprint mobile LiDAR. Forests 2015, 6, 4588–4606. [CrossRef]
26. Holmgren, J.; Tulldahl, H.; Nordlöf, J.; Nyström, M.; Olofsson, K.; Rydell, J.; Willén, E. Estimation of tree position and stem diameter using simultaneous localization and mapping with data from a backpack-mounted laser scanner. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2017, 42, 59–63. [CrossRef]
27. Kukko, A.; Kaijaluoto, R.; Kaartinen, H.; Lehtola, V.V.; Jaakkola, A.; Hyyppä, J. Graph SLAM correction for single scanner MLS forest data under boreal forest canopy. ISPRS J. Photogramm. Remote Sens. 2017, 132, 199–209. [CrossRef]
28. Foix, S.; Alenya, G.; Torras, C. Lock-in time-of-flight (ToF) cameras: A survey. IEEE Sens. J. 2011, 11, 1917–1926. [CrossRef]
29. Aijaz, M.; Sharma, A. Google Project Tango. In Proceedings of the 2016 International Conference on Advanced Computing, Moradabad, India, 22–23 January 2016.
30. Mur-Artal, R.; Tardós, J.D. ORB-SLAM2: An Open-Source SLAM System for Monocular, Stereo, and RGB-D Cameras. IEEE Trans. Robot. 2017, 33, 1255–1262. [CrossRef]
31. Lenovo Phab 2 Pro. Available online: http://www3.lenovo.com/us/en/virtual-reality-and-smart-devices/augmented-reality/-phab-2-pro/Lenovo-Phab-2-Pro/p/WMD00000220/ (accessed on 22 October 2018).
32. Hyyppä, J.; Virtanen, J.-P.; Jaakkola, A.; Yu, X.; Hyyppä, H.; Liang, X. Feasibility of Google Tango and Kinect for crowdsourcing forestry information. Forests 2017, 9, 6. [CrossRef]
33. Tomaštík, J.; Saloň, Š.; Tunák, D.; Chudý, F.; Kardoš, M. Tango in forests–An initial experience of the use of the new Google technology in connection with forest inventory tasks. Comput. Electron. Agric. 2017, 141, 109–117. [CrossRef]
34. Pueschel, P.; Newnham, G.; Rock, G.; Udelhoven, T.; Werner, W.; Hill, J. The influence of scan mode and circle fitting on tree stem detection, stem diameter and volume extraction from terrestrial laser scans. ISPRS J. Photogramm. Remote Sens. 2013, 77, 44–56. [CrossRef]
35. Olofsson, K.; Holmgren, J.; Olsson, H. Tree stem and height measurements using terrestrial laser scanning and the RANSAC algorithm. Remote Sens. 2014, 6, 4323–4344. [CrossRef]
36. Kalliovirta, J.; Laasasenaho, J.; Kangas, A. Evaluation of the laser-relascope. For. Ecol. Manag. 2005, 204, 181–194. [CrossRef]
37. Buerli, M.; Misslinger, S. Introducing ARKit-Augmented Reality for iOS. In Proceedings of the 2017 Apple Worldwide Developers Conference, San Jose, CA, USA, 5–9 June 2017; pp. 1–187.
38. ARCore. Available online: https://developers.google.com/ar/ (accessed on 22 October 2018).

© 2018 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).