
Article

Development of a Wide Area 3D Scanning System with a Rotating Line Laser

Jaeho Lee 1,†, Hyunsoo Shin 1,† and Sungon Lee 2,*

1 Department of Electrical and Electronic Engineering, Hanyang University, Ansan 15588, Korea; [email protected] (J.L.); [email protected] (H.S.)
2 School of Electrical Engineering, Hanyang University, Ansan 15588, Korea
* Correspondence: [email protected]; Tel.: +82-31-400-5174
† These authors contributed equally to this work.

Abstract: In a 3D scanning system using a camera and a line laser, it is critical to obtain the exact geometrical relationship between the camera and the laser for precise 3D reconstruction. With existing depth cameras, it is difficult to scan a large object or multiple objects in a wide area because only a limited area can be scanned at a time. We developed a 3D scanning system with a rotating line laser and a wide-angle camera for large-area reconstruction. To obtain 3D information of an object using a rotating line laser, we must know the plane of the line laser with respect to the camera coordinates at every rotation angle. This is done by estimating the rotation axis during calibration and then rotating the laser by a predefined angle. Therefore, accurate calibration is crucial for 3D reconstruction. In this study, we propose a calibration method to estimate the geometrical relationship between the rotation axis of the line laser and the camera. Using the proposed method, we can accurately estimate the center of the cone or cylinder shape generated while the line laser is rotating. A simulation study was conducted to evaluate the accuracy of the calibration. In the experiment, we compared the results of 3D reconstruction using our system and a commercial depth camera. The results show that the precision of our system is approximately 65% higher for plane reconstruction, and the scanning quality is also much better than that of the depth camera.

Keywords: 3D reconstruction; laser-camera calibration; triangulation

1. Introduction

Today, many application fields require technology for 3D reconstruction of real-world objects, such as the design and construction of augmented reality (AR), quality verification, the restoration of lost CAD data, heritage scanning, surface deformation tracking, 3D reconstruction for object recognition, and so on [1–7].

Among the non-contact scanning methods, the method using a camera and a line laser is widely used because the scanning speed is relatively fast and there is no damage to the object [8,9]. There are two methods for measuring the distance with a laser: the laser triangulation (LT) method and the time-of-flight (ToF) method. The LT method is slower, but it produces more precise results than the ToF method [10]. However, both methods have relatively small measurable zones; they scan a point or a line, at most a narrow region, at a time. A laser range finder (LRF) was developed to overcome this limitation. It extends the scan area by rotating the light source [11]. The scan area can be further increased by rotating the LRF. Recently, various methods for estimating the rotation axis of the LRF have been studied to obtain a 3D depth map of the environment [12–22].

In this study, an LT system with a rotating line laser and a wide-angle camera is proposed, as shown in Figure 1. We rotate only the line laser, not the entire system, to scan a wide area. Rotating the laser is easy because the line laser is light, leading to a simple system. We do not need to rotate the camera because it already has a wide field of view with a proper lens (e.g., a fish-eye lens).


With this setup, the first step is to precisely estimate this rotation axis with respect to the camera coordinates in order to measure the depth of an object using triangulation [23]. Owing to various reasons, such as fabrication errors of mechanical parts, misalignment of the line laser to the rotation axis, the optical properties of the line laser, and so on, the rotation axis of the line laser is somewhat different from the ideal CAD design. However, this axis must be precisely measured to obtain accurate 3D information from the LT method. In this paper, we propose a novel calibration between the camera coordinate system and the rotating line laser coordinate system using a cone model. Finding the axis of rotation with respect to the camera coordinate system is the key to accurate calibration. We noticed that a cone shape is always created while the plane of the line laser rotates about the rotation axis of the motor, due to the inevitable imperfect alignment between the line laser and the rotating axis. Therefore, we estimate the axis of rotation using the cone model because the central axis of the cone and the rotation axis of the motor are the same. The cone model enables us to estimate the rotation axis accurately, leading to improved calibration results. In summary, our contribution is to propose a cone model for an accurate calibration between the camera and the rotating line laser, resulting in an accurate and wide 3D scanning system.

Figure 1. Three dimensional (3D) reconstruction using our system: (a) setup of the system, (b) reconstruction results.

The rest of Section 1 defines the notation and coordinate system, and Section 2 introduces the schematic diagram of the system and the calibration process. In Section 3, the triangulation method through the line-plane intersection is described, and the calibration between the fixed line laser and the camera, and the calibration between the rotating line laser and the camera, are explained. In Section 4, the calibration accuracy according to the rotation axis of the line laser is evaluated, and the 3D reconstruction results of the RGB-D camera and this system are compared. Then, the accuracy of the 3D depth estimation is analyzed using a planar model. Finally, the conclusions are presented in Section 5.

2. Overall System

2.1. Hardware

Our scanning system has a line laser fixed to a motor and a camera with an infrared (IR) filter, as shown in Figure 2. The line laser used in our system has a 90° fan angle at 850 nm wavelength (FLEXPOINT MVnano, Laser Components, Bedford, NH, USA). We designed a mount that can fix and adjust the orientation of the line laser. This mount has two degrees of freedom, rotating about the pitch and roll axes. We will explain the effect of the roll angle with a simulation. Our brushless direct current (BLDC) motor with an incremental encoder has a resolution of approximately 0.08°. To obtain the absolute angle θ about the motor axis, we calculated the angle with the incremental encoder from the home position. We used a camera with a wide-angle lens and an 850 nm infrared bandpass filter to detect only IR light (FLEA3, FLIR Systems, Wilsonville, OR, USA). The camera with the lens has vertical and horizontal fields of view of approximately 57° and 44°, respectively. The camera resolution is 1280 by 1080 and its acquisition speed is 150 frames per second.





Figure 2. Scanning system with a line laser fixed to a motor and infrared camera.

2.2. Process of Scanning System

As shown in Figure 3, we executed three processes to acquire 3D scan data. Because our scanning system consists of three components, we determined the transformation relationships between the camera and the other devices via calibration methods.

Because every point of the 3D scan data was acquired from the image, and the area illuminated by the laser was narrow in the field of view of the sensor, we rotated the laser via the motor and recorded a camera image for every angle of the motor. By using the calibration data and the angle value of the motor, we calculated 3D points from each recorded image. The 3D points are stored in program memory, and we can visualize them or store them in the storage of our PC.

Figure 3. Flowchart of our scanning process.
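To make this acquisition flow concrete, a minimal sketch is given below. It is not the authors' software: the `motor` and `camera` objects and the `triangulate_image` helper are hypothetical placeholders, and the sweep range and step are illustrative values only.

```python
import numpy as np

def scan(motor, camera, calib, step_deg=0.08, sweep_deg=90.0):
    """Sweep the line laser and triangulate every recorded frame.

    `motor`, `camera`, and `triangulate_image` are hypothetical
    interfaces; `calib` holds the calibrated rotation axis and the
    reference laser plane in camera coordinates (Section 3).
    """
    points_3d = []
    for angle_deg in np.arange(0.0, sweep_deg, step_deg):
        motor.move_to(angle_deg)    # absolute angle from the home position
        image = camera.grab()       # IR image containing only the laser line
        # Rotate the calibrated laser plane by the encoder angle and
        # intersect each detected laser pixel ray with it.
        points_3d.append(triangulate_image(image, calib, np.deg2rad(angle_deg)))
    return np.vstack(points_3d)     # accumulated point cloud
```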

2.3. Camera Calibration

In digital imaging, the camera is modeled as a pinhole camera. The pinhole camera model projects any single point in 3D space into the image coordinates (r, c). It consists of a combination of intrinsic parameters (f_x, f_y, c_x, c_y, α) and extrinsic parameters, a rigid body transform to a world coordinate system.

The parameters estimated through the camera calibration process are used. Currently, most intrinsic and extrinsic parameters are estimated using a checkerboard printed on a flat plane [24]. If we know the intrinsic parameters of the camera, we can calculate where any point represented in the camera reference frame is projected in the image coordinate system. Furthermore, knowing more than four corresponding 3D-to-2D point pairs, with the camera's intrinsic parameters identified, we can estimate the rigid body transform of the board with respect to the camera coordinate system.

For the experiments, we used a checkerboard with an AR marker, ArUco (ChArUco), to estimate the rigid body transform, as shown in Figure 4. Its parameters were also used to identify the laser plane in the camera reference coordinate system.

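As an illustration of this ChArUco-based procedure, a minimal sketch using OpenCV's contrib ArUco module is shown below (API names as in OpenCV 4.2-4.6; later releases renamed some of these calls). The board geometry and the `image_files` list are assumptions, not the paper's actual values.

```python
import cv2
import numpy as np

# ChArUco board: the geometry values here are illustrative.
dictionary = cv2.aruco.getPredefinedDictionary(cv2.aruco.DICT_5X5_100)
board = cv2.aruco.CharucoBoard_create(5, 7, 0.04, 0.02, dictionary)

all_corners, all_ids, image_size = [], [], None
for fname in image_files:          # image_files: assumed list of image paths
    gray = cv2.imread(fname, cv2.IMREAD_GRAYSCALE)
    image_size = gray.shape[::-1]
    corners, ids, _ = cv2.aruco.detectMarkers(gray, dictionary)
    if ids is not None and len(ids) >= 4:
        # Interpolate chessboard corners from the detected ArUco markers.
        count, ch_corners, ch_ids = cv2.aruco.interpolateCornersCharuco(
            corners, ids, gray, board)
        if count >= 4:
            all_corners.append(ch_corners)
            all_ids.append(ch_ids)

# Intrinsic parameters (fx, fy, cx, cy and distortion) from all views.
_, K, dist, rvecs, tvecs = cv2.aruco.calibrateCameraCharuco(
    all_corners, all_ids, board, image_size, None, None)

# Extrinsic parameters: rigid body transform of the board (world frame)
# with respect to the camera, from one detection.
ok, rvec, tvec = cv2.aruco.estimatePoseCharucoBoard(
    all_corners[0], all_ids[0], board, K, dist, None, None)
R_wc, _ = cv2.Rodrigues(rvec)      # world-to-camera rotation
```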


Figure 4. (a) Intrinsic calibration and (b) extrinsic calibration.

3. Solving Extrinsic Calibration

Our scanning system estimates 3D depth information based on the triangulation of pixel points in the image coordinate system. Therefore, it is necessary to know the ray information in the camera coordinate system and the plane information of the line laser, such as the normal vectors, to calculate the depth. Our scanning system uses a rotating line laser and a fixed camera. The first step is to perform a calibration that calculates the geometrical relationship between the camera and the laser-rotating axis. We performed this calibration using a cone shape model because a cone is made from the multiple planes generated by the rotation of the line laser. In the following subsections, we explain this process in detail. First, the triangulation method is introduced, and then the process of obtaining the relationship between the camera and the plane generated by the line laser is explained. After that, the calibration to find the transformation between the rotation axis of the line laser and the camera is explicated. Finally, we obtain the depth map from the encoder value of the motor and the information between the line laser and the camera.

3.1. Triangulation

Triangulation refers to the process of calculating the depth value λ of a common point observed by two sensors, provided that a transformation matrix between both systems, such as the camera and the line laser, is known. In 3D space, a common point is created through line-line or line-plane intersections. The line l is the ray of the camera, and the plane P_L is the line laser plane. Our system corresponds to a line-plane intersection, as shown in Figure 5. The intersection between a plane and a line is given by

$$\lambda = \frac{n^T(q_p - q_L)}{n^T v}, \quad (1)$$

where n, q_p, q_L, and v are the normal vector of P_L, a point of P_L, a point of l, and a directional vector of l, respectively.

q_L becomes a zero vector when the ray passes through the origin of the camera coordinate system. Therefore, the depth λ can be calculated as

$$\lambda = \frac{n^T q_p}{n^T v}. \quad (2)$$
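A minimal numpy sketch of Equation (2) follows; the variable names mirror the symbols above, and the guard against a near-parallel ray is an added assumption.

```python
import numpy as np

def line_plane_depth(n, q_p, v):
    """Depth lambda of Equation (2): a ray from the camera origin with
    direction v, intersected with the laser plane (normal n, point q_p).
    All vectors are 3-element arrays in camera coordinates."""
    denom = n @ v
    if abs(denom) < 1e-9:           # ray (almost) parallel to the plane
        return None
    return (n @ q_p) / denom

# Example: plane z = 1 (n = [0,0,1], q_p = [0,0,1]) and the optical axis.
lam = line_plane_depth(np.array([0., 0., 1.]),
                       np.array([0., 0., 1.]),
                       np.array([0., 0., 1.]))  # -> 1.0; 3D point is lam * v
```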



Figure 5. Line-plane intersection.

3.2. Camera to Fixed Laser Calibration

To calculate the relationship between the fixed line laser and the camera, points p on the plane P_L acquired from multiple poses are required, as indicated in Figure 6. Because the transformation matrix ^C_W T from the camera (C) coordinates to the world (W) coordinates can be acquired through the AR marker, the depth λ of the laser points on the checkerboard can be estimated through triangulation. A point p on the estimated plane P_L can be represented by the following implicit equation:

$$n^T(p - q) = 0. \quad (3)$$

The elements p_1, p_2, ..., p_n of the point set {p} collected from n poses and the plane P_L were formulated with matrix multiplication as follows:

$$\begin{bmatrix} p_1^T & -1 \\ p_2^T & -1 \\ \vdots & \vdots \\ p_n^T & -1 \end{bmatrix} \begin{bmatrix} n \\ n^T q \end{bmatrix} = 0. \quad (4)$$

Because the set of points {p} acquired through the camera inevitably contains noise, this equation does not hold exactly. The solution can be found by the linear least-squares method. Therefore, the equation of the plane P_L generated by the line laser can be estimated with respect to the camera coordinates. In addition, we normalized the normal vector n of the plane to simplify the geometric analysis in the next section.

Figure 6. Calibration of laser plane with respect to camera coordinate.
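The least-squares fit of Equation (4) can be sketched as below, assuming the laser points have already been triangulated into camera coordinates; the homogeneous system is solved with the SVD, which is one standard choice rather than the authors' stated implementation.

```python
import numpy as np

def fit_laser_plane(points):
    """Least-squares plane fit of Equation (4).

    points: (N, 3) array of laser points p_i in camera coordinates.
    Returns a unit normal n and the offset d with n^T p = d."""
    A = np.hstack([points, -np.ones((len(points), 1))])  # rows [p_i^T, -1]
    # The homogeneous least-squares solution is the right singular
    # vector of A with the smallest singular value.
    _, _, vt = np.linalg.svd(A)
    n, d = vt[-1, :3], vt[-1, 3]
    scale = np.linalg.norm(n)        # normalize so that ||n|| = 1
    return n / scale, d / scale

# A point q on the plane (n^T q = d) can then be taken as q = d * n.
```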

3.3. Camera to Rotating Laser Calibration

When the line laser rotates along the rotation axis of the motor, the plane created by the line laser rotates along the rotation axis. This means that the equation of the plane P_L generated by the line laser in the camera coordinates also changes through the angle θ. The equation of P_L is related to the rotation axis ω and the translation vector ^C_L t of the line laser expressed in the camera coordinates. The angle θ can be obtained through the incremental encoder of the motor. If the line laser is fixed, the parameters (n, q) of the equation of the plane P_L are constant. However, in our system, the equation of the plane P_L depends on the parameters ^C_L t, ω, and θ because the line laser rotates along the rotation axis of the motor.

3.3.1. Find a Point on Rotating Axis

To estimate the translation vector ^C_L t, the various planes P_{Lθ} rotated around the angle θ should be calculated. The implicit equations of the plane set with m poses can be expressed as in Equation (5):

$$P_{L1}: n_1^T(p - q_1) = 0$$
$$P_{L2}: n_2^T(p - q_2) = 0$$
$$\vdots$$
$$P_{Lm}: n_m^T(p - q_m) = 0$$
$$P_{L1} \cap P_{L2} \cap \cdots \cap P_{Lm} = {}^{C}_{L}t. \quad (5)$$

If a plane rotates along an arbitrary axis in 3D space, there exists an intersection point through which all the planes pass, excluding a specific case (Figure 7a). We discuss this specific case with the simulation in Section 4.1. Simultaneously, the intersection point ^C_L t lies on the rotation axis of the motor. Because the parameters n_{1...m}, q_{1...m} of these planes P_{L1...m} contain noise, we obtained the position vector ^C_L t using the least-squares method, as represented by Equation (6):

$$Ax = \begin{bmatrix} n_1^T & -n_1^T q_1 \\ n_2^T & -n_2^T q_2 \\ \vdots & \vdots \\ n_m^T & -n_m^T q_m \end{bmatrix} \begin{bmatrix} p \\ 1 \end{bmatrix} \approx 0. \quad (6)$$

This position vector p corresponds to the translation ^C_L t from the camera coordinates to the laser coordinates. In addition, Equation (6) can be interpreted geometrically as minimizing the distance from a common point to all the planes. The shortest distance from a point to a plane is along the line perpendicular to the plane:

$$d = \frac{n^T(p - q)}{\|n\|_2} = n^T(p - q) \quad (\because \|n\|_2 = 1). \quad (7)$$
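A minimal sketch of solving Equation (6) follows, assuming the plane normals have already been normalized as described in Section 3.2.

```python
import numpy as np

def estimate_axis_point(normals, points):
    """Least-squares common point of the rotated laser planes, Equation (6).

    normals: (m, 3) unit normals n_i; points: (m, 3) plane points q_i.
    Returns the point p (the cone apex) minimizing the sum of squared
    point-to-plane distances of Equation (7); p lies on the rotation axis.
    """
    # Solve n_i^T p = n_i^T q_i for all planes in the least-squares sense.
    b = np.sum(normals * points, axis=1)      # right-hand sides n_i^T q_i
    p, *_ = np.linalg.lstsq(normals, b, rcond=None)
    return p
```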

Figure 7. (a) The laser plane with an angle to the rotating axis forms a cone. (b) The trace of the normal vectors of the rotating plane forms an inverted cone after their starting points are moved to a point on the rotating axis.

3.3.2. Find Rotating Orientation

Similar to estimating the translation of the rotation axis, we used {P_L} to estimate the direction of ω. The parameters constituting each plane equation P_L are (n, q): n is a normal vector of the plane P_L, and q is a point on the plane. Because n is a free vector, the starting point of the vector can be freely moved. If the starting points of the normal vectors are moved to a point on the rotating axis, the moved normal vectors form an inverted cone shape, as shown in Figure 7b. One end of each normal vector points toward the origin, and the other end points to the base of a cone in 3D space. The shape of the cone is deformed according to the position of the axis of rotation and the position of the laser, but the central axis of the cone and the rotation axis of the motor are always the same.

The central axis of the cone is always parallel to the normal vector of the base of the cone. Finding the normal vector of the base of the cone can therefore be regarded as the same task as finding the direction vector of the motor's rotation axis. Let us denote this base plane of the inverted cone as P_X with n_X^T(p − q_X) = 0. The process of finding the rotation axis direction from the collected planes {P_L} is as follows: we generate a matrix consisting of the normal vectors n of the collected planes (P_{L1}, P_{L2}, ..., P_{Lm}). Each row of this matrix refers to a point on the base (P_X) of the cone, so it satisfies the plane equation. Because we have m normal vectors, this can be expressed as follows:

$$P_{L1}: n_X^T(n_1 - q_X) = 0$$
$$P_{L2}: n_X^T(n_2 - q_X) = 0$$
$$\vdots$$
$$P_{Lm}: n_X^T(n_m - q_X) = 0.$$

In the following matrix form, it becomes:

$$\begin{bmatrix} n_1^T & -1 \\ n_2^T & -1 \\ \vdots & \vdots \\ n_m^T & -1 \end{bmatrix} \begin{bmatrix} n_X \\ n_X^T q_X \end{bmatrix} = 0. \quad (8)$$

Because noise exists in the data, the expression is modified into an approximate form, and an optimal solution is obtained from Ax ≈ 0 with the constraint ‖n_X‖ = 1.
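This constrained problem is conveniently solved with the SVD; the sketch below assumes unit plane normals and returns only the direction n_X.

```python
import numpy as np

def estimate_axis_direction(normals):
    """Direction of the rotation axis from Equation (8).

    normals: (m, 3) unit normals n_i of the rotated laser planes. Their
    tips lie on the base plane P_X of the inverted cone, so we fit that
    plane; its unit normal n_X is the axis direction."""
    A = np.hstack([normals, -np.ones((len(normals), 1))])  # rows [n_i^T, -1]
    _, _, vt = np.linalg.svd(A)          # minimize ||A x|| with ||x|| = 1
    x = vt[-1]
    n_x = x[:3] / np.linalg.norm(x[:3])  # enforce the constraint ||n_X|| = 1
    return n_x
```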

3.4. Rotation of Line Laser Plane around an Axis

The plane created when the line laser rotates can also be determined through the transformation matrix between the camera and the rotation axis. The rotated line laser plane about the axis should be calculated because it is difficult to calibrate between the camera and the line laser plane whenever the motor rotates by θ. As shown in Figure 8, plane P_2 is a new plane created when plane P_1 rotates about an arbitrary axis of rotation. A plane P is expressed by a normal vector n and a point q on the plane. Therefore, both the normal vector n_1 and the point q_1 on plane P_1 need to be rotated about the Z-axis of the laser coordinate (i.e., the rotation axis of the motor). A vector in R^3 can be rotated by Rodrigues' rotation formula. By applying the formula to q_1 and n_1, we find the plane rotated by θ as follows:

$$q_2 = R_\omega(\theta)(q_1 - t) + t, \qquad n_2 = R_\omega(\theta)\, n_1,$$

where

$$R_\omega(\theta) = I + (\sin\theta)\,\Omega + (1 - \cos\theta)\,\Omega^2 \quad \text{and} \quad \Omega = \begin{bmatrix} 0 & -\omega_z & \omega_y \\ \omega_z & 0 & -\omega_x \\ -\omega_y & \omega_x & 0 \end{bmatrix}.$$

Figure 8. The shape and trajectory of the plane rotating by θ along an arbitrary axis (red).
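A direct transcription of these formulas into numpy might look as follows; `omega` is assumed to be a unit vector, and `t` is the axis point estimated in Section 3.3.1.

```python
import numpy as np

def rotate_plane(n1, q1, omega, t, theta):
    """Rotate the plane (n1, q1) by theta about the axis through point t
    with unit direction omega, using Rodrigues' rotation formula."""
    wx, wy, wz = omega
    Omega = np.array([[0., -wz,  wy],
                      [ wz,  0., -wx],
                      [-wy,  wx,  0.]])
    R = np.eye(3) + np.sin(theta) * Omega + (1 - np.cos(theta)) * (Omega @ Omega)
    q2 = R @ (q1 - t) + t    # a point is rotated about the axis through t
    n2 = R @ n1              # a free vector is only rotated
    return n2, q2
```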

4. Results

To evaluate the calibration described in the previous section, we performed simulations and experiments. First, we evaluated the calibration accuracy in the simulation with respect to the change in α, the angle between the rotation axis of the line laser and the line laser plane. Then, we present two experimental results in a real environment. For comparison, we used a commercial depth camera (RealSense SR305, Intel, Santa Clara, CA, USA), which has a relatively high quality in the short range. In the first experiment, a plane checkerboard was scanned and then we measured how far the set of points was from that plane. In the second experiment, we show the scan results for various objects using the two scan systems.

4.1. Simulation

To obtain a depth map for the camera view, we rotated the line laser about a rotation axis. As shown in Figure 9, α determines the 3D shape that is tangential to the collected laser planes. This tangential shape is either a cone or a cylinder if α is less than 90°.

Figure 9. Simulation conditions: (a) angle α between the axis of rotation and the normal vector of the plane generated by the line laser and (b) the change of the shape as the angle α increases.

As the value of α approaches 0°, the slope of the cone becomes steep. Therefore, if the base area of the cone is constant, the intersection point moves farther away as the value of α approaches 0°. If the value of α is 0°, the slope of the cone becomes +∞. This means that the intersection point cannot be determined. For this reason, a calibration method for models with very high slope values, such as cylinders, was prepared separately.

If the angle α becomes 0°, the line laser rotating about the axis of rotation forms a cylindrical shape instead of a cone shape. If a cylindrical shape is formed, as shown in Figure 10b, the following method is used. Unlike in the cone case, the normal vectors of planes tangent to a cylinder lie in a 2D space perpendicular to the axis of rotation. We find the intersection lines between the collected planes and the plane of the normal vectors. Principal component analysis (PCA) was used to estimate the plane of the normal vectors. The center of the circle was estimated using these intersection lines. The distance between the center of the circle and the angle bisector of any two outer tangents is zero. Because of noise, we found the point with the minimal distance to the bisectors over multiple intersections. This point was considered the coordinate through which the central axis of the cylinder passes.

Figure 10. Different shapes of the set of planes: (a) cone shape, (b) cylinder shape.

To correctly project in 2D using PCA, the normal vectors of the tangent planes must lie in a 2D space. That is, the method works correctly only for the special case where α = 0°. Nevertheless, it is also applicable otherwise (α ≠ 0°), but the projection then introduces an error in estimating the rotation axis. For this reason, the estimation error increases as the value of α increases, as shown in Figure 11.
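For the cylinder case, the axis-direction part of this procedure can be sketched with a PCA as below; the subsequent circle-center estimation from the tangent bisectors is omitted here, so this is only a partial sketch under stated assumptions.

```python
import numpy as np

def cylinder_axis_direction(normals):
    """Cylinder-case axis direction via PCA (minimal sketch).

    When alpha = 0, the unit normals of the tangent planes lie (up to
    noise) in the 2D plane perpendicular to the rotation axis, so the
    axis direction is the principal component of the normals with the
    smallest variance."""
    centered = normals - normals.mean(axis=0)
    # Right singular vectors of the centered data = principal components,
    # ordered by decreasing variance; the last one spans the axis.
    _, _, vt = np.linalg.svd(centered)
    return vt[-1]    # direction of smallest variance ~ rotation axis
```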





Although the points on the plane generated by the line laser in camera coordinates were estimated through the detection of the checkerboard and the line laser in the real experiment, we assumed that the points were given values in the simulation. In addition, we added Gaussian noise with a standard deviation of 0.1 mm to emulate the noise of the real environment.

Because the ground truth is known in the simulation, unlike in actual experiments, an accurate calibration error can be obtained. To verify the accuracy of the calibration according to α, the distance e_p1 between the estimated point (cone apex) and the true rotation axis of the line laser was calculated using different angles α. Simulation results show that if the angle α is more than 0.47°, the cone-shape-based method is better than the cylinder-based method (Figure 11). This means that the cylinder model is fine if we can adjust the angle between the rotation axis and the laser plane precisely; however, if there is an error of more than 1°, our cone model leads to better results. Furthermore, the results also show that the lowest estimation error occurs at an angle α = 31.82° under our noise model. For this reason, we adjusted the angle α of our scanning system in the real experiments to be approximately 30°.
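The simulation protocol can be condensed into the sketch below, which reuses the `rotate_plane` and `estimate_axis_point` sketches from Section 3. The axis location, sweep, and pose count are illustrative assumptions; only the 0.1 mm noise level is taken from the text.

```python
import numpy as np

rng = np.random.default_rng(0)
alpha = np.deg2rad(30.0)                     # misalignment angle under test
axis = np.array([0.0, 0.0, 1.0])             # true rotation axis direction
t_true = np.array([0.015, 0.0, 0.0])         # true point on the axis (m), illustrative

normals, points = [], []
for theta in np.deg2rad(np.arange(0.0, 90.0, 5.0)):  # 18 simulated poses
    # Reference plane through t_true, tilted by alpha against the axis;
    # rotating it about the axis sweeps out a cone with apex t_true.
    n0 = np.array([np.cos(alpha), 0.0, np.sin(alpha)])
    n_t, q_t = rotate_plane(n0, t_true, axis, t_true, theta)
    normals.append(n_t)
    points.append(q_t + rng.normal(0.0, 1e-4, 3))    # 0.1 mm Gaussian noise

apex = estimate_axis_point(np.array(normals), np.array(points))
# e_p1: distance from the estimated apex to the true rotation axis line.
e_p1 = np.linalg.norm(np.cross(axis, apex - t_true))
```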




Figure 11. Results of the simulation according to the angle α.

4.2. Real Experiments

In this experiment, we compared our system with the SR305 camera, which has a relatively high precision compared to other RGB-D cameras in terms of the performance of the depth map. Two experiments were performed. During the experiments, we fixed the two scanning systems together on the plate. First, we evaluated the planarity of the plane, which was estimated for the checkerboard after calibration between the line laser and the camera. We also compared the quality of the results of scanning various objects, including the environments.

4.2.1. Planarity of the Plane

Because of the flatness condition, the checkerboard is suitable for evaluating the performance of depth estimation. Moreover, we can easily define the reference coordinates with respect to the camera coordinates using the pattern of the checkerboard. Because the origin of the scanning system is also the camera origin, it is easy to calculate the planarity error. Therefore, we estimated the depth map of the checkerboard plane using the two scanning systems. We removed the outlier points from the result using a point cloud tool, such as MeshLab, because it is not easy to have only the checkerboard in the field of view. To evaluate the accuracy of the planarity, each camera estimated the world coordinates using its intrinsic parameters. We calculated the distance e_p2 between the reference frame and the estimated points on the checkerboard plane using Equation (9). The parameters n, q of the plane can be obtained using the perspective-n-point algorithm, and the point vectors p are obtained by depth estimation using the scanning systems.

$$e_{p2} = \frac{|n^T(p - q)|}{\|n\|_2} \quad (9)$$

Meanwhile, to verify the effect of the angle α between the rotation axis and the line laser as assessed in the simulation, we experimented with α_ill and α_well, as shown in Figure 12a,c. As shown in Figure 12b, the line laser was moved 15 mm from the axis of rotation. The reason is to easily check the difference in accuracy according to α, and to confirm the robustness of the calibration estimation against design errors.
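The planarity metric of Equation (9) reduces to a few lines of numpy; the sketch below assumes the reference plane (n, q) has already been obtained from the perspective-n-point pose of the board.

```python
import numpy as np

def planarity_error(points, n, q):
    """Per-point distance e_p2 of Equation (9) between reconstructed
    checkerboard points and the reference plane (n, q) obtained from
    the perspective-n-point pose of the board."""
    n = n / np.linalg.norm(n)
    d = np.abs((points - q) @ n)     # |n^T (p - q)| / ||n||_2
    return d.mean(), d.std()         # average and standard deviation
```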
As shown in Figure 13, the results of the 3D reconstruction of the checkerboard are indicated for each method. The average and standard deviation of the distances e_p2 between the points and the plane are shown in Figure 14. We also compared the performance of the 3D reconstruction between our system and the SR305 in Table 1. Since our system can detect relatively large distances, the scan volume of our system is far bigger than that of the SR305.

Figure 12. Experiment setup: (a) ill-posed condition (α_ill), (b) displacement part, and (c) well-posed condition (α_well).

Table 1. Comparison between our system and the commercial sensor (SR305).

                               SR305                    Our System
Estimated accuracy [mm]        13.8 ± 3.8               4.8 ± 2.2
Working distance [m]           0.2 ~ 1.5                0.5 ~ 3.0
Pyramid volume [m³]            1.8838                   7.1297
Relative point cloud density   ×1                       ×2.786
Minimal components             IR laser projector,      IR line laser,
                               IR camera                IR camera,
                                                        motor with assembled encoder



Figure 13. Results of 3D reconstruction: (a) our system at α_ill, (b) our system at α_well, and (c) SR305.

Figure 14. Distance errors from the checkerboard plane to every point.

4.2.2. Scene Reconstruction

Unlike the previous quantitative assessments, we verified additional qualitative results regarding the reconstruction of various shapes of objects using the two scanner systems. As shown in Figure 15, we scanned multiple hand-held items, such as a tennis ball and a teddy bear. Among the scanned objects, the result for the orange cube with sharp edges was significantly different between the two sensors, as depicted in Figure 16. As depicted in Figure 17a, a human plaster with complicated features is appropriate for a qualitative comparison. The size of the plaster was similar to that of a human head. Comparing Figure 17b,c, the cortex areas of the results scanned by the two scanners are different. Unlike the SR305, our method measured the curvature components of the object relatively well.

Figure 15. Three dimensional reconstruction of multiple objects with background.

Figure 16. Comparison of the results of scanning a cube object using the two devices.




FigureFigure 15. 15. Three Three dimensional dimensional dimensional reconstruction reconstruction reconstruction of of multiple ofmultiple multiple objects objects objects with with with background background background.. .

FigureFigure 16. 16. Comparison Comparison of of of the the the result result result that that that was was was scanned scanned scanned a a cube cube a cube object object object using using using two two devices twodevices devices.. . As depicted in Figure 17a, a human plaster with complicated features is appropriate for a qualitative comparison. The size of the plaster was similar to that of the human head. Compared with Figure 17b,c, the cortex areas of the results that were scanned by the two Sensors 2021, 21, x FOR PEER REVIEWscanners were different. Unlike SR305, our method relatively measured the13 curvature of 14

components of the object.

Figure 17. Comparison of the 3D reconstruction of a human plaster: (a) original, (b) our system and (c) SR305.

5. Conclusions

We developed a 3D scanning system for a broad region using a rotating line laser and a camera. To reconstruct an object, the transformation between the camera coordinates and the plane of the line laser must be acquired; in particular, the plane of the line laser with respect to the camera coordinates must be determined whenever the plane is rotated. In this study, we proposed a novel calibration method to determine the relationship between the rotation axis of the line laser and the camera. The center of the cone or cylinder shape generated while the line laser rotates was calculated. In the simulation study, we evaluated the accuracy of the proposed calibration using two models. The simulation shows that the proposed cone model is superior to the simple cylinder model when the angle alignment error is greater than approximately 0.5°. This means that, in most cases, the proposed cone model yields better results, because the misalignment between the laser plane and the rotation axis is typically larger than this value. Finally, we evaluated the 3D reconstruction performance of our system through real experiments and compared the results with those of a commercial RGB-D camera. The quality of the restoration using our system was far superior to that of the RGB-D camera, and our system can acquire the depth map of an object at a greater distance than the RGB-D camera. Our future work will include multiple-line-laser calibration to improve scanning speed, and 3D object recognition using the improved depth information.
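To make the reconstruction principle summarized above concrete, the sketch below rotates a calibrated laser plane about the estimated rotation axis by a commanded angle and then triangulates a 3D point by intersecting a back-projected camera ray with that plane. This is a minimal illustration under assumed calibration outputs (a reference plane n0 · x = d0, an axis point p0 and an axis direction); all names are illustrative and do not reproduce the authors' implementation.

```python
import numpy as np

def rodrigues(axis: np.ndarray, theta: float) -> np.ndarray:
    """Rotation matrix for an angle theta [rad] about a unit axis."""
    k = axis / np.linalg.norm(axis)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def laser_plane_at(theta, n0, d0, p0, axis):
    """Rotate the reference laser plane n0 . x = d0 by theta about the
    calibrated axis passing through p0; return (n, d) of the new plane."""
    R = rodrigues(axis, theta)
    n = R @ n0
    x0 = n0 * d0                  # a point on the reference plane
    x = p0 + R @ (x0 - p0)        # that point after rotating about the axis
    return n, float(n @ x)

def triangulate(ray_dir, n, d):
    """Intersect a camera ray from the origin (direction K^-1 [u, v, 1]
    for pixel (u, v)) with the laser plane n . x = d."""
    denom = n @ ray_dir
    if abs(denom) < 1e-9:         # ray nearly parallel to the laser plane
        raise ValueError("no well-defined intersection")
    return (d / denom) * ray_dir

# Example: the laser plane x = 0.3 m, rotated by 10 degrees about a
# vertical axis through it, intersected with one back-projected pixel ray.
n, d = laser_plane_at(np.deg2rad(10.0),
                      n0=np.array([1.0, 0.0, 0.0]), d0=0.3,
                      p0=np.array([0.3, 0.0, 0.0]),
                      axis=np.array([0.0, 1.0, 0.0]))
print(triangulate(np.array([0.5, 0.05, 1.0]), n, d))
```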

Author Contributions: Conceptualization, J.L. and S.L.; methodology, J.L.; software, J.L.; validation, J.L. and H.S.; investigation, H.S. and J.L.; writing—original draft preparation, H.S. and J.L.; writing—review and editing, S.L.; supervision, S.L.; project administration, S.L. All authors have read and agreed to the published version of the manuscript.

Funding: This research was supported in part by the Technology Innovation Program (20001856) funded by the Ministry of Trade, Industry & Energy, and in part by the grant (21NSIP-B135746-05) from the National Spatial Information Research Program (NSIP) funded by the Ministry of Land, Infrastructure and Transport of the Korean government.

Institutional Review Board Statement: Not applicable.

Informed Consent Statement: Not applicable.

Conflicts of Interest: The authors declare no conflict of interest.

