Development of a Wide Area 3D Scanning System with a Rotating Line Laser
Jaeho Lee 1,†, Hyunsoo Shin 1,† and Sungon Lee 2,*
1 Department of Electrical and Electronic Engineering, Hanyang University, Ansan 15588, Korea; [email protected] (J.L.); [email protected] (H.S.)
2 School of Electrical Engineering, Hanyang University, Ansan 15588, Korea
* Correspondence: [email protected]; Tel.: +82-31-400-5174
† These authors contributed equally to this work.
Abstract: In a 3D scanning system using a camera and a line laser, it is critical to obtain the exact geometrical relationship between the camera and the laser for precise 3D reconstruction. With existing depth cameras, it is difficult to scan a large object or multiple objects in a wide area because only a limited area can be scanned at a time. We developed a 3D scanning system with a rotating line laser and a wide-angle camera for large-area reconstruction. To obtain 3D information of an object using a rotating line laser, we must know the plane of the line laser with respect to the camera coordinates at every rotation angle. This is done by estimating the rotation axis during calibration and then rotating the laser by a predefined angle. Therefore, accurate calibration is crucial for 3D reconstruction. In this study, we propose a calibration method to estimate the geometrical relationship between the rotation axis of the line laser and the camera. Using the proposed method, we could accurately estimate the center of the cone or cylinder shape generated while the line laser rotates. A simulation study was conducted to evaluate the accuracy of the calibration. In the experiment, we compared the results of 3D reconstruction using our system and a commercial depth camera. The results show that the precision of our system is approximately 65% higher for plane reconstruction, and the scanning quality is also much better than that of the depth camera.

Keywords: 3D reconstruction; laser-camera calibration; triangulation

Citation: Lee, J.; Shin, H.; Lee, S. Development of a Wide Area 3D Scanning System with a Rotating Line Laser. Sensors 2021, 21, 3885. https://doi.org/10.3390/s21113885
Academic Editor: Bijan Shirinzadeh
Received: 6 May 2021; Accepted: 1 June 2021; Published: 4 June 2021

1. Introduction

Today, many application fields require technology for 3D modeling of real-world objects, such as the design and construction of augmented reality (AR), quality verification, the restoration of lost CAD data, heritage scanning, surface deformation tracking, 3D reconstruction for object recognition, and so on [1–7]. Among the non-contact scanning methods, the method using a camera and a line laser is widely used because the scanning speed is relatively fast and there is no damage to the object [8,9]. There are two methods for measuring distance with a laser: the laser triangulation (LT) method and the time-of-flight (ToF) method. The LT method is slower but more precise than the ToF method [10]. However, both methods have relatively small measurable zones: they scan a point, a line, or at most a narrow region at a time. A laser range finder (LRF) was developed to overcome this limitation. It extends the scan area by rotating the light source [11]. The scan area can be further increased by rotating the LRF itself. Recently, various methods for estimating the rotation axis of the LRF have been studied to obtain a 3D depth map of the environment [12–22].

In this study, an LT system with a rotating line laser and a wide-angle camera is proposed, as shown in Figure 1. We rotate only the line laser, not the entire system, to scan a wide area. Rotating the laser is easy because the line laser is light, leading to a simple system. We do not need to rotate the camera because it already has a wide field of view with a proper lens (e.g., a fish-eye lens).
With this setup, the first step is to precisely estimate this rotation axis with respect to the camera coordinates in order to measure the depth of an object using triangulation [23]. Owing to various reasons, such as fabrication errors of mechanical parts, misalignment of the line laser to the rotation axis, the line laser's optical properties, and so on, the rotation axis of the line laser is somewhat different from the ideal CAD design. Therefore, this axis must be precisely measured to obtain accurate 3D information from the LT method. In this paper, we propose a novel calibration between the camera coordinate system and the rotating line laser coordinate system using a cone model. Finding the axis of rotation with respect to the camera coordinate system is the key to accurate calibration. We noticed that a cone shape is always created while the plane of the line laser rotates about the rotation axis of the motor, due to inevitable imperfect alignment between the line laser and the rotating axis. Therefore, we estimate the axis of rotation using the cone model, because the central axis of the cone and the rotation axis of the motor are the same. The cone model enables us to estimate the rotation axis accurately, leading to improved calibration results. In summary, our contribution is a cone model for accurate calibration between the camera and the rotating line laser, resulting in an accurate and wide-area 3D scanning system.
Figure 1. Three-dimensional (3D) reconstruction using our system: (a) setup of the system; (b) reconstruction results.
The rest of Section 1 defines the notation and coordinate system, and Section 2 introduces the schematic diagram of the system and the calibration process. In Section 3, the triangulation method through the line-plane intersection is described, and the calibration between the fixed line laser and camera, and the calibration between the rotating line laser and camera, are explained. In Section 4, the calibration accuracy according to the rotation axis of the line laser is evaluated, and the 3D reconstruction results of the RGB-D camera and this system are compared. Then, the accuracy of the 3D depth estimation is analyzed using a planar model. Finally, the conclusions are presented in Section 5.
2. Overall System

2.1. Hardware

Our scanning system has a line laser fixed to a motor and a camera with an infrared (IR) filter, as shown in Figure 2. The line laser used in our system has a 90° fan angle at 850 nm wavelength (FLEXPOINT MVnano, Laser Components, Bedford, NH, USA). We designed a mount that can fix and adjust the orientation of the line laser. This mount has two degrees of freedom, rotating about the pitch and roll axes. We will explain the effect of the roll angle with a simulation. Our brushless direct current (BLDC) motor with an incremental encoder has a resolution of approximately 0.08°. To obtain the absolute angle θ about the motor axis, we calculated the angle with the incremental encoder from the home position. We used a camera with a wide-angle lens and an 850 nm infrared bandpass filter to detect only IR light (FLEA3, FLIR Systems, Wilsonville, OR, USA). The camera with the lens has vertical and horizontal fields of view of approximately 57° and 44°, respectively. The camera resolution is 1280 by 1080 and its acquisition speed is 150 frames per second.
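As a concrete illustration of the encoder-to-angle conversion mentioned above, the sketch below assumes a hypothetical 4500 counts per revolution, which reproduces the stated ~0.08° resolution (360/4500 = 0.08); the actual encoder constant is not given in the paper.

```python
# Hypothetical encoder constant: 4500 counts/rev gives 360/4500 = 0.08 deg
# per count, matching the stated resolution. Not the authors' actual value.
COUNTS_PER_REV = 4500

def motor_angle_deg(count, home_count=0):
    """Absolute motor angle theta (degrees) from incremental encoder counts,
    measured from the home position and wrapped to one revolution."""
    return ((count - home_count) % COUNTS_PER_REV) * (360.0 / COUNTS_PER_REV)
```

One count then corresponds to the quoted 0.08° step, and a quarter revolution to 1125 counts.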
Figure 2. Scanning system with a line laser fixed to a motor and infrared camera.
2.2. Process of Scanning System

As shown in Figure 3, we executed three processes to acquire 3D scan data. Because our scanning system consists of three components, we determined the transformation relationships between the camera and the other devices via calibration methods.

Because every point of the 3D scan data is acquired from the image, and the area illuminated by the laser is narrow in the field of view of the sensor, we rotated the laser via the motor and recorded a camera image for every angle of the motor. Using the calibration data and the angle value of the motor, we calculated 3D points from each recorded image. The 3D points are stored in program memory, and we can visualize them or store them on our PC.
Figure 3. Flowchart of our scanning process.
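The capture loop of Figure 3 can be sketched as follows. This is an illustrative outline, not the authors' code: `plane_at`, `capture_image`, `laser_pixels`, and `pixel_ray` are hypothetical callbacks standing in for the calibration lookup, the camera driver, the laser-pixel detector, and pixel back-projection.

```python
import numpy as np

def scan(angles, plane_at, capture_image, laser_pixels, pixel_ray):
    """For each motor angle: grab a frame, detect laser pixels, and
    triangulate each pixel ray against the laser plane to collect 3D points."""
    cloud = []
    for theta in angles:
        n, q = plane_at(theta)            # laser plane (normal, point) at theta
        image = capture_image(theta)
        for (r, c) in laser_pixels(image):
            v = pixel_ray(r, c)           # camera ray through pixel (r, c)
            lam = (n @ q) / (n @ v)       # depth by line-plane intersection
            cloud.append(lam * v)
    return np.asarray(cloud)
```

With a single simulated plane and pixel, the loop reproduces the expected depth along the ray.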
2.3. Camera Calibration

In digital imaging, the camera is modeled as a pinhole camera. The pinhole camera model projects any single point in 3D space into image coordinates (r, c). It consists of a combination of intrinsic parameters (f_x, f_y, c_x, c_y, α) and extrinsic parameters, a rigid body transform to a world coordinate system.

The parameters estimated through the camera calibration process are used. Currently, most intrinsic and extrinsic parameters are estimated using a checkerboard printed on a flat plane [24]. If we know the intrinsic parameters of the camera, we can calculate where any point represented in the camera reference frame is projected in the image coordinate system. Furthermore, knowing more than four 3D-to-2D point correspondences, with the camera's intrinsic parameters identified, we can estimate the rigid body transform of the board with respect to the camera coordinate system.

For the experiments, we used a checkerboard with an AR marker, ArUco (ChArUco), to estimate the rigid body transform, as shown in Figure 4. Its parameters were also used to identify the laser plane in the camera reference coordinate system.
Figure 4. (a) Intrinsic calibration and (b) extrinsic calibration.

3. Solving Extrinsic Calibration

Our scanning system estimates 3D depth information based on the triangulation of pixel points in the image coordinate system. Therefore, it is necessary to know the ray information in the camera coordinate system and the plane information of the line laser, such as the normal vectors, to calculate the depth. Our scanning system uses a rotating line laser and a fixed camera. The first step is to perform a calibration that calculates the geometrical relationship between the camera and the laser-rotating axis. We performed this calibration using a cone shape model, because a cone is formed by the multiple planes generated by the rotation of the line laser. In the following subsections, we explain this process in detail. First, the triangulation method is introduced, and then the process of obtaining the relationship between the camera and the plane generated by the line laser is explained. After that, the calibration to find the transformation between the rotation axis of the line laser and the camera is explicated. Finally, we obtain the depth map from the encoder value of the motor and the geometrical information between the line laser and the camera.
3.1. Triangulation

Triangulation refers to the process of calculating the depth value λ of a common point observed by two sensors, provided that a transformation matrix between both systems, such as the camera and the line laser, is known. In 3D space, a common point is created through line-line or line-plane intersections. The line l is the ray of the camera, and the plane P_L is the line laser plane. Our system corresponds to a line-plane intersection, as shown in Figure 5. The intersection between a plane and a line is given by

λ = n^T (q_p − q_L) / (n^T v),  (1)

where n, q_p, q_L and v are the normal vector of P_L, a point of P_L, a point of l and a directional vector of l, respectively.

q_L becomes a zero vector when the line passes through the origin of the camera coordinate system. Therefore, the depth λ can be calculated as

λ = n^T q_p / (n^T v).  (2)
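The line-plane intersection of Equations (1) and (2) translates directly into code. A minimal numpy sketch (not the authors' implementation):

```python
import numpy as np

def depth_from_line_plane(n, q_p, v, q_l=None):
    """Depth lambda of the intersection of the ray q_l + lambda*v with the
    plane through q_p with normal n: lambda = n^T(q_p - q_l) / (n^T v).
    With q_l omitted, the ray starts at the camera origin (Eq. (2))."""
    if q_l is None:
        q_l = np.zeros(3)
    denom = float(n @ v)
    if abs(denom) < 1e-12:
        raise ValueError("ray is (nearly) parallel to the laser plane")
    return float(n @ (q_p - q_l)) / denom

# plane z = 2 (normal e_z, point (0, 0, 2)); camera ray along the z axis
lam = depth_from_line_plane(np.array([0.0, 0.0, 1.0]),
                            np.array([0.0, 0.0, 2.0]),
                            np.array([0.0, 0.0, 1.0]))
# lam = 2.0
```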
Figure 5. Line-plane intersection.

3.2. Camera to Fixed Laser Calibration

To calculate the relationship between the fixed line laser and the camera, points p on the plane P_L acquired from multiple poses are required, as indicated in Figure 6. Because the transformation matrix ^C_W T from the camera (C) coordinate to the world (W) coordinate can be acquired through the AR marker, the depth λ of the laser points on the checkerboard can be estimated through triangulation. A point p on the estimated plane P_L can be represented by the following implicit equation:

n^T (p − q) = 0.  (3)

The elements p_1, p_2, ..., p_n of the point set {p} collected from n poses and the plane P_L were formulated with matrix multiplication as follows (rows stacked over the n points):

[p_1^T, −1; p_2^T, −1; ⋯ ; p_n^T, −1] [n; n^T q] = 0.  (4)

Because the set of points {p} acquired through the camera inevitably contains noise, this equation is not exactly equal to zero. The solution can be found by the linear least-squares method. Therefore, the equation of the plane P_L generated by the line laser can be estimated with respect to the camera coordinates. In addition, we normalized the normal vector n of the plane to simplify the geometric analysis in the next section.
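Equation (4) is a homogeneous least-squares problem: its minimizer is the right singular vector associated with the smallest singular value of the stacked matrix. A numpy sketch under that interpretation:

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane n^T p - d = 0 from noisy 3D points (Eq. (4)).
    Returns the unit normal n and the offset d = n^T q."""
    A = np.hstack([points, -np.ones((points.shape[0], 1))])
    # best homogeneous solution: right singular vector of smallest sigma
    _, _, Vt = np.linalg.svd(A)
    sol = Vt[-1]
    n, d = sol[:3], sol[3]
    scale = np.linalg.norm(n)         # normalize so that ||n|| = 1
    return n / scale, d / scale

# synthetic laser stripe points on the plane z = 0.5
rng = np.random.default_rng(0)
pts = rng.uniform(-1.0, 1.0, (100, 3))
pts[:, 2] = 0.5
n, d = fit_plane(pts)
# n is +/- e_z and |d| = 0.5 up to numerical precision
```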
Figure 6. Calibration of laser plane with respect to camera coordinate.
3.3. Camera to Rotating Laser Calibration

When the line laser rotates about the rotation axis of the motor, the plane created by the line laser rotates about that axis as well. This means that the equation of the plane P_L generated by the line laser in the camera coordinates also changes with an angle θ. The equation of P_L is related to the rotation axis ω and the translation vector ^C_L t of the line laser expressed in the camera coordinates. The angle θ can be obtained through the incremental encoder on the motor. If the line laser is fixed, the parameters (n, q) for the
equation of the plane P_L are constant. However, in our system, the equation of the plane P_L depends on the parameters ^C_L t, ω and θ, because the line laser rotates about the rotation axis of the motor.
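To make the dependence of P_L on (^C_L t, ω, θ) concrete, the plane at angle θ can be obtained by rotating a home-position plane about the axis through ^C_L t with direction ω, using the Rodrigues rotation formula. This is an illustrative parameterization consistent with the text, not the authors' code:

```python
import numpy as np

def rodrigues(omega, theta):
    """Rotation matrix for angle theta (rad) about the unit axis omega."""
    K = np.array([[0.0, -omega[2], omega[1]],
                  [omega[2], 0.0, -omega[0]],
                  [-omega[1], omega[0], 0.0]])
    return np.eye(3) + np.sin(theta) * K + (1.0 - np.cos(theta)) * (K @ K)

def plane_at_angle(n0, q0, omega, t, theta):
    """Laser plane (n, q) after rotating the home-position plane (n0, q0)
    by theta about the axis through point t with direction omega."""
    R = rodrigues(omega, theta)
    return R @ n0, t + R @ (q0 - t)
```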
3.3.1. Find a Point on Rotating Axis

To estimate the translation vector ^C_L t, the various planes P_Lθ rotated through angle θ should be calculated. The implicit equations of the plane set with m poses can be expressed using Equation (5):

P_L1 : n_1^T (p − q_1) = 0
P_L2 : n_2^T (p − q_2) = 0
⋮
P_Lm : n_m^T (p − q_m) = 0

P_L1 ∩ P_L2 ∩ ⋯ ∩ P_Lm = ^C_L t.  (5)

If a plane rotates about an arbitrary axis in 3D space, there exists an intersection point through which all the planes pass, excluding a specific case (Figure 7a). We discuss this specific case with the simulation in Section 4.1. Moreover, the intersection point ^C_L t lies on the rotation axis of the motor. Because the parameters n_1...m, q_1...m of these planes P_L1...m contain noise, we obtained the position vector ^C_L t using the least-squares method, as represented by Equation (6).
Ax = [n_1^T, −n_1^T q_1; n_2^T, −n_2^T q_2; ⋯ ; n_m^T, −n_m^T q_m] [p; 1] ≈ 0.  (6)
This position vector p corresponds to the translation ^C_L t from the camera coordinate to the laser coordinate. In addition, Equation (6) can be interpreted geometrically as minimizing the distance from a common point to all of the planes. The shortest distance from a point to a plane is along the line perpendicular to the plane:
d = n^T (p − q) / ‖n‖_2 = n^T (p − q)  (∵ ‖n‖_2 = 1).  (7)
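Equation (6) can be solved with ordinary least squares by rewriting the i-th row as n_i^T p ≈ n_i^T q_i. A numpy sketch using synthetic planes that all pass through a known axis point:

```python
import numpy as np

def axis_point_from_planes(normals, points):
    """Least-squares common point of the rotated laser planes (Eq. (6)):
    minimize sum_i (n_i^T (p - q_i))^2 over p."""
    N = np.asarray(normals)                              # m x 3 unit normals
    b = np.einsum('ij,ij->i', N, np.asarray(points))     # b_i = n_i^T q_i
    p, *_ = np.linalg.lstsq(N, b, rcond=None)
    return p

# synthetic data: 20 planes, all containing the axis point t = (1, 2, 3)
t = np.array([1.0, 2.0, 3.0])
rng = np.random.default_rng(1)
normals = rng.normal(size=(20, 3))
normals /= np.linalg.norm(normals, axis=1, keepdims=True)
# q_i = t + (n_i x r_i) lies in plane i, since n_i is orthogonal to n_i x r_i
qs = t + np.cross(normals, rng.normal(size=(20, 3)))
p = axis_point_from_planes(normals, qs)
# p recovers t up to numerical precision
```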
Figure 7. (a) The laser plane at an angle to the rotating axis forms a cone. (b) The trace of the normal vectors of the rotating plane forms an inverted cone after their starting points are moved to a point on the rotating axis.

3.3.2. Find Rotating Orientation

Similar to estimating the translation of the rotation axis, we used {P_L} to estimate the direction of ω. The parameters constituting each equation of P_L are (n, q), where n is a normal vector of the plane P_L and q is a point on the plane. Because n is a free vector, its starting point can be freely moved. If the starting points of the normal vectors are moved to a point on the rotating axis, as shown in Figure 7b, one end of each normal vector points toward the origin, and the other end points to the base of a cone in 3D space. The shape of the cone is deformed according to the position of the axis of rotation and the position of the laser, but the central axis of the cone and the axis of rotation of the motor are always the same.

The central axis of the cone is always parallel with the normal vector of the base of the cone. Finding the normal vector of the base of the cone can therefore be regarded as the same task as finding the direction vector of the motor's axis of rotation. Let us denote this base plane of the inverted cone as P_B with ω^T (p − q_B) = 0. The process of finding the rotation axis direction ω from the collected planes {P_L} is as follows: we generate a matrix whose rows are the normal vectors n_1, n_2, …, n_m of the collected planes. Each row of this matrix refers to a point on the base plane (P_B) of the cone, so it satisfies the plane equation. Because we have m normal vectors, this can be expressed as follows:

P_B : ω^T (n_i − q_B) = 0
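Under this cone-model interpretation, fitting the base plane P_B to the normal-vector endpoints yields ω as the normal of that fitted plane. A numpy sketch (an illustration of the idea, not the authors' implementation) using an SVD of the centered normals:

```python
import numpy as np

def axis_direction_from_normals(normals):
    """Estimate the unit rotation-axis direction omega as the normal of the
    cone-base plane fitted to the laser-plane normals n_i (Section 3.3.2)."""
    N = np.asarray(normals)
    centered = N - N.mean(axis=0)      # endpoints lie on a circle in P_B
    _, _, Vt = np.linalg.svd(centered)
    omega = Vt[-1]                     # direction of least variance = normal
    return omega / np.linalg.norm(omega)

# synthetic cone: laser plane misaligned by a 10 degree half-angle, swept
# about the z axis in 10 degree steps (alpha is an assumed example value)
theta = np.linspace(0.0, 2.0 * np.pi, 36, endpoint=False)
alpha = np.deg2rad(10.0)
normals = np.stack([np.sin(alpha) * np.cos(theta),
                    np.sin(alpha) * np.sin(theta),
                    np.full_like(theta, np.cos(alpha))], axis=1)
omega = axis_direction_from_normals(normals)
# omega recovers the z axis (up to sign)
```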