A Method of Path Planning of Steel Mesh Based on Point Cloud for Welding

Yusen Geng, Center for Robotics, School of Control Science and Engineering, Shandong University, Jinan 250061, China
Yuankai Zhang, Center for Robotics, School of Control Science and Engineering, Shandong University, Jinan 250061, China
Xincheng Tian ([email protected]), Center for Robotics, School of Control Science and Engineering, Shandong University, Jinan 250061, China
Xiaorui Shi, Sinotruk Industry Park Zhangqiu, Sinotruk Jinan Power Co., Ltd., Jinan 250220, China
Xiujing Wang, Sinotruk Industry Park Zhangqiu, Sinotruk Jinan Power Co., Ltd., Jinan 250220, China
Yigang Cui, Sinotruk Industry Park Zhangqiu, Sinotruk Jinan Power Co., Ltd., Jinan 250220, China

Research Article

Keywords: Without teaching and programming, 3D structured light camera, Steel mesh point cloud, Welding path planning

Posted Date: April 7th, 2021

DOI: https://doi.org/10.21203/rs.3.rs-379414/v1

License: This work is licensed under a Creative Commons Attribution 4.0 International License.

Version of Record: A version of this preprint was published at The International Journal of Advanced Manufacturing Technology on July 15th, 2021. See the published version at https://doi.org/10.1007/s00170-021-07601-6.

A method of welding path planning of steel mesh based on point cloud for welding robot

Yusen Geng1,2 · Yuankai Zhang1,2 · Xincheng Tian1,2 (corresponding author) · Xiaorui Shi3 · Xiujing Wang3 · Yigang Cui3

Received: date / Accepted: date

Abstract  At present, operators need to carry out complicated teaching and programming work for the welding path of a welding robot before welding steel mesh. In this work, an automatic welding path planning method for steel mesh based on point cloud is proposed to simplify this teaching and programming work. The point cloud model of the steel mesh is obtained by a three-dimensional structured light camera. We then use point cloud processing algorithms to calculate the welding path of the steel mesh and obtain its 3D information for localization during the robot welding process. Experimental results show that the method accurately realizes the welding path planning of the steel mesh and accomplishes the welding task without teaching and programming before welding, which improves production efficiency.

Keywords  Without teaching and programming · 3D structured light camera · Steel mesh point cloud · Welding path planning

Corresponding author: Xincheng Tian, e-mail: [email protected]. Yusen Geng, e-mail: gys [email protected]

1 Center for Robotics, School of Control Science and Engineering, Shandong University, Jinan 250061, China
2 Engineering Research Center of Intelligent Unmanned System, Ministry of Education, Jinan 250061, China
3 Sinotruk Industry Park Zhangqiu, Sinotruk Jinan Power Co., Ltd., Jinan 250220, China

1 Introduction

With the rapid development of automation and robot technologies, welding robots are widely applied in welding environments to replace human work. The teaching-playback mode of welding robots still plays an important role in current industrial production. However, in this mode the operator needs to carry out complicated teaching and programming work on the welding path before welding, and this work places high demands on the operator's skill and accuracy. To overcome these problems, many researchers have studied weld extraction and welding planning using different sensors for different workpieces.

In the application and research of welding robots, the mainly used sensors include infrared sensors [1], RGB-D sensors [2,3] and vision sensors [4,5]. The vision sensors of welding robots can be divided into two-dimensional (2D) vision sensors and three-dimensional (3D) vision sensors. The use of a 2D vision sensor in the welding process mostly needs to cooperate with a laser sensor. For example, Wang et al. [6] proposed a method combining laser and vision sensors to identify V-shaped welds of oil pipelines through image processing, which can be used for subsequent trajectory planning. Xu et al. [7] designed a real-time welding seam tracking system based on laser and vision sensors, using an improved Canny algorithm to detect the edges of the seam and pool, which could better overcome the deficiencies of the seam tracking control of the teaching-playback mode during the welding process.

Compared with 2D vision sensors, 3D vision sensors can obtain the 3D coordinate information of the workpiece and complete the welding task more accurately. At present, linear structured light vision sensors and stereo vision sensors are the commonly used 3D vision sensors in robot welding tasks. Using linear structured light vision sensors, Zeng et al. [8] proposed a narrow butt 3D off-line welding path planning method based on a laser structured light sensor. Hou et al. [9] proposed a teaching-free welding method for robotic gas metal arc welding (GMAW) based on a laser structured light vision sensing system (LVSS), and performed experiments on V-grooves and fillet welds. 3D structured light can obtain global information of the welding environment, whereas laser structured light vision obtains only local information. Therefore, linear structured light is mostly used for online identification and tracking of the weld seam and is not suitable for off-line 3D path planning of the welding robot.

To overcome the deficiency of the linear structured light sensor and realize accurate and efficient off-line 3D path planning, generating a point cloud with a stereo structured light sensor and processing it with point cloud methods has become a new scheme for welding robot path planning without teaching and programming. Lei et al. [10] proposed a novel 3D path extraction method for weld seams based on point cloud, which can serve the 3D path teaching task before welding. Zhang et al. [11] proposed a point cloud based approach to recognize the working environment and locate the initial welding position using a laser stripe sensor.

Few studies address automatic welding path planning for welding robots without teaching and programming. With the development of society and infrastructure construction, structural parts such as steel cages and steel mesh are widely used. At the same time, the welding of steel mesh involves scenes with many crossing points, which makes the teaching process cumbersome. Therefore, using point cloud processing methods to plan the welding path of steel mesh is an important part of solving the complex teaching and programming problems of the welding robot before welding.

In this paper, an automatic welding path planning method for steel mesh based on point cloud is proposed, which realizes welding path planning without teaching and programming. Section 2 introduces the configuration of the experiment system; Section 3 illustrates the steps of point cloud preprocessing; Section 4 illustrates the steps of welding path planning; Section 5 presents the analysis of the experimental results; finally, the conclusion and outlook of this paper are given.

2 Experiment platform configuration and framework

2.1 Experiment system

The robot welding system of the experimental platform is shown in Fig. 1. It consists of two parts: the welding execution system and the 3D vision system [12]. The welding execution system includes the welding torch, the wire feeder, the manipulator, the robot controller and the cross steel mesh, and is used to complete the welding of the intersections of the steel mesh. The 3D vision system includes a 3D surface scanning structured light camera and an industrial personal computer (IPC), and is used to obtain the 3D information of the cross steel mesh in the camera field of vision.

Fig. 1 The robot welding system

The manipulator used in the experimental platform is a Universal Robots UR5, and the 3D surface scanning industrial camera is a Chishine Surface 120, as shown in Fig. 2. It should be noted that the installation position of the 3D surface scanning structured light camera needs to ensure that the welding torch does not enter the field of vision of the camera. The characteristics of the 3D surface scanning structured light camera are shown in Table 1.

2.2 Steel mesh model

In order to clearly introduce the method of planning the welding path of steel mesh, the steel mesh model used in this paper is shown in Fig. 3a. For the convenience of robot welding, there is a gap at the intersection of the upper and lower steel bars of the steel mesh model. The upper steel bars are supported by two fixed-size support plates on the left and right sides, and the relevant dimensions of the model are marked in the top view of the steel mesh, with the main optical axis of the camera as the main viewing direction. Fig. 3b shows the size of the steel mesh from the top view.

Fig. 3 The steel mesh model

Fig. 2 3D surface scanning industrial camera (Chishine Surface 120)

Table 1 The characteristics of the 3D surface scanning structured light camera

Case  Parameter                                   Value
1     Working principle                           Binocular structured light
2     Light source                                Infrared laser
3     Optimum working distance                    500 ± 250 mm
4     Maximum working distance                    750 mm
5     Field of view (FOV)                         H 52° × V 31°
6     Repeat accuracy                             ±0.5 mm
7     Depth map resolution @ max frame rate       1280×800 @ max 2 fps; 640×400 @ max 8 fps; 320×200 @ max 15 fps
8     Color resolution                            2560×1600 @ 20 fps
9     Point cloud output                          RGB-D
10    Shutter                                     1/142 s to 1/10 s
11    Gain                                        1× to 16×

2.3 System framework

During the process of welding path planning of steel mesh based on point cloud, the manipulator is moved to the steel mesh intersection position and the camera field of view is adjusted to a reasonable scope through the controller. Simultaneously, we record the shooting points {Pi | P1, P2, ..., Pn} and generate the shooting path. A point cloud image of the steel mesh is then formed by the 3D surface scanning structured light camera at each shooting point. The image is transferred to the IPC through the Micro-B interface. Finally, the IPC obtains the welding path of the steel mesh within the vision of the 3D camera through the relevant point cloud processing methods and sends the welding path to the robot controller. After the robot completes the welding task of the current shooting point according to the robot controller instructions, it continues with the welding task of the next shooting point [13]. The specific operation process is shown in Fig. 4.

Fig. 4 The specific operation process

It should be noted that, since the shooting range of the 3D surface scanning industrial camera is the coverage range of the structured light, if the angle between the main optical axis of the camera and the plane of the steel mesh is too large or too small, as shown in Fig. 5a, the captured point cloud will show the side surfaces of the steel bars, which will affect the accuracy of the subsequent welding path planning. Therefore, when recording the shooting point position, it is necessary to adjust and record the shooting posture of the camera to ensure that the angle between the main optical axis and the steel mesh plane is 90° ± 5°, as shown in Fig. 5b. The red area in Fig. 5 is the area that the camera can capture. The position of the shooting point and the posture of the camera can be determined through adjustment based on the number of displayed points of the steel mesh intersection in the camera's field of view.

Fig. 5 The diagram of 3D camera shooting posture

3 Point cloud preprocessing

After the 3D surface structured light camera takes pictures of the steel mesh at a shooting point, it forms a point cloud of the steel mesh within the camera's field of view at that shooting point. This is the initial point cloud of the steel mesh without any processing, as shown in Fig. 6. Compared with the preprocessed steel mesh point cloud, it has a complex background, many irrelevant features and a high point density. Therefore, in order to obtain a high-quality point cloud of the steel mesh, it is necessary to preprocess the initial point cloud.

Fig. 6 The initial point cloud of the steel mesh

3.1 Point cloud filtering

The initial steel mesh point cloud contains all the features in the camera's field of view. In order to prevent irrelevant features from interfering with the welding path planning of the steel mesh and to reduce the number of points to increase the calculation speed, the irrelevant features are filtered from the initial steel mesh point cloud using a pass-through filter. The principle of a pass-through filter is to perform a simple filtering along a specified dimension, that is, to cut off values that are either inside or outside a given user range.

In this method, we use the graphic display method to determine the filtering range. The 3D coordinate system of the point cloud is based on the camera position, with the coordinate origin at (0, 0, 0). Therefore, we display the steel mesh point cloud in this coordinate system through the software and determine the filtering range of the irrelevant feature points according to the display area of the point cloud on the x, y and z axes.

Fig. 7 The determination of the filtering range by the graphic display method

The filtering of the initial point cloud in this method mainly removes the point cloud of the support platform, so there is no need to filter along the z-axis. According to the display of the steel mesh point cloud on the right of Fig. 7, we can determine that the initial point cloud only needs to be filtered along the x-axis. The accepted interval of x-axis values is set to (−10, 140), and the rest are removed. The filtered point cloud of the steel mesh using a pass-through filter is shown in Fig. 8. The number of points in the point cloud is reduced from 176843 to 110777.

Fig. 8 The filtered point cloud of the steel mesh using a pass-through filter

In the welding of steel mesh with many intersections, it is impossible to display all the intersections in the camera's field of view at the same time. Therefore, it is necessary to record the shooting point and camera pose for multi-point shooting. Since the shooting distance and posture basically remain unchanged, the filtering parameter determined in the first shooting can be reused in the subsequent shootings. If the steel mesh point cloud already appears as in Fig. 8, without any support platform or other irrelevant feature points, the next step of point cloud plane segmentation can be carried out directly without point cloud filtering.
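The pass-through cut described above is simple enough to express in a few lines. The following is a minimal sketch in Python/NumPy (not the point cloud library code actually used in this work), assuming the cloud is an N×3 array of camera-frame coordinates in millimetres and using the (−10, 140) x-interval quoted above; the random array stands in for a real scan.

```python
import numpy as np

def pass_through_filter(points, axis=0, limits=(-10.0, 140.0)):
    """Keep only points whose coordinate along `axis` lies inside `limits`.

    points : (N, 3) array of XYZ coordinates in the camera frame (mm).
    axis   : 0 for x, 1 for y, 2 for z.
    """
    lo, hi = limits
    mask = (points[:, axis] >= lo) & (points[:, axis] <= hi)
    return points[mask]

# Example: drop the support-platform points that lie outside x in (-10, 140) mm.
raw_cloud = np.random.uniform(-50.0, 200.0, size=(176843, 3))  # stand-in for the raw scan
filtered_cloud = pass_through_filter(raw_cloud, axis=0, limits=(-10.0, 140.0))
```

In the point cloud library, the same effect is obtained with the PassThrough filter by setting the filter field name to "x" and the filter limits to the same interval.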

3.2 Background point cloud removal

After point cloud filtering, the irrelevant features in the steel mesh point cloud have been removed, and only the steel mesh point cloud and the background point cloud are retained. At this point, the independent steel mesh point cloud can be obtained by removing the background point cloud through point cloud segmentation. In this experiment, the steel mesh is placed on a plane, so the background plane point cloud is removed by the plane segmentation algorithm in the point cloud library.

When using the point cloud segmentation algorithm in the point cloud library, the first step is to create an object in the program. The second step is to define the model type. In this experiment, the model type is defined as a plane model (SACMODEL_PLANE) because the steel mesh is placed on a plane. The plane model contains four parameters, shown in Table 2, which determine the plane ax + by + cz + d = 0. The plane is obtained by using the Random Sample Consensus (RANSAC) method (SAC_RANSAC) as the robust estimator, chosen for its simplicity. Finally, the plane point cloud and the outlier point cloud are separated by setting a distance threshold: all points with a distance to the plane less than the threshold are regarded as interior points, and the others are regarded as outlier points. Whether the interior or the outlier points are retained can be chosen in the program.

Table 2 The specific meaning of the parameters in the plane model

Parameter   Meaning
normal_x    the x coordinate of the plane's normal
normal_y    the y coordinate of the plane's normal
normal_z    the z coordinate of the plane's normal
d           the fourth Hessian component of the plane's equation

The distance threshold is selected with the graphic display method by displaying the filtered steel mesh point cloud along the z-axis in the camera coordinate system. According to the point cloud along the z-axis in Fig. 9, it can be found that the thickness of the background plane point cloud is about 6 mm, so the distance threshold is set to 6 and the outlier point cloud is kept. It should be noted that the background point cloud has the same thickness at each shooting point, so the distance threshold can be reused. The steel mesh point cloud after background point cloud removal is shown in Fig. 10.

Fig. 9 The filtered steel mesh point cloud displayed along the z-axis in the camera coordinate system

Fig. 10 The steel mesh point cloud after background point cloud removal
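The plane-removal step can also be illustrated outside the point cloud library. The sketch below is a small NumPy RANSAC plane fit under the same 6 mm distance threshold; it returns the inlier mask of the background plane so that the outliers (the steel mesh) can be kept, mirroring how the segmentation described above is configured. It is an illustration of the idea under those assumptions, not the implementation used in the paper.

```python
import numpy as np

def ransac_plane_inliers(points, dist_thresh=6.0, iters=200, seed=0):
    """RANSAC fit of a plane a*x + b*y + c*z + d = 0; returns the boolean inlier mask."""
    rng = np.random.default_rng(seed)
    best_mask, best_count = np.zeros(len(points), dtype=bool), -1
    for _ in range(iters):
        p1, p2, p3 = points[rng.choice(len(points), 3, replace=False)]
        normal = np.cross(p2 - p1, p3 - p1)
        norm = np.linalg.norm(normal)
        if norm < 1e-9:                       # degenerate sample: nearly collinear points
            continue
        normal /= norm
        d = -normal @ p1
        dist = np.abs(points @ normal + d)    # point-to-plane distances
        mask = dist < dist_thresh
        if mask.sum() > best_count:
            best_count, best_mask = mask.sum(), mask
    return best_mask

# Keep the outliers (the steel mesh) and drop the background plane inliers:
# mesh_cloud = filtered_cloud[~ransac_plane_inliers(filtered_cloud, dist_thresh=6.0)]
```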
3.3 Independent steel bar point cloud acquisition

In the process of welding path planning of steel mesh, it is necessary to carry out straight line fitting of the steel bar point clouds and other operations. Therefore, points belonging to the same steel bar are grouped into one class to form an independent steel bar point cloud by a point cloud clustering method, which facilitates the subsequent operations.

The method of steel bar point cloud clustering is similar to the point cloud plane segmentation. After creating the object, the first step is to define the model type as a line model (SACMODEL_LINE). The line model contains six parameters, shown in Table 3, which jointly determine the straight line. The straight line is also obtained using the RANSAC method (SAC_RANSAC) as the robust estimator. Finally, the interior points and outlier points are determined by setting the distance threshold.

Table 3 The specific meaning of the parameters in the line model

Parameter           Meaning
point_on_line.x     the x coordinate of a point on the line
point_on_line.y     the y coordinate of a point on the line
point_on_line.z     the z coordinate of a point on the line
line_direction.x    the x coordinate of the line's direction
line_direction.y    the y coordinate of the line's direction
line_direction.z    the z coordinate of the line's direction

According to the point cloud after segmentation displayed in the camera x-O-y coordinate system in Fig. 11, it can be seen that the diameter of the steel bar is about 10 mm. Compared with the actual size of 9.8 mm, it can be determined that the maximum distance from a point of a bar to the straight line fitted by the RANSAC method will not exceed 10 mm. Hence, the distance threshold is set to 10 to ensure that all points belonging to the same steel bar are regarded as interior points. Through the program settings, the interior points of each line are retained and stored separately to obtain four independent steel bar point clouds. The clustered point cloud is shown in Fig. 12, where four colors represent the four clustered steel bars.

Fig. 11 The point cloud after segmentation displayed in the camera x-O-y coordinate system

Fig. 12 The point cloud after clustering
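The clustering by line models can be sketched in the same spirit: repeatedly fit a RANSAC line with the 10 mm threshold and peel off its inliers as one bar. This is a simplified stand-in for the SACMODEL_LINE segmentation described above, assuming the background plane has already been removed and that exactly four bars are visible.

```python
import numpy as np

def ransac_line_inliers(points, dist_thresh=10.0, iters=300, seed=1):
    """RANSAC line fit from two-point hypotheses; returns the boolean inlier mask."""
    rng = np.random.default_rng(seed)
    best_mask, best_count = np.zeros(len(points), dtype=bool), -1
    for _ in range(iters):
        a, b = points[rng.choice(len(points), 2, replace=False)]
        direction = b - a
        norm = np.linalg.norm(direction)
        if norm < 1e-9:
            continue
        direction /= norm
        diff = points - a
        # distance of every point to the infinite line through `a` with this direction
        dist = np.linalg.norm(diff - np.outer(diff @ direction, direction), axis=1)
        mask = dist < dist_thresh
        if mask.sum() > best_count:
            best_count, best_mask = mask.sum(), mask
    return best_mask

def split_into_bars(points, n_bars=4, dist_thresh=10.0):
    """Peel off one steel bar at a time; each bar is the inlier set of one fitted line."""
    remaining, bars = points.copy(), []
    for _ in range(n_bars):
        mask = ransac_line_inliers(remaining, dist_thresh)
        bars.append(remaining[mask])
        remaining = remaining[~mask]
    return bars
```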
4 Steps of welding path planning

Through the point cloud preprocessing operations, the steel mesh point cloud is divided into independent steel bar point clouds. On this basis, we obtain the welding point coordinates of the steel mesh and finally realize the welding path planning of the steel mesh.

It should be noted that the straight line fitting based on the SVD method is used for the independent steel bar point clouds, whereas the straight line fitting based on the RANSAC method is used for the clustering and segmentation of the steel mesh point cloud [14]. The main differences are as follows:

1) The RANSAC method, shown on the left side of Fig. 13, determines a line by two points selected from the sample points and obtains the line model from the interior points by setting a distance threshold. The algorithm is simple and the calculation speed is fast. However, the fitted straight line must pass through two of the sample points, which slightly reduces the fitting accuracy.

2) The SVD method, shown on the right side of Fig. 13, determines the line from the sample points by minimizing the distance between all sample points and the fitted line. The fitted line does not need to pass through any sample point, leading to a highly accurate fitting result.

Fig. 13 The principle of straight line fitting based on the RANSAC method and the SVD method

The purpose of clustering and segmentation of the steel mesh point cloud is to obtain independent steel bar point clouds. This process has low requirements on the accuracy of the straight line fitting and high requirements on the fitting speed; therefore, the straight line fitting based on the RANSAC method is selected. However, the purpose of fitting the line of an independent steel bar point cloud is to achieve the welding path planning. This process has high requirements on the fitting accuracy, so the straight line fitting based on the SVD method is selected.
4.1 Point cloud filtering based on radius outlier removal

As shown in Fig. 14, when the angle between the main optical axis of the 3D camera and the plane of the steel mesh is 90° ± 5°, the captured point cloud covers the upper half of each steel bar, and the point density becomes gradually sparser along the z-axis. The SVD method is used to carry out straight line fitting for each steel bar point cloud; according to the principle of the SVD method, the fitted straight line would then lie between the central axis and the upper surface of the steel bar point cloud.

Fig. 14 Three views of the steel mesh point cloud

From the top view of the crossed steel bar point cloud shown in Fig. 15, it can be seen that the common vertical line of the crossed steel bars runs from the straight line fitted to the upper surface point cloud of the lower steel bar to the straight line fitted to the upper surface point cloud of the upper steel bar. In order to facilitate the accurate solution of the common vertical line, it is necessary to constrain the fitted line of each steel bar point cloud to lie on its upper surface. Therefore, we need to remove the sparse points on both sides of the steel bar by point cloud filtering based on radius outlier removal. The remaining points on the upper surface of the steel bar are taken as the sample points for the line fitting.

Fig. 15 The top view of the crossed steel bar point cloud

According to the principle of radius outlier removal in Fig. 16, the number of other points within the radius d of each point is counted. When the number of other points within the radius is less than the set number, the point is removed. After repeated debugging in this experiment, the search radius is set to 2.5 and the minimum number of points is set to 12. The point cloud after filtering based on radius outlier removal is shown in Fig. 17.

Fig. 16 The principle of radius outlier removal

Fig. 17 Three views of the point cloud after filtering based on radius outlier removal
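A radius outlier filter with the same parameters (search radius 2.5, at least 12 neighbours) can be sketched with a k-d tree. The snippet below uses SciPy's cKDTree as a stand-in for the point cloud library filter; the neighbour-counting convention (whether the query point counts itself) is an assumption, since implementations differ on this detail.

```python
import numpy as np
from scipy.spatial import cKDTree

def radius_outlier_removal(points, radius=2.5, min_neighbors=12):
    """Drop points that have fewer than `min_neighbors` other points within `radius`."""
    tree = cKDTree(points)
    # For each point, query_ball_point returns the indices of all points within `radius`,
    # including the point itself (hence the +1 in the comparison below).
    neighbors = tree.query_ball_point(points, r=radius)
    counts = np.array([len(idx) for idx in neighbors])
    return points[counts >= min_neighbors + 1]
```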
4.2 Straight line fitting based on the SVD method

The idea of fitting a space line based on the SVD method is straightforward: minimize the distance from all sample points to the straight line. Firstly, we calculate the arithmetic average of all sample point coordinates (x̄, ȳ, z̄) according to Eq. 1; the fitted straight line must pass through this point.

\bar{x} = \frac{1}{n}\sum_{i=1}^{n} x_i, \quad \bar{y} = \frac{1}{n}\sum_{i=1}^{n} y_i, \quad \bar{z} = \frac{1}{n}\sum_{i=1}^{n} z_i    (1)

The difference matrix A between the coordinates of each sample point and the arithmetic average (x̄, ȳ, z̄) is

A = \begin{bmatrix} x_1-\bar{x} & y_1-\bar{y} & z_1-\bar{z} \\ x_2-\bar{x} & y_2-\bar{y} & z_2-\bar{z} \\ \vdots & \vdots & \vdots \\ x_n-\bar{x} & y_n-\bar{y} & z_n-\bar{z} \end{bmatrix}    (2)

With the singular value decomposition A = U S V^T, we have

A^T A = V S^T U^T U S V^T = V S^T S V^T = V \,\mathrm{diag}\!\left(\sigma_{\max}^2,\ldots,\sigma_{\min}^2\right) V^T    (3)

U is an n×n orthogonal matrix, S is the matrix of the r singular values arranged from large to small along the diagonal (r is the rank of A), and V is a 3×3 singular vector matrix whose columns are arranged according to the singular values from large to small. The direction of the fitted straight line is the singular vector corresponding to the maximum singular value. Therefore, the first column of V is selected as the direction of the fitted straight line and is denoted by the 3×1 vector Vd.

A straight line is defined by this direction and one point. The coordinates (xl, yl, zl) of all points on the line satisfy Eq. 4, where t is the relation variable between the point coordinates on the straight line and the arithmetic average of all sample point coordinates.

x_l = \bar{x} + V_d(1,1)\, t, \quad y_l = \bar{y} + V_d(2,1)\, t, \quad z_l = \bar{z} + V_d(3,1)\, t    (4)

The length of the fitted line can be determined from Eq. 4. The direction of a steel bar is of two types: extending along the x-axis or along the y-axis. When the steel bar extends along the x-axis, we first select the x coordinates of the outermost points at both ends of the steel bar point cloud in the extension direction as the x coordinates of the two endpoints of the fitted line. Then we substitute the x coordinate of each endpoint into the x expression in Eq. 4 and find the corresponding values of t, which are t1 and t2 [15]. Finally, we substitute t1 and t2 into the y and z expressions in Eq. 4. At this point, we can determine the coordinates of the two endpoints and the length of the fitted line. The relation variable t for all points between the two endpoints of the line takes values between t1 and t2. The fitted straight line of a steel bar point cloud extending along the y-axis is solved in the same way. The straight line fitting based on the SVD method is shown in Fig. 18.

Fig. 18 Three views of the straight line fitting based on the SVD method
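Equations 1-4 translate almost directly into NumPy. The sketch below returns the centroid and unit direction of the fitted line and, for a bar that extends along the x-axis, the two endpoints obtained from the outermost x coordinates as described above; it is a minimal illustration rather than the exact implementation used in this work.

```python
import numpy as np

def fit_line_svd(points):
    """Least-squares spatial line fit (Eqs. 1-3): returns the centroid and a unit direction."""
    centroid = points.mean(axis=0)                      # Eq. 1
    A = points - centroid                               # Eq. 2
    _, _, vt = np.linalg.svd(A, full_matrices=False)    # Eq. 3
    direction = vt[0]                                   # right singular vector of the largest singular value
    return centroid, direction / np.linalg.norm(direction)

def endpoints_along_x(points, centroid, direction):
    """Endpoints of the fitted segment for a bar extending along the x-axis (Eq. 4)."""
    x_min, x_max = points[:, 0].min(), points[:, 0].max()
    t1 = (x_min - centroid[0]) / direction[0]           # t of the first endpoint
    t2 = (x_max - centroid[0]) / direction[0]           # t of the second endpoint
    return centroid + t1 * direction, centroid + t2 * direction
```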
4.3 Finding the common vertical line

After obtaining the fitted straight lines of the steel bar point clouds, we need to find their common vertical line. As shown in Fig. 19, the fitted straight lines of steel bar 1 and steel bar 3 are AB and CD, and MN is their common vertical line, where M is the foot point on AB and N is the foot point on CD.

Fig. 19 The schematic diagram of finding the common vertical line

The coordinates of the four points A (xa, ya, za), B (xb, yb, zb), C (xc, yc, zc) and D (xd, yd, zd) are calculated according to Eq. 4. The relationship between the straight lines AM and AB, and between CN and CD, is

\vec{AM} = k_1 \vec{AB} = k_1 (x_b-x_a,\; y_b-y_a,\; z_b-z_a), \quad \vec{CN} = k_2 \vec{CD} = k_2 (x_d-x_c,\; y_d-y_c,\; z_d-z_c)    (5)

By solving Eq. 5, the coordinates of the foot points M (xm, ym, zm) and N (xn, yn, zn) are expressed as

(x_m, y_m, z_m) = \big(k_1(x_b-x_a)+x_a,\; k_1(y_b-y_a)+y_a,\; k_1(z_b-z_a)+z_a\big),
(x_n, y_n, z_n) = \big(k_2(x_d-x_c)+x_c,\; k_2(y_d-y_c)+y_c,\; k_2(z_d-z_c)+z_c\big)    (6)

Substituting Eq. 6 into Eq. 7, which states that the inner product of two perpendicular vectors is 0, we obtain k1 and k2 and therefore the coordinates of the foot points M and N.

\vec{AB} \cdot \vec{MN} = 0, \quad \vec{CD} \cdot \vec{NM} = 0    (7)

Through the foot points M and N we can determine a straight line, which is the common vertical line of the straight lines AB and CD. The coordinates of the points between M and N on the common vertical line conform to Eq. 8, where l is the relation variable. The common vertical line of the fitted lines is shown in Fig. 20.

x_{mn} = l\,(x_n-x_m)+x_m, \quad y_{mn} = l\,(y_n-y_m)+y_m, \quad z_{mn} = l\,(z_n-z_m)+z_m, \quad 0 \le l \le 1    (8)

Fig. 20 Three views of the common vertical line of the fitted lines
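The foot points M and N of Eqs. 5-7 are the solution of a 2×2 linear system: requiring MN to be perpendicular to both fitted lines fixes k1 and k2. A minimal sketch, assuming the two fitted lines are not parallel:

```python
import numpy as np

def common_perpendicular_feet(A, B, C, D):
    """Foot points M (on line AB) and N (on line CD) of the common perpendicular (Eqs. 5-7)."""
    u, v, w = B - A, D - C, C - A
    # MN = (C + k2*v) - (A + k1*u); MN.u = 0 and MN.v = 0 give a 2x2 system in (k1, k2).
    lhs = np.array([[u @ u, -(u @ v)],
                    [u @ v, -(v @ v)]])
    rhs = np.array([w @ u, w @ v])
    k1, k2 = np.linalg.solve(lhs, rhs)      # singular only if the lines are parallel
    M = A + k1 * u                          # Eq. 6
    N = C + k2 * v                          # Eq. 6
    return M, N
```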

4.4 Finding the gap width of the crossed steel bars

The welding path planning is determined according to the gap width of the crossed steel bars. When the gap width is less than 2 mm, the steel mesh welding adopts spot welding; when the gap width is greater than 2 mm, arc welding is adopted. Therefore, before determining the welding path, the gap width of the crossed steel bars must be calculated first.

After obtaining the coordinates of the two endpoints of the common vertical line MN, its length is obtained according to Eq. 9.

D_{MN} = \sqrt{(x_m-x_n)^2 + (y_m-y_n)^2 + (z_m-z_n)^2}    (9)

The length of the common vertical line MN is the distance between the upper surface of the lower steel bar and the upper surface of the upper steel bar. Therefore, the gap width between the crossed bars is obtained by subtracting the diameter d_up of the upper steel bar from the length D_MN of the common vertical line MN. The upper steel bar diameter has been accurately measured before welding.

D_{FN} = D_{MN} - d_{up}    (10)

After getting the gap width of the crossed steel bars, we begin to plan the welding path. As shown in Fig. 21, point P is the midpoint of the gap between the upper and lower steel bars, and point E1 is obtained by offsetting point P along the \vec{NC} direction by the upper steel bar radius, so the length of PE1 is the radius d_up/2 of the upper steel bar.

Fig. 21 The schematic diagram of welding path planning

Point P is located on the common vertical line MN, so its coordinates conform to Eq. 8. We find the value of l corresponding to point P according to Eq. 11 and then substitute it into Eq. 8 to obtain the coordinates of point P (xp, yp, zp).

l_P = \frac{d_{up} + D_{FN}/2}{D_{MN}}    (11)

The coordinates of point E1 (x_{E1}, y_{E1}, z_{E1}) can be calculated according to Eq. 12, which states that the coordinates of two parallel vectors are proportional to each other.

\frac{x_{E1}-x_p}{x_c-x_n} = \frac{y_{E1}-y_p}{y_c-y_n} = \frac{z_{E1}-z_p}{z_c-z_n}    (12)

According to Eq. 12, the relationship between y_{E1} and x_{E1}, and between z_{E1} and x_{E1}, can be written as

y_{E1} = \frac{(x_{E1}-x_p)(y_c-y_n)}{x_c-x_n} + y_p, \quad z_{E1} = \frac{(x_{E1}-x_p)(z_c-z_n)}{x_c-x_n} + z_p    (13)

Substituting Eq. 13 into the |\vec{PE_1}| distance calculation in Eq. 14, we get the x coordinate x_{E1} of point E1.

|\vec{PE_1}| = d_{up}/2 = \sqrt{(x_{E1}-x_p)^2 + (y_{E1}-y_p)^2 + (z_{E1}-z_p)^2}    (14)

Substituting x_{E1} back into Eq. 12, we obtain the proportional relationship and then the coordinates (x_{E1}, y_{E1}, z_{E1}) of point E1.

Point F1 is obtained by offsetting point P along the \vec{MB} direction by the lower steel bar radius. The coordinates of point F1 are calculated in the same way as for point E1. Similar to Eq. 12, we get

y_{F1} = \frac{(x_{F1}-x_p)(y_b-y_m)}{x_b-x_m} + y_p, \quad z_{F1} = \frac{(x_{F1}-x_p)(z_b-z_m)}{x_b-x_m} + z_p    (15)

Substituting Eq. 15 into the |\vec{PF_1}| distance calculation in Eq. 16, we get the x coordinate x_{F1} of point F1. Then, substituting x_{F1} into the vector parallel formula, we obtain the coordinates (x_{F1}, y_{F1}, z_{F1}) of point F1.

|\vec{PF_1}| = d_{down}/2 = \sqrt{(x_{F1}-x_p)^2 + (y_{F1}-y_p)^2 + (z_{F1}-z_p)^2}    (16)

Point G1 is obtained by offsetting point P along the \vec{NC} direction by the upper steel bar radius and then along the \vec{MB} direction by the lower steel bar radius. Eq. 17 follows from the fact that \vec{E_1G_1} is parallel to \vec{PF_1} and has the same length; the coordinates (x_{G1}, y_{G1}, z_{G1}) of point G1 are then obtained from Eq. 17. Similarly, we can get the coordinates of point F2 and point H1.

\frac{x_{G1}-x_{E1}}{x_{F1}-x_P} = \frac{y_{G1}-y_{E1}}{y_{F1}-y_P} = \frac{z_{G1}-z_{E1}}{z_{F1}-z_P} = 1    (17)
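Written with vectors, Eqs. 9-17 reduce to a few offsets from the gap midpoint P. The sketch below computes the gap width, P, and the welding points E1, F1 and G1 exactly as the equations describe; the point H1, which Sect. 4.5 uses as the start of the arc-welding pass, is taken here as the mirror of G1 about the line PE1 — an assumption, since its construction is not spelled out explicitly in the text. The 2 mm spot/arc decision of Sect. 4.5 is included for convenience.

```python
import numpy as np

def plan_weld_points(M, N, B, C, d_up, d_down):
    """Gap width and welding points at one crossing (vector form of Eqs. 9-17).

    M, N : foot points of the common perpendicular on lines AB and CD
    B, C : endpoints of the fitted lines, defining the MB and NC offset directions
    d_up, d_down : measured diameters of the upper and lower steel bars (mm)
    """
    D_MN = np.linalg.norm(M - N)                      # Eq. 9
    D_FN = D_MN - d_up                                # Eq. 10, gap width
    P = M + (d_up + D_FN / 2.0) / D_MN * (N - M)      # Eqs. 8 and 11, gap midpoint
    nc = (C - N) / np.linalg.norm(C - N)              # NC direction (Eq. 12)
    mb = (B - M) / np.linalg.norm(B - M)              # MB direction (Eq. 15)
    E1 = P + (d_up / 2.0) * nc                        # Eqs. 12-14
    F1 = P + (d_down / 2.0) * mb                      # Eqs. 15-16
    G1 = E1 + (F1 - P)                                # Eq. 17
    H1 = E1 - (F1 - P)                                # assumed mirror point, start of the weld pass
    mode = "spot welding" if D_FN < 2.0 else "arc welding"   # decision rule of Sect. 4.5
    return {"gap": D_FN, "P": P, "E1": E1, "F1": F1, "G1": G1, "H1": H1, "mode": mode}
```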

4.5 Welding path planning

The final step is to plan the welding path, which can be divided into the following two situations.

1) When the gap width D_FN between the crossed bars is less than 2 mm, the steel mesh welding adopts spot welding, and the welding point of the welding torch is E1.

2) When the gap width D_FN between the crossed bars is greater than 2 mm, the steel mesh welding adopts arc welding. The initial point of the welding path is H1. The welding torch then passes through point E1 along the direction of \vec{H_1G_1} in a straight line and finally reaches the endpoint G1 of the welding path to accomplish the primary welding [16]. If the gap width D_FN between the crossed bars is too large, it is necessary to weld the path several times according to the actual situation. For example, after completing one welding pass, the welding torch is at point G1; it then passes through point E1 along the direction of \vec{G_1H_1} in a straight line and finally reaches the endpoint H1 to complete the secondary welding. The actual number of welding passes can be set freely according to the gap width D_FN. The positions of the welding points required for the welding path planning are shown in Fig. 22.

Fig. 22 The positions of welding points required for welding path planning

5 Experiments and results

After obtaining the welding path of the steel mesh, we verify the feasibility, efficiency and accuracy of the welding path planning method through error analysis, method efficiency analysis and welding platform experiment results.

5.1 Method error analysis

The error analysis verifies whether the welding path planning method of the steel mesh based on point cloud meets the welding accuracy requirements and validates its feasibility. Through the welding path planning, we determine the coordinates of the points required for the welding path planning. We use the relevant points to calculate the gap width of the crossed steel bars, compare it with the actual gap width and calculate the error [17]. Finally, we judge the feasibility of the welding path planning method of the steel mesh based on point cloud according to this error. In this experiment, there are four crossed steel bar intersections in the field of view of the 3D camera, generating four welding positions. Through the calculation of the point cloud taken by the 3D camera, the coordinates of the two endpoints of the gap between the crossed steel bars at the four welding positions are calculated as shown in Table 4.

Table 4 The coordinates of the two endpoints of the gap at the four welding positions

Welding position  Point  x        y         z
1                 F      115.603  -51.223   382.744
1                 N      115.828  -50.952   385.781
2                 F      52.914   29.583    380.185
2                 N      53.151   29.870    383.414
3                 F      52.254   -51.159   387.428
3                 N      52.492   -50.873   390.644
4                 F      115.194  25.2727   376.003
4                 N      115.412  25.5382   378.973

According to the coordinates of the two endpoints of the gap, we get the gap width and error of the four welding positions. The results are shown in Table 5. Comparing the calculated gap width of the crossed bars with the actual gap width, the maximum error is 0.75 mm, the minimum error is 0.49 mm, and the average error is 0.635 mm. According to Table 1, the repeat accuracy of the camera is ±0.5 mm [18]. Taking the shooting error of the 3D camera into account, it can be concluded that the error between the calculated gap width and the actual gap width of the crossed steel bars meets the accuracy requirements. Therefore, the welding path planning method of the steel mesh based on point cloud is feasible and meets the welding accuracy requirements.

Table 5 The gap width and error of the four welding positions

Welding position  Calculated gap width (mm)  Actual gap width (mm)  Error (mm)
1                 3.06                       2.5                    0.56
2                 3.25                       2.5                    0.75
3                 3.24                       2.5                    0.74
4                 2.99                       2.5                    0.49

5.2 Method efficiency analysis

Running time is a key factor reflecting the performance of the method. Because there is a large number of shooting points in the industrial field, there are requirements on the time of point cloud generation and welding path planning for each shot. Through many experiments, we recorded the total time for the welding path planning at the four welding positions, as shown in Table 6.

Table 6 The total time for the welding path planning at four welding positions

Case  Step                                        Running time (ms)
1     3D camera shooting to form a point cloud    52
2     Path planning                               2875
3     The whole process                           2927

The process of welding path planning at four welding positions takes about 3 s. Therefore, the efficiency of this method can fully adapt to the needs of industrial production.

5.3 Welding platform experiment results analysis

Table 7 summarizes the welding point coordinates required for the four welding positions in the 3D camera field of view obtained by the welding path planning method based on point cloud.

Table 7 The welding point coordinates required for the four welding positions

Welding position  Point  x       y       z
1                 H1     120.63  -55.97  384.34
1                 G1     110.86  -55.96  385.06
1                 E1     115.74  -55.97  384.70
2                 H1     57.87   24.51   381.91
2                 G1     48.12   25.18   382.56
2                 E1     52.93   24.85   382.24
3                 H1     57.22   -55.90  389.11
3                 G1     47.45   -55.89  389.83
3                 E1     52.33   -55.90  389.47
4                 H1     120.21  20.19   377.60
4                 G1     110.45  20.86   378.25
4                 E1     115.33  20.52   377.92

The welding robot needs reasonable and accurate hand-eye calibration to realize accurate positioning of the front end of the welding torch at the welding points [19]. The principle of hand-eye calibration in this experiment is shown in Fig. 23.

Fig. 23 The principle of eye-in-hand calibration

In the eye-in-hand calibration method, Eq. 18 holds for any two postures of the robot during its motion.

T^{Base}_{End_2} \, T^{End_2}_{Camera_2} \, T^{Camera_2}_{Object} = T^{Base}_{End_1} \, T^{End_1}_{Camera_1} \, T^{Camera_1}_{Object}    (18)

According to Eq. 18, the external matrix T^{End}_{Camera} with the smallest error after multiple calibrations is selected as

T^{End}_{Camera} = \begin{bmatrix} 0.9997 & 0.0221 & -0.0051 & -40.7215 \\ -0.0221 & 0.9996 & 0.0147 & 115.4470 \\ 0.0054 & -0.0146 & 0.9999 & -126.3277 \\ 0.0000 & 0.0000 & 0.0000 & 1.0000 \end{bmatrix}    (19)

We record the shooting posture (-272.08, 647.67, 419.28, -174.76, 3.37, 20.09) of the 3D camera. According to the shooting posture and the external matrix, we acquire the coordinates of the front end of the welding torch corresponding to the welding points at the four welding positions in the robot base coordinate system. The results are shown in Table 8.

Table 8 The coordinates of the front end of the welding torch corresponding to the welding points acquired by hand-eye calibration

Welding position  Point  x        y       z
1                 H1     -201.54  632.45  151.17
1                 G1     -210.68  628.92  151.06
1                 E1     -206.11  630.68  151.11
2                 H1     -230.77  534.64  151.21
2                 G1     -239.65  530.49  151.11
2                 E1     -235.21  532.57  151.16
3                 H1     -260.85  609.55  150.35
3                 G1     -270.00  606.02  150.24
3                 E1     -265.42  607.78  150.29
4                 H1     -174.02  561.12  151.97
4                 G1     -182.90  556.98  151.87
4                 E1     -178.46  559.05  151.92

We then acquire the coordinates of the front end of the welding torch corresponding to the welding points at the four welding positions in the robot base coordinate system by manual teaching. The results are shown in Table 9.

Table 9 The coordinates of the front end of the welding torch corresponding to the welding points acquired by manual teaching

Welding position  Point  x        y       z
1                 H1     -200.93  632.11  150.93
1                 G1     -210.02  628.75  151.20
1                 E1     -205.7   630.65  150.97
2                 H1     -230.5   534.41  150.73
2                 G1     -239.41  530.29  151.40
2                 E1     -235.28  532.22  151.55
3                 H1     -260.44  609.32  150.13
3                 G1     -269.65  605.83  149.98
3                 E1     -265.46  607.40  150.18
4                 H1     -174.08  560.7   152.21
4                 G1     -182.48  556.69  152.09
4                 E1     -178.48  558.52  151.38

After reasonable planning of the robot posture, the welding point coordinates in Table 8 are sent to the welding robot through the controller. The welding robot then drives the front end of the welding torch to accurately reach each welding point and executes the welding task according to the planned welding path.

Finally, we analyze the error between the corresponding welding point coordinates in Table 8 and Table 9. Fig. 24a, 24b and 24c respectively show the x, y and z axis errors between the welding point coordinates acquired by the method of this article and by manual teaching; these errors are within ±0.6 mm. Fig. 24d shows the distance error between the welding points acquired by the two approaches, which is within 1 mm.

Fig. 24 The error analysis between the welding point coordinates acquired by the article method and manual teaching

Through the actual operation of the experiments and the error analysis, we found that all the errors are not more than 1 mm and within the allowable range, which does not affect the welding effect. It is verified that the method in this paper can realize accurate welding path planning without teaching and programming.

6 Conclusion

This paper studies a method of welding path planning of steel mesh based on point cloud for a welding robot, which lays the foundation for accurate planning of the steel mesh welding path and independent welding while eliminating the complicated teaching and programming work in welding path planning. The main contributions of this paper are summarized as follows.

1) The application of the 3D surface scanning structured light camera to the industrial welding scene can quickly and conveniently obtain the point cloud of the workpiece, which improves the welding efficiency.

2) The method solves the complicated teaching and programming problem of the welding robot before welding the steel mesh. Through the combination of the point cloud library and mathematical theory, we can accurately plan the welding path of the steel mesh and complete the welding task without teaching and programming before welding.
3) We verify the feasibility, efficiency and accuracy of the welding path planning method of steel mesh based on point cloud through analysis of the method error, the method efficiency and the welding platform experiment results.

In future work, we will improve and complete our work. The proposed method also has some weaknesses; for example, it is only suitable for steel mesh workpieces. We will improve our method to adapt to different welding scenarios.

Declarations

Ethical Approval Not applicable.

Consent to Participate Not applicable.

Consent to Publish Not applicable.

Authors Contributions Yusen Geng was a major contributor in writing the manuscript. All authors read and approved the final manuscript.

Funding The authors gratefully acknowledge the research funding of the Shandong Provincial Key Research and Development Program (Major Scientific and Technological Innovation Project) under Grant No. 2019JZZY010441.

Competing Interests The authors declare that they have no conflict of interest.

Availability of data and materials Not applicable.

References

1. Zhu J, Wang J, Su N, Xu G and Yang M, An infrared visual sensing detection approach for swing arc narrow gap weld deviation, Journal of Materials Processing Technology, 243, 258-268 (2017)
2. Rodriguez-Martin M, Rodriguez-Gonzalvez P, Gonzalez-Aguilera D and Fernandez-Hernandez J, Feasibility Study of a Structured Light System Applied to Welding Inspection Based on Articulated Coordinate Measure Machine Data, IEEE Sensors Journal, 48, 4217-4224 (2017)
3. Ahmed SM, Tan YZ, Lee GH, Chew CM and Pang CK, Object detection and motion planning for automated welding of tubular joints, in Intelligent Robots and Systems (IROS), 2016 IEEE/RSJ International Conference on. IEEE, 2610-2615 (2016)
4. Li Y, Li YF, Wang QL, Xu D and Min T, Measurement and Defect Detection of the Weld Bead Based on Online Vision Inspection, IEEE Transactions on Instrumentation and Measurement, 59, 1841-1849 (2010)
5. Liu Y and Zhang Y, Iterative local anfis-based human intelligence modeling and control in pipe gtaw process: a data-driven approach, IEEE/ASME Trans. Mechatron, 20, 1079-1088 (2010)
6. Wang LM, Yin Y and Yan XL, The Application Research of Welding Line Distinguish Method Based on V Structure Laser Line, in Artificial Intelligence and Electromechanical Automation (AIEA), 2020 International Conference on. IEEE, 57-62 (2020)
7. Xu YL, Yu HW, Zhong JY, Lin T and Chen SB, Real-time seam tracking control technology during welding robot GTAW process based on passive vision sensor, Journal of Materials Processing Technology, 212, 1654-1662 (2012)
8. Zeng JL, Chang BH, Du D, Peng GD and Shan JG, A Vision-Aided 3D Path Teaching Method before Narrow Butt Joint Welding, Sensors, 17, 1099-1114 (2017)
9. Hou Z, Xu YL, Xiao RQ and Chen SB, A teaching-free welding method based on laser visual sensing system in robotic GMAW, The International Journal of Advanced Manufacturing Technology, 109, 1755-1774 (2020)
10. Yang L, Li E, Long T, Fan JF and Liang ZZ, A Novel 3-D Path Extraction Method for Arc Welding Robot Based on Stereo Structured Light Sensor, IEEE Sensors Journal, 19, 763-773 (2018)
11. Zhang LZ, Xu YL, Du SF, Zhao WJ, Hou Z and Chen SB, Point Cloud Based Three-Dimensional Reconstruction and Identification of Initial Welding Position, Transactions on Intelligent Welding Manufacturing, 1, 61-77 (2018)
12. Yang L, Liu YH, Peng JZ and Liang ZZ, A novel system for off-line 3D seam extraction and path planning based on point cloud segmentation for arc welding robot, Robotics and Computer Integrated Manufacturing, 64 (2020)
13. Yang L, Liu YH and Peng JZ, Advances techniques of the structured light sensing in intelligent welding robots: a review, The International Journal of Advanced Manufacturing Technology, 110, 1027-1046 (2020)
14. Wang XF, Zhang XQ, Ren XK, Li LF and Feng HJ, Point cloud 3D parent surface reconstruction and weld seam feature extraction for robotic grinding path planning, The International Journal of Advanced Manufacturing Technology, 107, 827-841 (2020)
15. Wang NF, Zhong KF, Shi XD and Zhang XM, A robust weld seam recognition method under heavy noise based on structured-light vision, Robotics and Computer Integrated Manufacturing, 61 (2020)
16. Lei T, Huang Y, Shao WJ, Liu WN and Rong YM, A tactual weld seam tracking method in super narrow gap of thick plates, Robotics and Computer Integrated Manufacturing, 62 (2020)
17. Wu KX, Wang TQ, He JJ, Liu Y and Jia ZW, Autonomous seam recognition and feature extraction for multi-pass welding based on laser stripe edge guidance network, The International Journal of Advanced Manufacturing Technology, 111, 2719-2731 (2020)
18. Dong ZX, Mai ZH, Yin SQ, Wang J, Yuan J and Fei YN, A weld line detection robot based on structure light for automatic NDT, The International Journal of Advanced Manufacturing Technology, 111, 1831-1845 (2020)
19. Du RQ, Xu YL, Yin SQ, Hou Z, Shu J and Chen SB, Strong noise image processing for vision-based seam tracking in robotic gas metal arc welding, The International Journal of Advanced Manufacturing Technology, 101, 2135-2149 (2019)
