

Article

A Camera-Based Position Correction System for Autonomous Production Line Inspection

Amit Kumar Bedaka 1,* , Shao-Chun Lee 1, Alaa M. Mahmoud 1, Yong-Sheng Cheng 1 and Chyi-Yeu Lin 1,2,3,*

1 Department of Mechanical Engineering, National Taiwan University of Science and Technology, Taipei 106, Taiwan; [email protected] (S.-C.L.); [email protected] (A.M.M.); [email protected] (Y.-S.C.)
2 Center for Cyber-Physical System Innovation, National Taiwan University of Science and Technology, Taipei 106, Taiwan
3 Taiwan Building Technology Center, National Taiwan University of Science and Technology, Taipei 106, Taiwan
* Correspondence: [email protected] (A.K.B.); [email protected] (C.-Y.L.)

Abstract: Visual inspection is an important task in manufacturing industries in order to evaluate the completeness and quality of manufactured products. An autonomous robot-guided inspection system was recently developed based on an offline programming (OLP) and RGB-D model system. This system allows a non-expert automatic optical inspection (AOI) engineer to easily perform inspections using scanned data. However, if there is a positioning error due to displacement or rotation of the object, this system cannot be used on a production line. In this study, we developed an automated position correction module to locate an object's position and correct the robot's pose and position based on the detected error values in terms of displacement or rotation. The proposed module comprised an automatic hand–eye calibration and the PnP algorithm. The automatic hand–eye calibration was performed using a calibration board to reduce manual error. After calibration, the PnP algorithm calculates the object position error using artificial marker images and compensates for the error to a new object on the production line. The position correction module then automatically maps the defined AOI target positions onto a new object, unless the target position changes. We performed experiments that showed that the robot-guided inspection system with the position correction module effectively performed the desired task. This smart innovative system provides a novel advancement by automating the AOI process on a production line to increase productivity.

Keywords: automatic optical inspection (AOI); offline programming; position correction; image processing; production line; robot; camera-based system

Received: 30 May 2021; Accepted: 12 June 2021; Published: 13 June 2021

1. Introduction

The development of industrial robots became quite mature by the 1970s, and many companies have been gradually introducing automated production technology to assist production lines ever since [1]. The automation of the production line has enabled the manufacturing process to be more efficient, data-based, and unified. Automation has heavily influenced many processes, from loading and unloading objects to sorting, assembly, and packaging [2]. In 2012, the German government proposed the concept of Industry 4.0 [3]. Many companies have started to develop automation technologies, from upper layer software or hardware providers and middle layer companies that store, analyze, manage, and provide solutions, to lower layer companies, so that they can apply these technologies in their factories [4]. In addition, a smarter integrated sensing and control system based on existing automation technologies has begun to evolve as part of the development of a highly automated or even fully automated production model [5].

Sensors 2021, 21, 4071. https://doi.org/10.3390/s21124071

In modern automation technology, development is mainly based on a variety of algorithms and robot arm control systems [6]. The robotic arm has the advantages of high precision, fast movement, and versatile motion. In advanced applications, extended sensing modules, such as vision modules, distance sensors, or force sensors, can be integrated with the robotic arm to give human-like senses to robotic systems [7]. Robot systems in general have the ability to "see" things through various vision systems [8]. These help to detect and locate objects, which speeds up the technical process. In addition, they improve the processing accuracy and expand the range of applications for automated systems. Vision modules can be broadly divided into 2D vision systems (RGB/mono camera systems), which acquire planar images [9], and 3D vision systems (RGB-D camera systems), which acquire depth images [10]. These two systems have their own advantages, disadvantages, and suitable application scenarios. In the field of industrial manufacturing automation, the 2D vision system is commonly used because it offers high precision and is easy to install and use. The robotic arm with a 2D vision camera is the most commonly used setup in today's industrial automation applications. Pick and place [11] and random bin picking [12] are among the most frequent applications that use an industrial camera with a manipulator.

Several open-source and commercial toolboxes, such as OpenCV [13] and Halcon [14], are already available for use in vision systems. However, regardless of how mature the software support is, considerable hardware setup and human intervention are still needed to complete the process. Camera calibration is needed before operation and is applied to various image processing algorithms so that important parameters, such as the focal length, image center, intrinsic and extrinsic parameters, and lens distortion [15,16], can be obtained. Furthermore, camera calibration is a challenging and time-consuming process for an operator unfamiliar with the characteristics of the camera on the production line. Camera calibration is a major issue that is not easily solved in the automation industry and complicates the introduction of production line automation.

Traditional vision-based methods [17–20] require 3D fixtures corresponding to a reference coordinate system to calibrate a robot. These methods are time-consuming, inconvenient, and may not be feasible in some applications. A camera mounted on a robotic arm or installed in the working environment can be categorized as eye-in-hand (camera-in-hand) or eye-to-hand (stand-alone) calibration, respectively [21]. The purpose of hand–eye calibration is similar to that of robot arm end-point tool calibration (TCP calibration), which obtains a conversion homogeneous matrix between the robot end-effector and the tool end-point [22]. However, unlike TCP calibration, the hand–eye transformation cannot be obtained simply by using a tool to touch fixed points at different positions. To obtain a hand–eye conversion matrix, visual systems use different methods, such as the parametrization of a stochastic model [23] and dual-quaternion parameterization [24], since the actual image center cannot be used. For self-calibration methods, the camera is rigidly linked to the robot end-effector [25].
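For readers unfamiliar with this step, the snippet below is a minimal sketch of chessboard-based intrinsic calibration with OpenCV; the board size, square size, and image folder are illustrative assumptions rather than values taken from this work.

```python
import glob

import cv2
import numpy as np

# Assumed 8 x 6 inner-corner chessboard with 25 mm squares (illustrative values).
pattern_size = (8, 6)
square_size_mm = 25.0

# 3D corner coordinates in the board frame (Z = 0 plane).
objp = np.zeros((pattern_size[0] * pattern_size[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern_size[0], 0:pattern_size[1]].T.reshape(-1, 2) * square_size_mm

obj_points, img_points, image_size = [], [], None
for path in glob.glob("calib_images/*.png"):   # hypothetical image folder
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern_size)
    if not found:
        continue
    corners = cv2.cornerSubPix(
        gray, corners, (11, 11), (-1, -1),
        (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 30, 1e-3))
    obj_points.append(objp)
    img_points.append(corners)
    image_size = gray.shape[::-1]

# Estimates the camera matrix (focal length, image center) and lens distortion.
rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(obj_points, img_points, image_size, None, None)
print("RMS reprojection error:", rms)
```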
A vision-based measurement device and a posture measuring device have been used in a system that captures robot position data to model manipulator stiffness [26] and estimate kinematic parameters [27–29]. The optimization technique is based on the end-effector's measured positions. However, these methods require offline calibration, which is a limitation. In such systems, accurate camera model calibration and robot model calibration are required for accurate positioning. The camera calibration procedure required to achieve high accuracy is, therefore, time-consuming and expensive [30].

Positioning of the manufactured object is an important factor in industrial robot arm applications. If the object is not correctly positioned, it may cause assembly failure or destroy the object. Consequently, the accuracy of object positioning often indirectly influences the processing accuracy of the automated system. Although numerous studies have been conducted to define object positioning accurately based on vision systems [31–33], no system has been found with an offline programming platform to perform AOI inspection on a production line.

Table 1 shows a comparison between the system we propose here and existing vision-based position correction systems in terms of performance. The proposed system expedites the development of a vision-based object position correction module for a robot-guided inspection system. This allows the robot-guided inspection system to complete AOI tasks automatically in a production line regardless of the object's position. In the proposed system, the AOI targets can be automatically mapped onto a new object for the inspection task, whereas existing vision systems have been developed to locate the object's position but not for the production line. To operate these systems, the operator needs to be skillful, and integration with other systems is a tedious process. Furthermore, user-defined robot target positions cannot be updated for inspection if there is a change in the object's position, which makes it more challenging to perform tasks on the production line. The proposed position correction system is capable of self-calibration and can update the object position and AOI targets automatically in the production line.

Table 1. Comparison between existing vision-based position correction systems and the proposed system.

Features                                            Vision-Based Systems [31–33]    Proposed System
Self-calibration                                    Yes                             Yes
Errors and failures                                 High                            Low
Online position correction in a production line     No                              Yes
Setup complexity with OLP                           High                            Low
Auto correct robot targets                          No                              Yes

Here, we propose a novel approach to automate manufacturing systems for various applications in order to solve the object position error encountered on the production line. We developed an automated position correction module to locate an object's position and adjust the robot pose and position in relation to the detected error values in displacement or rotation. The proposed position correction module is based on an automatic hand–eye calibration and the PnP algorithm. The automatic hand–eye calibration was performed using a calibration board to reduce manual error, whereas the PnP algorithm calculates the object position error using artificial marker images. The position correction module identifies the object's current position and then measures and adjusts the robot work points for a defined task. This developed module was integrated with the autonomous robot-guided inspection system to build a smart system to perform AOI tasks on the production line. The robot-guided inspection system based on the offline programming (OLP) platform was developed by integrating a 2D/3D vision module [34]. In addition, the position correction module maps the defined AOI target positions to a new object unless they are changed. The effectiveness and robustness of the proposed system were demonstrated by conducting two tests and comparing captured images with sets of standard images. This innovative system minimizes human effort and time consumption to expedite the AOI setup process in the production line, thereby increasing productivity.

The remainder of this paper is organized as follows: in Section 2, we give an overview of the position correction system integration with the robot-guided inspection architecture; in Section 3, we provide an overview of the position correction system and introduce the proposed method; in Section 4, we detail the integration of the position correction module with the OLP platform and report the system performance; in Section 5, we report the conclusions of the proposed system.

2. System Overview

The robot-guided inspection system was designed and developed with the vision module by Amit et al. [34]. The robot-guided inspection system, shown in the blue dashed boxes in Figure 1, consists of an OLP platform and a vision module. The OLP platform was designed and developed using OCC open-source libraries to generate a robot trajectory for 3D scanning and to define AOI target positions using CAD information [35]. The robot-guided inspection efficiently performs AOI planning tasks using only scanned data and does not require a physical object or an industrial manipulator. This developed system can also be used in different production lines based on robot-guided inspection. However, the developed system is not comprehensive enough to be used in an assembly line to perform reliable AOI tasks. Therefore, the robot-guided inspection system was integrated with the position correction module (red dashed boxes in Figure 1) to resolve issues related to object displacement or rotation errors in a production line for AOI tasks. Figure 1 presents a complete overview of the proposed system architecture, which includes the OLP platform, vision module, and position correction module. In this study, the primary objective of the position correction system was to calculate the rotation and translation of the new object on the production line using artificial markers on the object. Moreover, the proposed system was developed to minimize the complexity of hand–eye calibration and position correction within the production line. In addition, we aimed for the user of the integrated autonomous robot-guided system to not have to redefine the AOI target positions unless they are changed. This not only saves time and effort, but also increases productivity. The proposed position correction system consisted of a simple hand–eye calibration method for the development of the position correction method.

Figure 1. Overview of the position correction module integration with the robot-guided inspection system architecture.

The flowchart shown in Figure 2 explains the automatic hand–eye calibration method, which is part of the position correction system. The calibration method was divided into three main stages: "environment setup and initial settings", "robot scan and trajectory planning", and "image capture based on the scan trajectory". In the environment setup and initialization phase, the user must prepare the environment for calibration by measuring the position of the calibration board in the workspace, helping the arm see the calibration board, and providing other simple basic settings. The robot scan and trajectory planning stage recorded the optimal end-effector positions, while the calibration board image was captured at each position during the image capture based on the scan trajectory stage. The environment setup and initial settings stage is the only part of the system that requires manual operation (Figure 2). The proposed calibration method was implemented to initiate the position correction module to measure and compensate for the position error before defining the AOI target positions on new objects.


Figure 2. Flowchart of the hand–eye calibration methodology.

The flowchart of the proposed object position correction methodology is presented in Figure 3 and is divided into the offline registration process and the real-time online object positioning. In the offline process, the user can specify the robot's artificial marker detection position, design the marker pattern, and define the work points. In the online process, the system takes a picture of the artificial marker at the specified position based on the user's offline settings. Subsequently, the system identifies the current object position and autonomously measures and adjusts the robot work points for the AOI inspection task. The position correction system is then integrated with the autonomous robot-guided optical inspection system to demonstrate the performance of the proposed system.
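As a rough illustration of how the offline and online phases fit together, the sketch below separates the template data recorded offline from the online correction loop; the type and function names here are hypothetical placeholders for the steps detailed in Section 3, not the authors' implementation.

```python
from dataclasses import dataclass, field
from typing import Callable, List

import numpy as np

@dataclass
class InspectionTemplate:
    """Offline registration data (illustrative structure)."""
    check_pose: np.ndarray                 # robot pose used to view the artificial marker
    T_standard: np.ndarray                 # standard marker-to-camera transform (4x4)
    marker_center: np.ndarray              # artificial marker center (3-vector)
    work_points: List[np.ndarray] = field(default_factory=list)   # AOI target poses (4x4)

def online_correction(template: InspectionTemplate,
                      capture_image: Callable,
                      estimate_marker_pose: Callable,
                      correct_work_point: Callable) -> List[np.ndarray]:
    """Online phase: re-locate the marker and remap every AOI work point."""
    image = capture_image(template.check_pose)     # move the robot and shoot the marker
    T_new = estimate_marker_pose(image)            # PnP on the marker image
    return [correct_work_point(P, template.T_standard, T_new, template.marker_center)
            for P in template.work_points]
```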

Figure 3. Flowchart of the object position correction methodology.

3. Overview of the Position Correction System

In this study, an image-based object position correction system was designed and developed. Object positioning has always been a key component of the automated manufacturing process. The proposed position correction system was developed based on a calibration and a position correction methodology. The calibrated camera, with a specific artificial marker on the object for PnP image processing, identifies and defines the object position [36,37], as shown in Figure 4. A transformation matrix T is then obtained from the marker coordinate (P_Marker) to the camera coordinate (P_Camera). The displacement error of the work point is determined by the difference between the transformation matrices obtained before and after the object shifts. The system must then compensate for the position error before defining the AOI target positions on the new object. The system simulation results were evaluated before being integrated with the robot-guided inspection system. This system assists the OLP platform with performing robot-guided AOI applications to automatically inspect misplaced components of manufactured objects on a production line.
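The marker-to-camera transform T described above is exactly what a PnP solver returns. The following is a minimal sketch with OpenCV, in which detect_marker_points is a hypothetical placeholder for whatever routine extracts the marker's feature points, and K and dist are the intrinsic parameters from camera calibration.

```python
import cv2
import numpy as np

def marker_pose_from_image(image, marker_points_3d, K, dist):
    """Estimate the 4x4 marker-to-camera transform T (P_Camera = T @ P_Marker)."""
    image_points = detect_marker_points(image)        # hypothetical marker detector
    ok, rvec, tvec = cv2.solvePnP(marker_points_3d.astype(np.float32),
                                  image_points.astype(np.float32), K, dist)
    if not ok:
        raise RuntimeError("artificial marker not found")
    T = np.eye(4)
    T[:3, :3], _ = cv2.Rodrigues(rvec)                 # rotation part
    T[:3, 3] = tvec.ravel()                            # translation part
    return T
```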

Figure 4. Artificial marker attached to the object.

3.1. Automatic Robot Hand–Eye Calibration Methodology

We evaluated the performance of the proposed automatic hand–eye calibration process shown in Figure 2. The first step of this process was to measure the position of the calibration board in the workspace. The calibration board (an 8 × 6 chessboard) was fixed on the robot arm and moved closer to the second calibration board on the table [38,39], as shown in Figure 5. We then adjusted the robot arm and camera position so that both calibration boards were visible simultaneously. Two transformation matrices, A and B, were used to calculate the relationship between the calibration boards. Transformation matrix A was obtained using the robot arm controller to record the end-effector coordinates. Transformation matrix B was calculated by solving the camera image PnP equations. The transformation matrix between the robot arm and calibration board 2 was calculated after determining the relationship between A and B, as shown in Figure 5.

Once the transformation matrix was obtained, the initial position of the robot was adjusted to perform automatic hand–eye calibration, as shown in Figure 6. The proposed system enabled the robot to follow the hand–eye calibration trajectory and take pictures of the calibration board at different robot positions, as shown in Figure 7. Table 2 presents each position of the robot end-effector relative to the robot arm's initial position, and the hand–eye calibration results are shown in Table 3. Using these results, the conversion matrix between the robot arm end-effector and the camera connector was calculated and compared with the conversion matrix derived from the original connector design. The proposed calibration method had an error on the z-axis of around 2 mm and a greater error of nearly 9 mm on the x-axis. The results obtained from the hand–eye correction were sufficiently similar to the designed connector results. Further analysis was performed to identify potential causes of calibration error in order to make further improvements. During this process, errors may have occurred at the 3D-printed connector, and the connector center at which to place the camera was difficult to estimate. In addition, the hand–eye calibration itself is a complex process, and the proposed approach uses the simple AX = ZB closed conversion relationship for inference, so there is a possibility of error. The calibration results obtained were used in the proposed position correction methodology to compute and compensate for the position error on a new object in a production line.

FigureFigureFigure 5.5. The 5.The The relationship relationship relationship between between between robot robot robot and two andand calibration two calibration boards. boards.boards.

Figure 6. Initial steps of the calibration process. (a) Initial robot arm position, (b) image captured w.r.t. the initial position, (c) robot arm position after being manually adjusted, (d) image captured after manual adjustment.


Figure 7. Six different robot positions for automatic hand–eye calibration. (a) 1st robot position, (b) image w.r.t. 1st position, (c) 2nd robot position, (d) image w.r.t. 2nd position, (e) 3rd robot position, (f) image w.r.t. 3rd position, (g) 4th robot position, (h) image w.r.t. 4th position, (i) 5th robot position, (j) image w.r.t. 5th position, (k) 6th robot position, (l) image w.r.t. 6th position.

Table 2. The six different positions of the robot end-effector relative to the initial position.

Position    X         Y         Z         RX (α)    RY (β)    RZ (γ)
A           0         0         0         0         0         0
B           0         +30 cm    +10 cm    +30°      0         0
C           0         −30 cm    −10 cm    −30°      0         0
D           −30 cm    0         0         0         +30°      0
E           +30 cm    0         0         0         −30°      0
F           0         0         0         0         0         +30°


Table 3. Hand–eye calibration results and error.

                 Connector Design Size    Hand–Eye Calibration Results    Position Error
dX (mm)          0                        −8.559                          8.559
dY (mm)          0                        −0.227                          0.227
dZ (mm)          190                      189.387                         1.613
Rx (degrees)     0                        −0.079                          0.079
Ry (degrees)     0                        −0.868                          0.868
Rz (degrees)     0                        −0.874                          0.874
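The hand–eye step summarized in Tables 2 and 3 can also be prototyped with OpenCV's built-in solvers. The sketch below uses cv2.calibrateHandEye, which solves the closely related AX = XB form (newer OpenCV releases also provide cv2.calibrateRobotWorldHandEye for the AX = ZB formulation mentioned above); the pose lists are assumed to come from the robot controller and from PnP on the board images, and this is not the authors' exact implementation.

```python
import cv2
import numpy as np

def estimate_camera_to_gripper(gripper_poses, board_poses):
    """Estimate the camera pose in the end-effector frame from paired poses.

    gripper_poses: 4x4 end-effector poses in the robot base frame (robot controller).
    board_poses:   4x4 calibration-board poses in the camera frame (PnP results).
    """
    R_g2b = [T[:3, :3] for T in gripper_poses]
    t_g2b = [T[:3, 3] for T in gripper_poses]
    R_t2c = [T[:3, :3] for T in board_poses]
    t_t2c = [T[:3, 3] for T in board_poses]

    R_c2g, t_c2g = cv2.calibrateHandEye(R_g2b, t_g2b, R_t2c, t_t2c,
                                        method=cv2.CALIB_HAND_EYE_TSAI)
    T_c2g = np.eye(4)
    T_c2g[:3, :3] = R_c2g                 # comparable to the rotations in Table 3
    T_c2g[:3, 3] = t_c2g.ravel()          # comparable to dX, dY, dZ in Table 3
    return T_c2g
```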

3.2. Object Position Correction Methodology

After obtaining the calibration results, the user must set the checkpoint position (P_Check) for the robot arm with the camera in the initial "template login phase" to capture the full artificial marker image. The robot arm moves to the P_Check position and uses the camera to detect the artificial marker and solve the PnP image problem. Therefore, the transformation matrix T between the artificial marker coordinate (P_Marker) and the camera coordinate (P_Camera) was obtained using Equation (1).

P_Camera = [T] P_Marker    (1)

where T is the standard position (T_S) and is used as a standard sample to verify and measure the change in object position. Once the standard position has been obtained, the system enters the "error compensation phase", which compensates for the object's position error during various manufacturing applications. In practice, there are several different work points for various processes of AOI tasks, machining, and assembly applications. The positions of these work points are recorded and are collectively known as P. If the object shifts during the process, the robot arm moves to the checkpoint position (P_Check) to detect and resolve the image PnP problem and obtain a new marker position (P_Marker). This yields a new transformation matrix T_N between the new marker coordinate P_Marker and the camera coordinate P_Camera. The offset transformation matrix T_D in Equation (2) describes the displacement and rotation of the artificial marker and is calculated using the standard transformation matrix T_S and the new transformation matrix T_N.

P_Camera = [T_N] P_Marker2 = [T_S] P_Marker1
P_Marker2 = [T_N]^{-1} [T_S] P_Marker1 = [T_D] P_Marker1    (2)
[T_D] = [T_N]^{-1} [T_S]

The rotation and translation errors for defining the new work points, P_New, were calculated using the offset transformation matrix, the artificial marker center point offset (S), and the robot arm's original coordinates. Following this, the work point P is translated to the origin of the robot arm coordinate system using the artificial marker center point, as shown in Equation (3):

P_{New} = \begin{bmatrix} I_{3\times3} & -S \\ 0 & 1 \end{bmatrix} P    (3)

Work point P is then rotated with reference to the artificial marker rotation:

P_{New} = \begin{bmatrix} R_{3\times3} & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} I_{3\times3} & -S \\ 0 & 1 \end{bmatrix} P    (4)

Work point P is then translated back to the original artificial marker coordinate, as shown in Equation (5):

P_{New} = \begin{bmatrix} I_{3\times3} & S \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_{3\times3} & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} I_{3\times3} & -S \\ 0 & 1 \end{bmatrix} P    (5)

Using Equation (6), a new work point (P_New) is defined for the AOI task after error compensation:

P_{New} = \begin{bmatrix} I_{3\times3} & t_{3\times1} \\ 0 & 1 \end{bmatrix} \begin{bmatrix} I_{3\times3} & S \\ 0 & 1 \end{bmatrix} \begin{bmatrix} R_{3\times3} & 0 \\ 0 & 1 \end{bmatrix} \begin{bmatrix} I_{3\times3} & -S \\ 0 & 1 \end{bmatrix} P    (6)
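A compact way to apply Equations (2)–(6) in code is sketched below. It assumes 4 × 4 homogeneous matrices and that all quantities are expressed in a single common frame, which in a real cell depends on the hand–eye result, so this is an illustration rather than the authors' implementation.

```python
import numpy as np

def translation(v):
    """4x4 homogeneous translation by the 3-vector v."""
    T = np.eye(4)
    T[:3, 3] = v
    return T

def correct_work_point(P, T_S, T_N, S):
    """Remap one work point pose P after the object shifts (Equations (2)-(6)).

    T_S, T_N: standard and new marker-to-camera transforms from Equation (1);
    S: artificial marker center point, expressed in the same frame as P.
    """
    T_D = np.linalg.inv(T_N) @ T_S            # offset transform, Equation (2)
    R4 = np.eye(4)
    R4[:3, :3] = T_D[:3, :3]                  # rotation part of the offset
    t = T_D[:3, 3]                            # translation part of the offset
    # Equations (3)-(6): move to the marker center, rotate, move back,
    # then apply the residual translation.
    return translation(t) @ translation(S) @ R4 @ translation(-S) @ P
```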

In summary, the translation and rotation errors are computed once the object positioning system recognizes and evaluates the artificial marker in the actual application. Based on the proposed approach, this position correction system successfully adjusts and generates new work point coordinates for AOI inspection tasks.

The schematic diagram shown in Figure 8 was generated using MATLAB software and was based on the proposed position correction system. In the simulation, the computer server board (object) was moved to an unknown position, after which the position correction system identified the AOI camera target point based on the artificial marker. The performance of the position correction system was high for the AOI target point and object position, regardless of the error caused by the hardware device. The system successfully found the position error and compensated for it to calculate the robot's new AOI target point, irrespective of the object position. Thus, the object positioning correction system effectively utilized the automatic hand–eye calibration method and simple artificial markers to detect the object's current position prior to any shifting. Section 4 discusses the integration and implementation of the proposed method in the AOI application, as well as the experimental results.

Figure 8. Simulation results of the position correction system: (a) original position; (b) position after a shift. The large black rectangle represents the entire object (computer server board); the blue rectangle is the artificial marker; the three intermediate-size black rectangles represent the AOI capture positions of the camera; the red and green lines represent the camera's plane coordinates used to capture the AOI images and the camera coordinate axis, respectively.

4. Integration of Position Correction Module with OLP Platform

The position correction system we developed was integrated with the autonomous robot-guided optical inspection system to build a smart system for the production line. Prior to performing position correction, a path for real-time scanning and target positions for AOI tasks was generated and visualized graphically in the OLP platform, as shown in Figure 9 [34]. Furthermore, the generated robot program was sent to the HIWIN-620 industrial robot to capture AOI images, which were compared to virtual images. However, the developed system was still unable to reliably perform AOI tasks in a production line. Therefore, the robot-guided inspection system was integrated with the position correction module to resolve the issues related to object displacement or rotation errors in a production line for AOI tasks.

Here, we report and discuss experiments performed using the proposed position correction module for robot-guided inspection. On a production line, the robot arm employs the object position correction module to perform an AOI operation autonomously. To execute an AOI inspection, the robot arm gathers photos of the target object from various angles. If the object shifts, the proposed system detects this and adjusts the robot's AOI image shooting position. Once positional changes are made, the system captures the defined AOI target images. The object used in this experiment was a large computer server with four artificial markers, as shown in Figure 10.


Figure 9. Robot-guided AOI inspection system.


Figure 10. Object positioning system’s target object with artificial markers. The red circle shows the artificial marker (square containing four crosses) used for positioning. The remaining three markers were used to analyze positional errors after camera shooting.


During the system execution process, we first manually guided the robot arm to shoot the positioning marker image, as shown in Figure 11. We obtained the sample image of the positioning marker as a reference for the position correction system, as shown in Figure 12, and recorded the positioning marker position and its transformation matrix relative to the camera.

Once the sample image of the positioning marker was captured, the robot's target positions were selected for the AOI inspection task before any displacement or rotation. In the experiment, four different robot shooting positions were selected at different heights and angles, as shown in Figure 13, and the images captured at these positions are shown in Figure 14. These were the target points used to perform the position correction procedure.

Figure 11. Camera shooting pose at the positioning artificial marker.

Once the sample image of the positioning marker was captured, the robot’s target positions were selected for the AOI inspection task before any displacement or rotation. In the experiment, four different robot shooting positions were selected at different heights and angles, as shown in Figure 13, and the images captured at these positions are

Sensors 2021, 21, x FOR PEER REVIEW 12 of 19

Figure 9. Robot-guided AOI inspection system.

Figure 10. Object positioning system’s target object with artificial markers. The red circle shows the Sensors 2021, 21, 4071 12 of 19 artificial marker (square containing four crosses) used for positioning. The remaining three markers were used to analyze positional errors after camera shooting.

Sensors 2021, 21, x FOR PEER REVIEW 13 of 19

shown in Figure 14. These were the target points used to perform the position correction Figureprocedure.Figure 11. 11.Camera Camera shooting shooting pose atpose the positioning at the positioning artificial marker. artificial marker.

Once the sample image of the positioning marker was captured, the robot’s target

Sensors 2021, 21, x FOR PEER REVIEWpositions were selected for the AOI inspection task before any displacement13 of 19 or rotation. In the experiment, four different robot shooting positions were selected at different heights and angles, as shown in Figure 13, and the images captured at these positions are shown in Figure 14. These were the target points used to perform the position correction procedure.

Figure 12. Sample image of the positioning marker w.r.t. the robot pose shown in Figure 11.

Figure 13. Four different robot poses to capture standard AOI target images before manual displacement or rotation. Robot pose at the (a) 1st, (b) 2nd, (c) 3rd, and (d) 4th AOI target positions.

Figure 14. Standard AOI images captured at the four target positions. (a) Standard image captured w.r.t. Figure 13a, (b) standard image captured w.r.t. Figure 13b, (c) standard image captured w.r.t. Figure 13c, (d) standard image captured w.r.t. Figure 13d.

FigureFollowing 14. Standard this preparatory AOI images procedure, captured the system at the already four target had all positions. of the parameters (a) Standard image captured and specification data required to perform object image repositioning. Figure 14 shows thew.r.t original Figure standard 13a, (b ) imageStandard without image disp capturedlacement. w.r.tTwo Figureexperiments 13b, (withc) Standard random image captured w.r.t manualFigure displacement13c, (d) Standard were conductedimage captured to evaluate w.r.t the Figure performance 13d. of the proposed position correction module, as shown in Figure 15. Once the manual displacement was carriedFollowing out, the proposed this positionpreparatory correction procedure, modules calculated the system the new already AOI position had to all of the parameters and specification data required to perform object image repositioning. Figure 14 shows the original standard image without displacement. Two experiments with random manual displacement were conducted to evaluate the performance of the proposed position correction module, as shown in Figure 15. Once the manual displacement was carried out, the proposed position correction modules calculated the new AOI position to


Figure 15. Two different object positions after random manual displacement and rotation to test the object correction system. (a) The first and (b) second random positions of the object.

Figure 16. Position correction system test results for two random positions. Standard image without displacement or rotation corresponding to (a) Figure 14a, and new images after (b) the first random position shown in Figure 15a and (c) the second random position shown in Figure 15b.

Figure 17. Position correction system test results for two random positions. Standard image without displacement or rotation corresponding to (a) Figure 14b, and new images after (b) the first random position shown in Figure 15a and (c) the second random position shown in Figure 15b.


Figure 18. Position correction system test results for two random positions. Standard image without displacement or rotation shown in (a) Figure 14c, and new images after (b) the first random position shown in Figure 15a and (c) the second random position shown in Figure 15b.

Figure 19. Position correction system test results for two random positions. Standard image without displacement or rotation shown in (a) Figure 14d, and new images after (b) the first random position shown in Figure 15a and (c) the second random position shown in Figure 15b.
the The issues robot-guided associated inspection with the systemobject translation with position or correctionrotation error module in the enhanced production the ability line. Theto perform robot-guided user-defined inspection AOI taskssystem autonomously with position on correction any production module line.enhanced the ability to perform user-defined AOI tasks autonomously on any production line.

Sensors 2021, 21, 4071 15 of 19

Two experiments were conducted to measure the performance of the position correction system. In test 1, the object was moved by 49.1 mm, with a final average residual error of 5.15 mm, and the proposed system compensated for 91.5% of the position error. In test 2, the object was moved 50.6 mm, with a final average residual error of 3.80 mm, and the proposed system compensated for 92.5% of the position error. New images were captured and compared with the standard images for the two random object positions. The captured images differed slightly from the standard images. These results demonstrate that the proposed system efficiently compensates for most of the error caused by object displacement. Table 4 presents the error analysis of the object position correction system, in which the new images have a distance error (norm) of 10–40 pixels compared to the standard image (Figure 20), with an average of 21.09 pixels. The errors measured in pixels were then converted to mm (Table 5). The distance error was between 2 and 6 mm (Figure 21), and the mean error was 3.97 mm. The proposed method's efficiency could be improved by accurately positioning the camera and replacing the 3D-printed camera holder, as well as by minimizing calibration error, which may improve system accuracy by up to 95.3%. The developed system resolved the issues associated with object translation or rotation error in the production line. The robot-guided inspection system with the position correction module enhanced the ability to perform user-defined AOI tasks autonomously on any production line.
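The paper does not state the exact pixel-to-millimetre conversion used for Table 5; one common pinhole-model approximation is err_mm ≈ err_px · Z / f, sketched below with an illustrative focal length and stand-off distance that are assumptions, not values from this work.

```python
def pixel_error_to_mm(err_px, focal_length_px=2500.0, standoff_mm=470.0):
    """Back-project an image-plane error onto the object plane (pinhole model)."""
    return err_px * standoff_mm / focal_length_px

# Example: the 21.09-pixel mean error maps to roughly 4 mm at these assumed settings.
print(round(pixel_error_to_mm(21.09), 2))
```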

Table 4. Error analysis of the object position correction system (measured in pixels).

Error            First Displacement              Second Displacement
(in Pixel)       X        Y        Norm           X         Y         Norm
AOI position 1
  P1             1.11     24.39    24.41         −7.29     −18.92    20.28
  P2             0.26     23.89    23.89         −4.26     −17.44    17.95
  P3             0.32     23.69    23.69         −4.52     −13.72    14.45
  P4             1.46     25.02    25.06         −7.50     −13.87    15.77
AOI position 2
  P1             0.44     19.16    19.16         −14.16    −9.02     17.04
  P2            −3.52     18.91    19.23         −10.49    −9.60     14.22
  P3            −3.50     14.05    14.48         −10.59    −5.62     11.99
  P4             0.87     14.31    14.34         −14.40    −4.59     15.11
AOI position 3
  P1            −25.55    15.93    29.83          0.64     −26.58    26.59
  P2            −31.64    15.09    35.06          0.01     −27.12    27.12
  P3            −32.66    8.48     33.74          0.01     −26.13    26.13
  P4            −26.29    8.76     27.71          0.56     −25.08    25.08
AOI position 4
  P1             16.36    8.34     18.36         −25.67    4.65      26.08
  P2             12.36    10.61    16.28         −21.25    1.20      21.28
  P3             10.16    5.75     11.67         −18.65    5.03      19.31
  P4             14.52    3.36     14.90         −22.78    8.97      24.49

Figure 20. The distance error (in pixels) between test 1 and test 2.

Table 5. Error analysis of object position correction system (in mm).

Error            First Displacement            Second Displacement
(in mm)          X        Y        Norm         X        Y        Norm
AOI position 1
  P1             0.22     4.79     4.80         −1.43    −3.72    3.98
  P2             0.05     4.69     4.69         −0.83    −3.43    3.53
  P3             0.06     4.65     4.65         −0.89    −2.70    2.84
  P4             0.29     4.92     4.92         −1.47    −2.73    3.10
AOI position 2
  P1             0.09     3.79     3.79         −2.86    −1.79    3.37
  P2            −0.70     3.74     3.80         −2.08    −1.90    2.81
  P3            −0.69     2.78     2.86         −2.10    −1.11    2.37
  P4             0.17     2.83     2.84         −2.85    −0.91    2.99
AOI position 3
  P1            −4.57     2.75     5.33          0.11    −4.76    4.76
  P2            −5.65     2.70     6.26          0.00    −4.85    4.85
  P3            −5.84     1.52     6.03          0.00    −4.68    4.68
  P4            −4.70     1.57     4.95          0.10    −4.49    4.49
AOI position 4
  P1             3.05     1.55     3.43         −4.79    0.87     4.87
  P2             2.31     1.98     3.04         −3.96    0.22     3.97
  P3             1.90     1.07     2.18         −3.48    0.94     3.60
  P4             2.71     0.63     2.78         −4.25    1.67     4.57



Figure 21. The distance error (in mm) between test 1 and test 2.

5. Conclusions

In this study, a novel position correction module was developed and integrated with an autonomous robot-guided optical inspection system to perform AOI tasks on production lines and correct for object displacement and rotation. The robot-guided system assisted the user to select the AOI targets and capture target images in the virtual environment. Real-time images were captured using the industrial manipulator for the corresponding positions. However, this system was still not reliable enough to be used on an assembly line to perform AOI tasks. Therefore, the robot-guided inspection system was integrated with a position correction module to resolve object displacement or rotation error issues in a production line for AOI tasks. The position correction system calculates and compensates for the position error of a new object on the production line using artificial markers. We performed two tests to evaluate the effectiveness of the proposed position correction module. The proposed system had a mean error of 3.97 mm, or 21.09 pixels. These results indicated that the robot-guided system with a position correction module was capable of performing AOI tasks on a production line. In addition, the user of the integrated autonomous robot-guided system is not required to redefine the AOI targets for a new object position unless the targets themselves are changed. This not only saves time and effort, but also increases productivity. The integration of the position correction module with the robot-guided system led to the development of a smart system, which enables inspection tasks on the production line without prior knowledge of the object's position. However, the proposed method is currently limited to the use of artificial markers and the simple AX = ZB closed-form solution for hand–eye calibration.
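As a rough sketch of how such a marker-based correction step can be prototyped (this is not the authors' implementation; the function names, the camera intrinsics K and dist, and the marker corner layout are assumptions), the object displacement between the standard pose and a new pose can be estimated from a single artificial marker with OpenCV's PnP solver and then applied to the taught AOI poses:

```python
# Minimal sketch (not the authors' code): estimate the object's displacement between
# the reference ("standard") pose and a new pose on the line from one artificial
# marker, using OpenCV's PnP solver. K (3x3 intrinsics), dist (distortion), and the
# marker corner layout are assumed to come from a prior camera calibration.
import numpy as np
import cv2

def object_pose(image_pts, marker_pts_3d, K, dist):
    """Camera-from-object pose (4x4) from 2D-3D marker corner correspondences."""
    ok, rvec, tvec = cv2.solvePnP(marker_pts_3d, image_pts, K, dist)
    if not ok:
        raise RuntimeError("PnP failed")
    R, _ = cv2.Rodrigues(rvec)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, tvec.ravel()
    return T

# marker_pts_3d: 4x3 marker corner coordinates in the object frame (mm)
# ref_pts / new_pts: 4x2 corner pixels detected in the standard and in the new image
# T_ref = object_pose(ref_pts, marker_pts_3d, K, dist)
# T_new = object_pose(new_pts, marker_pts_3d, K, dist)
# T_err = T_new @ np.linalg.inv(T_ref)  # object displacement expressed in the camera frame
# Each taught AOI pose, mapped into the camera frame via the hand-eye result of the
# AX = ZB calibration, can then be pre-multiplied by T_err to follow the displaced object.
```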
Author Contributions: Conceptualization, C.-Y.L.; funding acquisition, C.-Y.L.; investigation, A.K.B., A.M.M., and S.-C.L.; methodology, A.K.B. and C.-Y.L.; software, A.K.B., A.M.M., and S.-C.L.; supervision, C.-Y.L.; validation, A.K.B. and S.-C.L.; visualization, A.K.B.; writing—original draft, A.K.B. and Y.-S.C.; writing—review and editing, A.K.B. and C.-Y.L. All authors have read and agreed to the published version of the manuscript.

Funding: This work was financially supported by both the Taiwan Building Technology Center and the Center for Cyber-Physical System Innovation from the Featured Areas Research Center Program within the framework of the Higher Education Sprout Project by the Ministry of Education (MOE) in Taiwan. Additionally, this work was financially supported by Inventec Co. Ltd., Taiwan (R.O.C.).

Conflicts of Interest: The authors declare no conflict of interest.
