
Article

Combined Close Range Photogrammetry and Terrestrial Laser Scanning for Ship Hull Modelling

Pawel Burdziakowski and Pawel Tysiac *

Faculty of Civil and Environmental Engineering, Gdansk University of Technology, 80-233 Gdańsk, Poland; [email protected]

* Correspondence: [email protected]

 Received: 8 April 2019; Accepted: 23 May 2019; Published: 26 May 2019 

Abstract: The paper addresses the combined use of close-range photogrammetry and terrestrial laser scanning for ship modelling. The authors examine the precision and accuracy of the measurements with a view to their combined application in ship hull inventories. Since documentation is vital for every ship structure, it is crucial to prepare it so as to support the vessel's processes. The presented methods are combined photogrammetric techniques directed at ship hull inventory, demonstrated on a submarine. Photogrammetric techniques based on high quality photos are expected to be relevant for inventory purposes, and an innovative approach combines these methods with terrestrial laser scanning. The stages of data acquisition, post-processing, and result analysis are presented and discussed with respect to market requirements. Advantages and disadvantages of the applied methods are presented.

Keywords: terrestrial laser scanning; photogrammetry; surveying engineering; geomatics engineering

1. Introduction

The present-day market requirements address application development in the cases of inventories and reverse engineering. Attention is paid to autonomous, fast, and accurate measurements covering a broad goal domain, e.g., human body studies [1,2], coastal monitoring [3–5], or civil engineering [6–8]. Measurements involving a single technique only may lead to unacceptable results in terms of complex services in variable conditions (e.g., various weather conditions). Moreover, limitation to a single method may yield lower accuracy or neglect part of the spatial information in the course of data acquisition. Consider the case of a building inventory with the use of a laser scanner: the building side walls will be reproduced, in contrast to the roof [9]. In order to solve this problem, it is reasonable to use another measurement method to collect spatial information on the object (e.g., with the additional use of aerial photos from an aircraft or UAV (Unmanned Aerial Vehicle)) [10,11]. Generally speaking, as stated in Reference [12], no single modelling technique meets all requirements of high geometric accuracy, portability, full automation, photo-realism, and low cost, as well as flexibility and efficiency. The article addresses as-built modelling of a ship hull and reverse engineering of a submarine by means of photogrammetry and terrestrial laser scanning. The hull manufacturing precision, i.e., the agreement between the 3D model and the real manufactured object, may be assessed by random measurement of a ship hull element. It is worth mentioning that, regarding the final results, a single technique does not deliver a full range of data to cover the modelled object. In the presented case, each 3D measurement technique was affected by its typical problems, and each case was a distinct one. Hence, we proposed to aggregate the methods of close range photogrammetry and terrestrial laser scanning to obtain spatial information about the submarine with high precision and accuracy, which reduces the disadvantages of both approaches.


The ICP (Iterative Closest Points) algorithm was employed to connect scan positions originating from an active sensor in the form of a laser scanner. It is one of the most popular algorithms used to connect two sets of data; it iteratively estimates the transformation by matching points and minimizing the error. The algorithm has been widely presented in the literature. The initial variant of the algorithm was presented in Reference [13], and the related methods have been reviewed and compared in Reference [14]. Based on this reference, the ICP algorithm may be insufficient in cases of a large distance, rotation, or low coverage between datasets. The simplest solution is to initially orient the sets by indicating corresponding points in both of them and then to apply the ICP algorithm. In some cases, despite a correct initial reference, the algorithm is unable to find adequate coverage; in such situations, the solutions described in [15,16] are suitable.

In order to recover a complete and accurate 3D model from the images, state-of-the-art commercial software (Bentley ContextCapture) was employed. The photogrammetry process of the presented case includes geodetic control point measurements, image data acquisition, and data processing. In the course of data processing, the camera was first calibrated by means of the intrinsic camera parameters from Bentley's database. Next, automatic tie points were detected, additional manual user-defined tie points were added, and the bundle adjustment was carried out. Those operations led to an additional camera autocalibration and the computation of the camera exterior orientation parameters. Lastly, the image-based 3D object point coordinates were computed (dense point cloud), and, subsequently, mesh and texture generation was carried out.

The marine industry successfully applies photogrammetric and laser scanning techniques for as-built modelling and quality checks. An interesting combination of laser scanning and close range photogrammetry is proposed in Reference [17]. That research is based on laboratory models and does not incorporate real hulls; the photogrammetric technique is applied for control point measurements only, not for 3D point cloud hull modelling. The photogrammetric technique is also applied in the research [18]. In this case, the authors implemented a single measurement technique, and the laboratory model case showed the same results. This approach relies on signalized targets to measure. The results showed noticeable differences between the predicted and the real data. The main differences indicated a need to use more retro-reflective targets with more illumination in the survey location instead of using a camera flash. The study is a step forward in ship and boat surveying using the close-range photogrammetry method. In this research [18], combined laser scanning and multi-image photogrammetry measurements have been employed to document a wooden ancient shipwreck. During the survey, the 3D data were integrated and compared after the excavation of the shipwreck. Additionally, documentation including a virtual three-dimensional model was created from the point clouds, and a texture was applied. In this approach, two distinct point clouds were integrated to provide the model of the ancient shipwreck. An innovative work is presented in Reference [19], where the authors combine above-water and underwater photogrammetry for ship hull modelling.
The paper provides an innovative procedure for the alignment of two photogrammetric models, deriving from two separate and independent photogrammetric surveys above and below the water level. It also shows a possibility to model and control ship hulls without docking. Similar studies involving an underwater photogrammetry experiment were shown on the example of the Costa Concordia wreck, a catastrophe that shocked the world and involved fatalities [20]. The research [21] suggested photogrammetry as a suitable method for coordinate measurement of decks in recreational ships. The research involves the photogrammetry modelling process, noting that, in the case of wide corridors with few obstacles, the direct method is preferable to the photogrammetric method. The maritime and shipyard industries clearly demand new 3D modelling techniques for the construction process control of ships, control during operation, and model deformation monitoring in the rebuilding process. The research [21] shows that, in selected cases, the photogrammetry technique is sufficient for ship modelling, although it sometimes needs additional, supportive measurements.

The researchers in [22] use photogrammetric techniques to support reengineering of ship hulls and ship parts. This research discusses market demand for measurement and modelling of a variety of objects encountered in shipbuilding and repair businesses, from partial or complete ship hulls to planar (2D) structural parts. The research focused on an evaluation of laser scanning and photogrammetry. To summarize, laser scanning is a powerful tool due to its high resolution, while photogrammetry is better suited to recognize specific parts and features of the object; photogrammetry was chosen as the preferred measurement method. The research clearly shows that both techniques are complementary 3D measurement techniques for ship hull modelling. Thus, in order to provide comprehensive results, they should act simultaneously. The measurement methods [23] used in ship hull inventory are covered in Reference [24], but there are still problems with integrating the data from various sensors in terms of proper use and interpretation. Such procedures are aimed at capturing the entire spatial information of the measured object.

2. Materials and Methods

The submarine hull was modelled with the help of close-range photogrammetry and terrestrial laser scanning techniques. The research object is the decommissioned Polish Navy Ship Jastrzab (PNS Jastrzab), ex-HNoMS "Kobben" (S-318), presented in Figure 1. The submarine represents the Kobben class (also known as Type 207), which is a customized version of the German Type 205 submarine. Fifteen vessels of this class were built for the Royal Norwegian Navy in the 1960s. Later on, the class was withdrawn from service in the Norwegian and Danish Navies. Some of the vessels were transferred to Poland, and the Polish Navy still operates Kobben-class submarines. The PNS Jastrzab was commissioned on 15 August 1964, transferred to Poland for spare parts in 2002, decommissioned on 17 December 2011, and then moved to the Polish Naval Academy in Gdynia as a base for the crew training laboratory.

Figure 1. PNS Jastrzab—decommissioned submarine—object of research: (a) wet object during laser scanning with control points, (b) general view of the submarine (source: Pawel Kusiak, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=42676502).

The presented ship hull is a challenging object for remote sensing modelling techniques. Thus, it was intentionally chosen to stress the applied modelling methods and to highlight their pros and cons. The hull is painted with black paint, which highly absorbs the laser beam. In addition, a rainy measurement day was chosen in order to stress the methods further and to simulate a real industrial measurement environment, where freshly docked hulls are wet. The absence of texture and tie points makes this object hard to model by means of photogrammetric methods. During the object measurement, the hull was wet, which placed an additional demand on the sensors. The highly demanding object size, shape, and surface, complemented by the challenges faced during the research, made a clear display of the presented techniques in highly unfavorable conditions for sensors and algorithms. The measurement conditions were prescribed in order to simulate and consequently assess the efficiency and accuracy of the applied sensors. The accuracy of sensors in unfavorable measurement conditions was assessed separately in the case of each applied technique. In order to eliminate the weakness of each separate technique with regard to the final object, multiple spatial data combining techniques were applied in the post-processing phase of the research. The diagram (Figure 2) shows the sequence of activities leading to the 3D model of the hull. First of all, the measurements were made using a laser scanner (in order to detect the edges and shape of the hull), and the cloud of points was obtained from the photographs. After combining, the photograph-based data were supposed to fill the spaces between the edges registered by the laser scanner impulse. The object modelling means fitting the shape of the hull to the obtained point cloud. In connection with market expectations and the previously conducted tests described in the introduction, an accuracy analysis of the model was made, assuming a boundary value equal to 1 cm.

Figure 2. Block diagram of the proposed framework.

2.1. Photogrammetry

The photogrammetry technique encompasses methods of image measurement and interpretation in order to derive the shape and location of an object based on a single photograph up to several photographs. The photogrammetric methods can be applied whenever the object can be photographically recorded. The purpose of the photogrammetric measurement is the three-dimensional reconstruction in digital or graphical form [23]. The measurements (photos of the modelled object) and a mathematical transformation between the image and the object space are the means to model the object. In the presented case, structure from motion (SfM) was applied for hull modelling (Figure 3a). Structure from motion is the process of estimating the 3-D structure of a scene from a set of 2-D images. In the considered case, the image series was taken by means of a single calibrated Digital Single Lens Reflex (DSLR) camera. The initial camera calibration parameters compared with the optimized ones are shown in Table 1. The initial calibration parameters were taken from Bentley's database. Further on, these values were optimized in the autocalibration process.

Table 1. Camera calibration results (camera Canon EOS 550D, lens EF-S18-55 mm f/3.5-5.6 IS, 18 mm).

Parameter                            Initial Values   Optimized Values   Difference
Focal Length (mm)                    19.23            19.22              −0.01
Focal Length Equivalent 35 mm (mm)   31.04            31.03              −0.01
Principal Point X (pixels)           2618.21          2618.93            0.71
Principal Point Y (pixels)           1715.30          1714.47            −0.83
K1                                   −0.1699          −0.1697            0.0002
K2                                   0.1792           0.1776             −0.0017
K3                                   −0.0335          −0.0315            0.002
P1                                   0                0                  0
P2                                   0                0                  0

The 3-D structure and camera exterior orientation parameters were recovered only up to scale, i.e., the structure and the magnitude of the camera motion were observed in units that do not correspond to world dimensions. In order to compute the actual scale of the structure of the modelled objects and the camera motion in world units, additional information is required, e.g., the object size in the scene or the distance between some known points (constraints). The object's true size can be computed when the camera Exterior Orientation Parameters (EOP) are acquired by other means. This process is referred to as direct georeferencing. In traditional photogrammetry, the EOP are derived from Aerial Triangulation (AT), where Control Points (CP) are required. Generally speaking, when the camera is able to record the EOP directly from a navigation and orientation system, Ground Control Points (GCP) are not required to compute the object scale. In the presented case, in order to achieve real-world units and dimensions, photogrammetric control points were introduced to the modelled object and measured by traditional high accuracy geodetic techniques incorporating a tachymeter (Figure 3b). All control points are applied in order to align the model scale and to determine its real dimensions.
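The scale-recovery step can be made concrete with a short numerical sketch. Assuming a set of control points with coordinates both in the up-to-scale SfM model and from the geodetic survey, the snippet below estimates the similarity transformation (scale, rotation, translation) in closed form with the Umeyama/SVD method. It is only an illustration with hypothetical coordinates and our own helper name, not the routine used by the commercial software in this study.

```python
import numpy as np

def similarity_transform(src, dst):
    """Closed-form (Umeyama) estimate of scale s, rotation R and translation t
    such that dst ~ s * R @ src + t; src and dst are (N, 3) arrays."""
    mu_src, mu_dst = src.mean(axis=0), dst.mean(axis=0)
    src_c, dst_c = src - mu_src, dst - mu_dst
    H = dst_c.T @ src_c / len(src)            # cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    D = np.eye(3)
    if np.linalg.det(U @ Vt) < 0:             # guard against reflections
        D[2, 2] = -1.0
    R = U @ D @ Vt
    s = np.trace(np.diag(S) @ D) / src_c.var(axis=0).sum()
    t = mu_dst - s * R @ mu_src
    return s, R, t

# Hypothetical control points: up-to-scale model units vs. geodetic survey.
model_pts = np.array([[0.0, 0.0, 0.0], [1.2, 0.1, 0.0],
                      [1.1, 2.3, 0.4], [0.2, 2.0, 1.1]])
survey_pts = 4.95 * model_pts + np.array([120.0, 85.0, 2.5])
s, R, t = similarity_transform(model_pts, survey_pts)
scaled = s * model_pts @ R.T + t              # model in real-world units
rms = np.sqrt(((scaled - survey_pts) ** 2).sum(axis=1)).mean()
print(f"scale: {s:.3f}, mean residual: {rms:.4f} m")
```

At least three well-distributed, non-collinear control points are needed for such a fit; in practice more are used, so that the residuals can also serve as an accuracy check of the georeferencing.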


Figure 3. The object modelled with a photogrammetry technique: (a) tie points and camera positions for SfM modelling and (b) photogrammetric control point placed on the object.

The object has been modelled by means of 159 photos acquired by a Canon EOS 550D (Canon Inc., Tokyo, Japan), with image dimensions of 5184 × 3456 pixels recorded in the JPEG format. In the data acquisition process, more than 600 photos were taken, according to the presented object size. The dark color of the object and its highly light-absorbing paint against a relatively vivid background made the acquisition highly demanding. The photos rejected from the final calculation were either too dark or had overexposed top parts of the hull. In that case, the surplus of acquired images allowed us to select the ones with correct exposure. On the incorrectly exposed photos, the algorithm is not able to detect a sufficient number of automatic tie points. The overall number of tie points detected on the object is 15,314 (Figure 3a), with a median number of 330 points per photo and a median reprojection error of 0.44 pixels. The median number of photos per point was equal to 4. Figure 4a shows the scene with marked camera positions and position uncertainties, and Figure 4b presents the set of photos potentially covering each area. The object in green denotes the amount of information on the object, taken from approximately 43 photos, for the respective parts of the hull. The minimum reprojection error of an automatic tie point is 0.01 pixels and the maximum is 1.85 pixels (Figure 4c). The automatic tie points have not been found in selected parts of the object (top and bottom parts of the hull) as a result of insufficient texture. Consequently, with the automatic tie points alone, the SfM algorithm was unable to recover the ego motion (exterior orientation parameters: camera position and orientation), so the object could not be reconstructed correctly. The problem of wrongly matched images is bound to appear at the bow and stern parts of the object.

In this case, 33 manual (user-defined) tie points have been stated on the entire object, most of them situated within the problematic areas. The manual tie points are defined by the user. Their position on the images is pointed manually, theoretically on a minimum of three consecutive images. In the presented case, all 33 user-defined tie points have been pointed on all images of sufficient visibility. It was a necessary step to start the reconstruction stage.
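For orientation, the reprojection-error figures quoted above can be reproduced in principle with a basic pinhole model. The sketch below projects one 3-D tie point into the image and measures the pixel residual against a (simulated) observed position; distortion terms are omitted, and all values except the principal point from Table 1 are hypothetical.

```python
import numpy as np

f_mm, sensor_w_mm, img_w_px = 19.22, 22.3, 5184   # 22.3 mm: approx. APS-C width
f_px = f_mm / sensor_w_mm * img_w_px              # focal length in pixels
cx, cy = 2618.93, 1714.47                         # principal point (Table 1)

def project(X, R, t):
    """Project a 3-D world point X into pixel coordinates (pinhole model)."""
    Xc = R @ X + t                                # world -> camera frame
    return np.array([f_px * Xc[0] / Xc[2] + cx,
                     f_px * Xc[1] / Xc[2] + cy])

R, t = np.eye(3), np.array([0.0, 0.0, 8.0])       # camera 8 m from the point
X = np.array([0.55, -0.20, 0.0])                  # hypothetical tie point
observed = project(X, R, t) + np.array([0.3, -0.2])   # simulated measurement
residual = np.linalg.norm(project(X, R, t) - observed)
print(f"reprojection error: {residual:.2f} px")   # cf. the 0.44 px median
```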



Figure 4. (a) Camera position uncertainties: top view (XY plane) of computed photo positions (black dots); blue ellipses indicate the position uncertainty, scaled for readability; (b) number of photos seeing the scene (XY plane); (c) reprojection errors per tie point: top view (XY plane) of all tie points, with colors representing the reprojection error in pixels.

2.2. Terrestrial Laser Scanning

The second measurement method applied for ship hull as-built modelling and reverse engineering is Terrestrial Laser Scanning (TLS). The laser scanning technology is based on measurements with a laser. This technique is also called LiDAR (Light Detection and Ranging). Involving automatic data acquisition, TLS is able to register huge amounts of data to be widely explored in many applications. The possible application range covers a method to assess suspensions of harmful dust [25], deformation monitoring [26], or concrete diagnosis [27]. However, due to the development of techniques and the striving to achieve maximum effects in the shortest possible time, significant progress has been made in the use of a vast domain of mobile systems, where the measuring device can be located on a moving unit (for example, an aircraft, car, or boat).


Calibration of the scanner on a mobile unit has been widely described in Reference [28]. This solution is applied for popular navigation and measurements in buildings [29] or in tunnels [30].

In order to provide data acquisition, the Riegl VZ-400 Terrestrial Laser Scanner (RIEGL Laser Measurement Systems GmbH, Horn, Austria) was used in courtesy of the Apeks Company, which rented the equipment for research. The instrument specification denotes that the VZ-400 is able to register 122,000 points per second at a distance not exceeding 350 meters in High Speed mode. The accuracy of the scanner equals 5 mm; its precision is 3 mm. Thus, the accuracy of the ship-hull model, to be post-processed in the next steps, should be equivalent to the 1 cm accuracy value.

In order to provide spatial data acquisition, 20 scan positions were conducted in different scan patterns. The scan positions are presented in Figure 5. Bad weather, including rain and insolation, appeared to act negatively upon the results. Although Terrestrial Laser Scanning has many applications, it can be hard to assess the geometry of the submarine based only on this method due to bad weather conditions. In the article, we present the possibility of combining point clouds from TLS and terrestrial photographs.

It is worth mentioning that the effect of the black coating of the submarine upon the results of laser scanning is due to light absorption by the black material. Thus, the results were not straightforward.

Figure 5. Scan positions during data acquisition.

The data post-processing was performed in two steps. The first step concerned manual alignment of scan positions by means of indicating four corresponding pairs of points. The second step applied the ICP (Iterative Closest Points) algorithm for accurate and precise scan alignment. In order to minimize the error, the parameters of the rotation and translation matrices were appropriately computed. The least squares fitting method was used to automatically match corresponding pairs of points.
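For illustration, the sketch below spells out the point-to-point ICP loop described above: nearest-neighbour matching followed by a least-squares (SVD-based) rigid transform, repeated until the RMS residual stops improving. It assumes the scans are already coarsely pre-aligned, as after the manual four-point step, and is a simplified stand-in for the software actually used.

```python
import numpy as np
from scipy.spatial import cKDTree

def icp(src, dst, iters=50, tol=1e-8):
    """Point-to-point ICP aligning src to dst (both (N, 3), coarsely pre-aligned).
    Returns rotation R, translation t and the final RMS residual."""
    R, t = np.eye(3), np.zeros(3)
    tree = cKDTree(dst)                      # fast nearest-neighbour queries
    prev_rms = np.inf
    for _ in range(iters):
        cur = src @ R.T + t
        d, idx = tree.query(cur)             # match each point to closest dst
        rms = np.sqrt((d ** 2).mean())
        if abs(prev_rms - rms) < tol:        # stop when residual stagnates
            break
        prev_rms = rms
        A, B = src, dst[idx]
        muA, muB = A.mean(axis=0), B.mean(axis=0)
        # Least-squares rigid transform (Kabsch/SVD) for the matched pairs.
        U, _, Vt = np.linalg.svd((A - muA).T @ (B - muB))
        D = np.eye(3)
        D[2, 2] = np.sign(np.linalg.det(Vt.T @ U.T))
        R = Vt.T @ D @ U.T
        t = muB - R @ muA
    return R, t, rms
```

Without the coarse pre-alignment, this simple form of ICP can converge to a wrong local minimum, which is exactly the failure mode of large-offset datasets noted in the Introduction.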

The results of scan alignment are shown in Figure 6 in the form of the histogram of residues, where the distances between the aligned pairs of points are displayed.

Figure 6. Histogram of residues.


According to Figure 6, the maximum distance between the aligned planes is equal to 6 mm. Most of the planes belong to the group between 1 mm and −1 mm, according to their spatial position. Based on the values of the residues, the standard deviation may be estimated, which approximates the accuracy of the scan alignment. The standard deviation equals 2 mm. Based on that information, the accuracy of the point cloud can be less than 1 cm, according to the possibilities of spatial data acquisition by the Riegl VZ-400; the accuracy of the measured points equals 5 mm.

The assessment of the acquired data was performed by visual evaluation. First, cross sections of the buildings and ground were created, where the noise was estimated. Figure 7 presents the results of the noise evaluation.

Figure 7. The noise level evaluation on the example of a wall of a building in meters.

Figure 7 points out that the deviation from various scan positions does not exceed 8 mm.

3. Results

3.1. Photogrammetry

The results of the final hull photogrammetry modelling are presented in Figure 8. It is shown that almost all photographed surfaces are modelled and represented in the point cloud. The very top of the object and some parts on the bottom have not been modelled at all. These parts have not been photographed due to the object's specific shape and size. Basically, the model was mapped almost in its entirety, showing no significant shortages. The possible outputs for further analysis are the point cloud (*.las file format), the triangulated mesh (*.obj) with material information (*.mtl) and texture map (*.jpg), and orthophotoplans (*.jpeg, *.tiff).


Figure 8. The object modelled with a photogrammetry technique (colored point cloud).

3.2. Terrestrial Laser Scanning

Based on the results presented in Section 2, the authors assumed that the accuracy of the point cloud may reach 1 cm. The data filtration and extraction results in the point cloud are presented in Figure 9.

Figure 9. The obtained point cloud of the ship hull.

Mapping of the bottom parts of the model is much better than in the case of the upper ones. The reasons are rain, insolation, and the black material covering the submarine. The results provide valuable spatial information for further analysis and modelling. The collected data may serve as a modelling dataset alone or possibly be combined with the other techniques presented in the article. In order to assess the quality of the obtained data, a cross section was created and is presented in Figure 10.

Figure 10. Cross section through the ship hull in meters.



3.3. Data Merging

Figure 11 compares the modelling techniques: the photogrammetry RGB point cloud (a), the pure laser scanning point cloud (b), and the two point clouds merged (c; white—photogrammetry, blue—laser). The TLS technique complements photogrammetry and vice versa. The parts of the hull not modelled by TLS (top of the hull) were covered by the photogrammetric point cloud. The bottom parts of the model, not covered by the photos, were modelled by the TLS. Moreover, some micro details are shown, e.g., welding lines, Kingston valves, the torpedo launcher muzzle door, and the water intake and outtake. Small, detailed elements are visible in the laser point cloud, and they may be visually investigated in the photogrammetry model. The orthophoto plane clearly shows the details.


Figure 11. (a) Photogrammetry RGB point cloud, (b) laser scanning point cloud, and (c) merged TLS (blue) and photogrammetry (white) point clouds.

The idea of data merging is to indicate the same tie points on the model by means of photogrammetric techniques and laser scanning. Figure 12 shows the representation of points visible in the point cloud by means of photos and by means of laser scanning. The aggregation standard deviation is about 1 cm. The coordinates of the points were read from the laser scanner and implemented into the model from the photographs.

Figure 12. The representation of control points (a) on a laser scanning view and (b) on images.

As a result of the data merging from terrestrial photogrammetry and laser scanning, the computed distances to individual points were presented by means of the corresponding module in the CloudCompare software. This requires a more extensive explanation. The comparison between the two point cloud models is shown in Figure 13 in the form of distances between points. The distances where the laser data were available (bottom part) range within a few centimeters only. The lack of TLS data results in a green to red color change, which exceeds the value of 1 meter. This clearly shows what can be expected from both technologies, especially within a highly reflective or absorbing area, which is also highly inclined to the laser beam. In order to assess the results of data aggregation, it is crucial to create a histogram in two groups: where the data overlapped and where there was no registration from one of the sensors. The histogram is presented in Figure 14.
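A rough numerical analogue of that distance computation is sketched below: for every photogrammetric point, the distance to its nearest TLS neighbour is taken, which is the simplest point-to-point cloud-to-cloud measure. The arrays are synthetic stand-ins for the exported clouds; the snippet only illustrates the quantity mapped in Figures 13 and 14.

```python
import numpy as np
from scipy.spatial import cKDTree

rng = np.random.default_rng(0)
# Synthetic stand-ins; in practice both arrays would be loaded from the
# exported point cloud files (e.g., *.las converted to plain XYZ).
tls = rng.random((100_000, 3)) * [60.0, 8.0, 7.0]
photo = tls[:50_000] + rng.normal(0.0, 0.01, (50_000, 3))

# Nearest TLS neighbour for each photogrammetric point.
dist, _ = cKDTree(tls).query(photo)
print(f"median: {np.median(dist) * 100:.1f} cm, "
      f"95th percentile: {np.percentile(dist, 95) * 100:.1f} cm")
```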


Figure 13. Distances between point clouds.

Figure 14. Histogram of distance values between point clouds.

It is clearly visible in Figures 13 and 14 that distances from data aggregation reach 10 cm. In order to assess the error of data aggregation, we separate the values from 0 to 10 cm and fit a Gaussian model to estimate the standard deviation corresponding to the Gaussian-type variable. This action indicates the accuracy of data aggregation, i.e., how well the clouds fit each other, forming the basis for hull modelling. The results of such an operation are shown in Figure 15 as a fit of a Gauss model. The standard deviation of the common set of points is about 0.03 m but, in terms of modelling accuracy, there is a need to reflect the distance distribution in the point cloud model. The results are shown in Figure 16. The values reaching 3 cm represent the parts of the hull modelled from the scanner with similar precision. The excess values can be observed especially on the boat's edges and at its bottom.
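The Gaussian-fit step can be expressed in a few lines: the distances are clipped to the 0–10 cm overlap group and a normal model is fitted by maximum likelihood, its standard deviation serving as the aggregation accuracy estimate. The input array below is synthetic and merely mimics the spread reported above.

```python
import numpy as np
from scipy.stats import norm

# 'dist' would be the cloud-to-cloud distances from the previous sketch;
# here a synthetic sample with a ~3 cm spread stands in for it.
dist = np.abs(np.random.default_rng(1).normal(0.0, 0.03, 200_000))
overlap = dist[dist <= 0.10]            # keep only the 0-10 cm overlap group
mu, sigma = norm.fit(overlap)           # maximum-likelihood Gaussian fit
print(f"mean = {mu * 100:.1f} cm, std = {sigma * 100:.1f} cm")
```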


Figure 15. Gaussian pdf (probability density function) presented on the set of common points between those obtained from photogrammetry and those based on laser scanning.

Figure 16. Points distribution presented in three dimensions, which is based on the values from Figure 13.

3.4. As-Built Modelling

The results of the 3D CAD modelling (Autodesk Inventor) are presented in Figure 17. This part of the modelling process is based on the merged point clouds from TLS and photogrammetry. The modelling process is not possible without the data merging, as not all the object parts had a point representation in each single technique. The hull was divided into equal sections (Figure 17a), and their cross sections were created. Each cross section was a basic element to draw a CAD line on the basis of the section shape. A number of such shapes served for the modelling process in the CAD software. The hull, as a 3D CAD object, is a perfect tool for documentation preparation, reverse engineering, and general engineering work with the modelled object, without any numerical point representation.
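The sectioning step can be illustrated with a short sketch: the merged cloud is cut into thin slabs at equally spaced stations along the hull axis, and the points of each slab are ordered by polar angle so that the profile can be traced as a single contour and redrawn as a CAD line. The station spacing and slab tolerance below are hypothetical choices, not the values used in the study.

```python
import numpy as np

def hull_sections(points, spacing=0.5, tol=0.01):
    """Slice a merged (N, 3) point cloud into cross sections along the x
    (hull) axis; spacing and tol are in meters."""
    stations = np.arange(points[:, 0].min(), points[:, 0].max(), spacing)
    sections = []
    for x0 in stations:
        band = points[np.abs(points[:, 0] - x0) < tol]   # thin slab of points
        if len(band) > 10:                               # skip sparse stations
            # Order the slab by polar angle around its centroid so the
            # profile can be traced as one closed contour.
            yz = band[:, 1:] - band[:, 1:].mean(axis=0)
            order = np.argsort(np.arctan2(yz[:, 1], yz[:, 0]))
            sections.append((x0, band[order]))
    return sections
```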

Figure 17. The hull cross sections (CAD line) (a), merged point cloud and 3D CAD modelled hull (b), and 3D modelled hull (c).

At3D themodelled very beginninghull (c). of the discussion, there is a need to focus on the measurement conditions.

They were marked by high insolation and rain, to possibly prevent the inventory from being carried

Geosciences 2019, 9, x FOR PEER REVIEW 13 of 16

4. Discussion Geosciences 2019, 9, 242 13 of 16 At the very beginning of the discussion, there is a need to focus on the measurement conditions. They were marked by high insolation and rain, to possibly prevent the inventory from being carried out. The main purpose of choosing such didifficultfficult weather conditions was to assess the limitationslimitations ofof combiningcombining methods such as laser scanning and phot photogrammetry.ogrammetry. It It is is shown shown in in Figures Figures 66,,7 7, and and9 9 thatthat thethe cloudcloud ofof pointspoints waswas obtainedobtained withwith aa precisionprecision ofof 55 mm,mm, losinglosing somesome informationinformation aboutabout thethe upper partpart ofof thethe ship’sship's hull, but, considering the edges, necessary to register in order to properlyproperly createcreate thethe model.model. In the case of a cloud of pointspoints obtainedobtained fromfrom photographs,photographs, the edges were notnot representedrepresented well.well. Therefore, the standard deviation of connections between particular measurements has increased. It should also be noted that the distances in Figure 13 result from variations in density and completenesscompleteness ofof informationinformation betweenbetween clouds.clouds. Based on these results, the estimatedestimated standardstandard deviation atat thethe levellevel ofof 33 cmcm isis high.high. However, limitinglimiting aa singlesingle method,method, otherother authorsauthors maymay attemptattempt toto modelmodel thethe hullhull withwith muchmuch greatergreater accuracy.accuracy. TheThe results,results, presentedpresented in FigureFigure 1717,, areare extendedextended inin Figure 1818,, ofof thethe followingfollowing features:features: a) registrationregistration didifferencesfferences on thethe edgeedge betweenbetween thethe pointpoint cloudcloud and thethe cloudcloud obtainedobtained from thethe picturespictures areare shown,shown, b)b) thethe averageaverage deviationdeviation of thethe sectionsection of thethe createdcreated modelmodel toto thethe resultingresulting pointpoint cloudcloud isis presented,presented, and,and, at thethe samesame time,time, attemptingattempting to applyapply thethe modelmodel withwith anan accuracyaccuracy greatergreater thanthan 33 cm.cm. AA blackblack colorcolor inin FigureFigure 1818aa presentspresents thethe pointpoint cloudcloud obtained from photographsphotographs purple from the laserlaser scannerscanner and, in FigureFigure 1818b,b, presentspresents thethe pointspoints applied forfor modelling.modelling.

Figure 18. The differences between models: (a) differences between the laser scanner and photogrammetry on the edges; (b) deviation between the contour model and the point cloud used for modelling.

Overall, the review of point cloud registration methods provides a wide range of opportunities to apply them during data processing and to achieve a greater accuracy of data combining. Currently, hundreds of publications are related to this topic. One of the simplest and most widely applied algorithms is the ICP algorithm. The algorithm itself is based on the overlap between two data sets; however, by applying filtration and optimization methods, it is possible to increase their reliability as a merged data set. Such examples are shown in Reference [31], which reviews the use of the ICP algorithm in robotics without any general comparison of this method across fields of science. This is reasonable, given the variety of applications, sensor characteristics, and environment characteristics that make it difficult to choose an appropriate algorithm, and it explains the proliferation of variants of this simple solution. It is worth mentioning that another common algorithm used to connect data sets is the RANSAC algorithm, which can be described as a method of dataset merging that is robust to a large number of outliers and noise. In our future research, we will focus on the automation of data aggregation in the case of ship hull inventories, analyzing the possibilities of the methods published in the literature. The featured publication [32] shows the use of the 4PCS algorithm, which makes it possible to register raw point cloud data characterized by a high level of noise and outliers. In addition, this method reduces the number of attempts required to obtain a reliable accuracy while making registrations. The publications [33,34] address novel methods of combining subsets of data; the authors showed that their method merges multimodal models automatically, regardless of differences in noise, detail, scale, and unknown relative coverage. In order to increase the accuracy of merging, adequate filtration has to be performed in the course of data preparation for further processing.
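To make the ICP principle referenced above tangible, the following is a compact, from-scratch point-to-point ICP sketch: nearest-neighbour correspondences followed by a closed-form rigid fit (Kabsch/SVD), iterated until the mean residual stops improving. It is a didactic illustration under idealized assumptions (good initial alignment, substantial overlap, no outlier rejection), not the registration pipeline used in this study.

```python
# Sketch: point-to-point ICP with a Kabsch/SVD rigid fit per iteration.
import numpy as np
from scipy.spatial import cKDTree

def icp(source: np.ndarray, target: np.ndarray, iters: int = 50,
        tol: float = 1e-8) -> tuple[np.ndarray, np.ndarray]:
    """Rigidly align `source` (N x 3) to `target` (M x 3).

    Returns rotation R and translation t such that target ~ R @ source + t.
    """
    tree = cKDTree(target)
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    prev_err = np.inf
    for _ in range(iters):
        # 1. Correspondences: nearest target point for every source point.
        d, idx = tree.query(src, k=1)
        matched = target[idx]
        # 2. Best rigid transform for these pairs (Kabsch algorithm).
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
        R = Vt.T @ D @ U.T          # D guards against a reflection
        t = mu_t - R @ mu_s
        # 3. Apply the increment and accumulate the total transform.
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
        err = d.mean()
        if abs(prev_err - err) < tol:
            break
        prev_err = err
    return R_total, t_total
```

Robust variants such as those surveyed in Reference [31] add correspondence filtering and outlier-aware error metrics on top of this same basic loop.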

5. Conclusions

In this article, we focused on the proposed method of combining two measurement techniques, photogrammetry and laser scanning, suitable for modelling ship hulls in unfavorable conditions. Based on interviews with shipbuilding companies, the maximum deviation of the analyzed model should be about 1 cm. Statistical analyses showed that this value is difficult to obtain given the accuracy of model generation and the subsequent merging of the models. In the case of hull modelling, it is possible to achieve an accuracy of 1 cm when laser scanning provides the edge information, while photogrammetry provides the spatial data between these edges. The idea for this task was to take pictures with a camera, connect them, generate a point cloud, apply laser scanning technology, and then scan around the hull of the ship. The limitations noted by the authors during the development of the results are significant on the generated edges. The problem is shown in Figure 18a, where the edge registered by the laser scanner is not represented in the cloud obtained from the photos. Noting this dependence, and knowing that the error of edge registration by the laser scanner does not exceed 1 cm, the edges recorded by the scanner and their infill registered by the camera were used to fit appropriate cross sections of the generated 3D model. Such a fitted section is located within a 1-cm accuracy, which was the main purpose of the article. In conclusion, we suggest that the use of combined data from laser scanning and photogrammetry makes it reasonable to provide engineering measurements requiring high accuracy (less than 1 cm), provided that one is aware of the advantages and disadvantages of each method.
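To illustrate how the edge-plus-infill rule formulated above could be prototyped, the sketch below keeps the TLS points with a high local surface-variation score (a common edge indicator based on the covariance eigenvalues of each point's neighbourhood) and fills the smooth regions with photogrammetric points. The neighbourhood size, edge threshold, and gap distance are illustrative assumptions; this is not the authors' implementation.

```python
# Sketch: fuse TLS edges with photogrammetric infill.
import numpy as np
from scipy.spatial import cKDTree

def edge_scores(points: np.ndarray, k: int = 20) -> np.ndarray:
    """Surface-variation score per point: lambda_min / sum(lambda) of the
    local covariance; near 0 on smooth surfaces, larger on creases/edges."""
    tree = cKDTree(points)
    _, idx = tree.query(points, k=k)
    scores = np.empty(len(points))
    for i, nb in enumerate(idx):
        local = points[nb] - points[nb].mean(axis=0)
        lam = np.linalg.eigvalsh(local.T @ local / k)  # ascending order
        scores[i] = lam[0] / lam.sum()
    return scores

def fuse(tls: np.ndarray, photo: np.ndarray,
         edge_thr: float = 0.05, gap: float = 0.02) -> np.ndarray:
    """TLS edge points, plus photogrammetric points at least `gap` metres
    away from any TLS edge point, merged into one cloud (all N x 3)."""
    edges = tls[edge_scores(tls) > edge_thr]
    keep = cKDTree(edges).query(photo, k=1)[0] > gap
    return np.vstack([edges, photo[keep]])
```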

Author Contributions: Conceptualization, P.B. and P.T.; Methodology, P.B. and P.T.; Software, P.B. and P.T.; Validation, P.B. and P.T.; Formal Analysis, P.T.; Investigation, P.B. and P.T.; Resources, P.B. and P.T.; Data Curation, P.B.; Writing—Original Draft Preparation, P.B. and P.T.; Writing—Review and Editing, P.B. and P.T.; Visualization, P.B. and P.T.; Supervision, P.T.; Project Administration, P.B.; Funding Acquisition, P.B. and P.T.

Funding: This research received no external funding.

Acknowledgments: The authors express their gratitude to the Apeks Company for the equipment and software applied in data post-processing. We are also grateful to the Polish Naval Academy in Gdynia for making the research possible.

Conflicts of Interest: The authors declare no conflict of interest.

References

1. Rey-Barroso, L.; Burgos-Fernández, F.J.; Delpueyo, X.; Ares, M.; Royo, S.; Malvehy, J.; Puig, S.; Vilaseca, M. Visible and Extended Near-Infrared Multispectral Imaging for Skin Cancer Diagnosis. Sensors 2018, 18, 1441. [CrossRef]
2. Narayanamurthy, V.; Padmapriya, P.; Noorasafrin, A.; Pooja, B.; Hema, K.; Yuhainis Firus Khan, A.; Nithyakalyani, K.; Samsuri, F. Skin cancer detection using non-invasive techniques. RSC Adv. 2018, 8. [CrossRef]
3. Bitenc, M.; Lindenbergh, R.C.; Khoshelham, K.; Van Waarden, A.P. Evaluation of a LIDAR Land-Based Mobile Mapping System for Monitoring Sandy Coasts. Remote Sens. 2011, 3. [CrossRef]
4. Ercoli, L.; Zimbardo, M.; Nocilla, N.; Nocilla, A.; Ponzoni, E. Evaluation of cliff recession in the Valle dei Templi in Agrigento (Sicily). Eng. Geol. 2015, 192, 129–138.
5. Kuhn, D.; Prufer, S. Coastal cliff monitoring and analysis of mass wasting processes with the application of terrestrial laser scanning: A case study of Rugen, Germany. Geomorphology 2014, 213, 153–165. [CrossRef]

6. Baqersad, J.; Poozesh, P.; Niezrecki, C.; Avitabile, P. Photogrammetry and optical methods in structural dynamics–A review. Mech. Syst. Signal Process. 2017, 86, 17–34. [CrossRef]
7. Lahamy, H.; Lichti, D.; Steward, J.; El-Badry, M.; Moravvej, M. Measurement of Deflection in Concrete Beams During Fatigue Loading Test Using the Microsoft Kinect 2.0. J. Appl. Geod. 2016, 10, 71–77. [CrossRef]
8. Ziolkowski, P. Processing of Point Cloud Data Retrieved from Terrestrial Laser Scanning for Structural Modeling by Finite Element Method. In Proceedings of the 17th International Multidisciplinary Scientific GeoConference SGEM2017, Bulgaria, 27 June–6 July 2017; pp. 211–218.
9. Przyborski, M.; Tysiac, P. As-built inventory of the office building with the use of terrestrial laser scanning. E3S Web Conf. (Seminar on Geomatics, Civil and Environmental Engineering, 2017 BGC) 2018, 26. [CrossRef]
10. Holopainen, M.; Vastaranta, M.; Kankare, V.; Vaaja, M.; Hyyppä, H.; Liang, X.L.; Litkey, P.; Yu, X.W.; Kaartinen, H.; Jaakkola, A.; et al. The use of ALS, TLS and VLS measurements in mapping and monitoring urban trees. In Proceedings of the 2011 Joint Urban Remote Sensing Event, Munich, Germany, 11–13 April 2011. [CrossRef]
11. Mikrut, S.; Moskal, A.; Marmol, U. Integration of Image and Laser Scanning Data Based on Selected Example. Image Process. Commun. 2014, 19, 37–44. [CrossRef]
12. Remondino, F.; El-Hakim, S. Image-based 3D modelling: A review. Photogramm. Rec. 2006, 21, 269–291. [CrossRef]
13. Chen, Y.; Medioni, G. Object Modeling by Registration of Multiple Range Images. In Proceedings of the IEEE Conference on Robotics and Automation, Sacramento, CA, USA, 9–11 April 1991. [CrossRef]
14. Rusinkiewicz, S.; Levoy, M. Efficient variants of the ICP algorithm. In Proceedings of the Third International Conference on 3-D Digital Imaging and Modeling, Quebec City, QC, Canada, 28 May–1 June 2001; pp. 145–152.
15. He, Y.; Liang, B.; Yang, J.; Li, S.Z.; He, J. An Iterative Closest Points Algorithm for Registration of 3D Laser Scanner Point Clouds with Geometric Features. Sensors 2017, 17, 1862. [CrossRef]
16. Du, S.Y.; Xu, Y.T.; Wan, T.; Hu, H.Z.; Zhang, S.R.; Xu, G.L.; Zhang, X.T. Robust iterative closest point algorithm based on the global reference point for rotation invariant registration. PLoS ONE 2017, 12, e0188039. [CrossRef]
17. Abbas, M.A.; Lichti, D.D.; Chong, A.K.; Setan, H.; Majid, Z.; Lau, C.L.; Ariff, M.F.M. Improvements to the accuracy of prototype ship models measurement method using terrestrial laser scanner. Measurement 2017, 100, 301–310. [CrossRef]
18. Ahmed, Y.M.; Jamail, A.B.; Yaakob, O.B. Boat survey using photogrammetry method. Int. Rev. Mech. Eng. 2012, 6, 1643–1647.
19. Menna, F.; Nocerino, E.; Troisi, S.; Remondino, F. Joint alignment of underwater and above-the-water photogrammetric 3D models by independent models adjustment. Int. Arch. Photogramm. Remote Sens. Spat. Inf. Sci. 2015, 40, 143–151. [CrossRef]
20. Menna, F.; Nocerino, E.; Troisi, S.; Remondino, F. A photogrammetric approach to survey floating and semi-submerged objects. In Proceedings of the Videometrics, Range Imaging and Applications XII, Munich, Germany, 14–16 May 2013; pp. 1–15.
21. Ordóñez, C.; Riveiro, B.; Arias, P.; Armesto, J. Application of close range photogrammetry to deck measurement in recreational ships. Sensors 2009, 9, 6991–7002. [CrossRef] [PubMed]
22. Koelman, H.J. Application of a photogrammetry-based system to measure and re-engineer ship hulls and ship parts: An industrial practices-based report. Comput. Aided Des. 2010, 42, 731–743. [CrossRef]
23. Luhmann, T.; Robson, S.; Kyle, S.; Boehm, J. Close-Range Photogrammetry and 3D Imaging; Walter de Gruyter: Berlin, Germany, 2014.
24. Tang, C.H.H.; Tang, H.E.; Tay, P.K.J. Low cost digital close range photogrammetric measurement of an as-built anchor handling tug hull. Ocean Eng. 2016, 119, 67–74. [CrossRef]
25. Zhao, Y.; Hu, Q.; Li, H.; Wang, S.; Ai, M. Evaluating Carbon Sequestration and PM2.5 Removal of Urban Street Trees Using Mobile Laser Scanning Data. Remote Sens. 2018, 10, 1759. [CrossRef]
26. Scaioni, M.; Marsella, M.; Crosetto, M.; Tornatore, V.; Wang, J. Geodetic and Remote-Sensing Sensors for Dam Deformation Monitoring. Sensors 2018, 18, 3682. [CrossRef]
27. Janowski, A.; Nagrodzka-Godycka, K.; Szulwic, J.; Ziolkowski, P. Remote sensing and photogrammetry techniques in diagnostics of concrete structures. Comput. Concr. 2016, 18, 405–420. [CrossRef]

28. Hong, S.; Park, I.; Lee, J.; Lim, K.; Choi, Y.; Sohn, H.-G. Utilization of a Terrestrial Laser Scanner for the Calibration of Mobile Mapping Systems. Sensors 2017, 17, 474. [CrossRef]
29. Zheng, Y.; Peter, M.; Zhong, R.; Oude Elberink, S.; Zhou, Q. Space Subdivision in Indoor Mobile Laser Scanning Point Clouds Based on Scanline Analysis. Sensors 2018, 18, 1838. [CrossRef] [PubMed]
30. Zhou, Y.; Wang, S.; Mei, X.; Yin, W.; Lin, C.; Hu, Q.; Mao, Q. Railway Tunnel Clearance Inspection Method Based on 3D Point Cloud from Mobile Laser Scanning. Sensors 2017, 17, 2055. [CrossRef] [PubMed]
31. Pomerleau, F.; Colas, F.; Siegwart, R. A Review of Point Cloud Registration Algorithms for Mobile Robotics. Found. Trends Robot. 2015, 4, 1–104. [CrossRef]
32. Aiger, D.; Mitra, N.J.; Cohen-Or, D. 4-points congruent sets for robust pairwise surface registration. ACM Trans. Graph. 2008, 27, 1–10. [CrossRef]
33. Mellado, N.; Dellepiane, M.; Scopigno, R. Relative Scale Estimation and 3D Registration of Multi-Modal Geometry Using Growing Least Squares. IEEE Trans. Vis. Comput. Graph. 2016, 22, 2160–2173. [CrossRef]
34. Corsini, M.; Dellepiane, M.; Ganovelli, F.; Gherardi, R.; Fusiello, A.; Scopigno, R. Fully Automatic Registration of Image Sets on Approximate Geometry. Int. J. Comput. Vis. 2013, 102, 91–111. [CrossRef]

© 2019 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).