Journal of Imaging

Article

A Cost-Effective System for Aerial 3D Thermography of Buildings

Claudia Daffara 1,2, Riccardo Muradore 1, Nicola Piccinelli 1, Nicola Gaburro 1, Tullio de Rubeis 3 and Dario Ambrosini 2,3,*

1 Dept. of Computer Science, University of Verona, str. Le Grazie 15, I-37134 Verona, Italy; claudia.daff[email protected] (C.D.); [email protected] (R.M.); [email protected] (N.P.); [email protected] (N.G.)
2 ISASI-CNR, Institute of Applied Science and Intelligent Systems, Via Campi Flegrei 34, I-80078 Pozzuoli (NA), Italy
3 Dept. of Industrial and Information Engineering and Economics, University of L'Aquila, P.le Pontieri 1, I-67100 L'Aquila, Italy; [email protected]
* Correspondence: [email protected]

 Received: 30 June 2020; Accepted: 30 July 2020; Published: 2 August 2020 

Abstract: Three-dimensional (3D) imaging and infrared (IR) thermography are powerful tools in many areas of engineering and science. Their joint use is of great interest in the buildings sector, allowing inspection and non-destructive testing of elements as well as evaluation of energy efficiency. When dealing with large and complex structures, as buildings (particularly historical ones) generally are, 3D thermography inspection is enhanced by Unmanned Aerial Vehicles (UAVs, also known as drones). The aim of this paper is to propose a simple and cost-effective system for aerial 3D thermography of buildings. Special attention is thus paid to the choice of instruments and reconstruction software. After a very brief introduction to IR thermography for buildings and 3D thermography, the system is described. Some experimental results are given to validate the proposal.

Keywords: infrared thermography; unmanned aerial vehicles; 3D modelling; energy efficiency; cultural heritage

1. Introduction

Three-dimensional (3D) imaging [1] is an important tool in many fields, ranging from industrial and architectural design to diagnostics of materials and artifacts, from medicine to entertainment (cinema, video games) and the fruition of historical and artistic heritage (augmented reality, virtual reconstruction). Infrared (IR) thermography (IRT) [2] is also a technique that has grown very rapidly in recent years and is now characterized by increasingly advanced applications. Therefore, the joint use of these two techniques is of great interest and potential and represents a very current research topic. 3D thermography can be very useful, for example, in structural diagnostics, energy efficiency assessment of buildings, and inspection and monitoring, and these evaluations may be enhanced by performing them from UAVs (Unmanned Aerial Vehicles, also known as drones) [3]. IRT is based on the fact that all bodies at a temperature above absolute zero emit radiation; from this radiation, it is possible to trace the temperature of the body. Therefore, thermography is a method capable of detecting the temperature of objects under investigation without contact. A thermographic camera is a calibrated device capable of measuring the radiation emitted by objects and calculating their temperature. The radiation measured by its sensor also depends on the properties of the investigated surface (emissivity) and the environment (radiation absorbed or emitted

J. Imaging 2020, 6, 76; doi:10.3390/jimaging6080076 www.mdpi.com/journal/jimaging

by the atmosphere between sensor and object, and the contribution of other objects in the environment). Basic principles are summarized in Figure 1.
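The measurement principle just outlined can be illustrated numerically. The following sketch is not from the paper: it assumes a simplified graybody model using total Stefan-Boltzmann radiation instead of the band-limited LWIR radiance a real camera measures, but it shows how a thermal camera inverts measured radiation (emitted plus reflected ambient) to recover the object temperature.

```python
# Simplified graybody model of what a thermal camera does (illustrative
# sketch, not the paper's algorithm). The sensor receives
#   W = eps * sigma * T_obj**4 + (1 - eps) * sigma * T_refl**4
# (surface emission plus reflected ambient radiation), and the camera
# inverts this relation for the object temperature T_obj.

SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def measured_radiation(t_obj_k, emissivity, t_refl_k):
    """Total radiation reaching the sensor from a graybody surface (W/m^2)."""
    return (emissivity * SIGMA * t_obj_k**4
            + (1.0 - emissivity) * SIGMA * t_refl_k**4)

def object_temperature(w_measured, emissivity, t_refl_k):
    """Invert the graybody model to recover the object temperature (K)."""
    w_emitted = w_measured - (1.0 - emissivity) * SIGMA * t_refl_k**4
    return (w_emitted / (emissivity * SIGMA)) ** 0.25

# A wall at 30 C (303.15 K) with emissivity 0.9, ambient at 20 C (293.15 K):
w = measured_radiation(303.15, 0.90, 293.15)
t = object_temperature(w, 0.90, 293.15)
print(round(t - 273.15, 2))  # recovers 30.0 C
```

This also makes explicit why the emissivity and the reflected ambient temperature must be known (or estimated) for the temperature reading to be quantitative.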

Figure 1. Basic principle of a thermal camera.

The result is an image of the object in which the color or gray levels correspond to the different temperatures on the object's surface. The measurement accuracy also depends on parameters such as ambient temperature, wind or solar radiation. Possible variations in temperature may be due to differences in the materials' surface finish (intrinsic or as a result of ageing/damage) or to subsurface defects [2].

Diagnostic capabilities can be enhanced by a quantitative analysis of the thermographic data. Here, we are particularly interested in IRT for buildings, a flourishing application. An interrogation of the SCOPUS database using the search term "thermography AND buildings" in "article title, abstract and keywords" returned about 1725 papers, while the same interrogation only in "article title" returned 178 papers (data accessed 4 May 2020). Table 1 reports some recent references, identifying review papers and the main topics discussed.

Table 1. A (not exhaustive) list of recent papers about infrared thermography (IRT) for buildings.

Authors                        Year  Main Topic             Notes
Garrido et al. [4]             2020  Post-processing        Review
Huang et al. [5]               2020  Facades diagnostics
Teni et al. [6]                2019  Thermal transmittance  Review
Bienvenido-Huertas et al. [7]  2019  Thermal transmittance  Review
Soares et al. [8]              2019  Thermal transmittance  Review
Glavaš et al. [9]              2019  Cultural heritage
Royuela-del-Val et al. [10]    2019  Air infiltration       Neural network
Nardi et al. [11]              2018  Heat losses            Review
Kirimtat et al. [12]           2018  Thermal performance    Review
Baldinelli et al. [13]         2018  Thermal bridges
Lucchi [14]                    2018  Energy audit           Review
Lerma et al. [15]              2018  Air infiltration
O'Grady et al. [16]            2017  Heat losses


Barreira et al. [17]           2017  Air leakage
Fox et al. [18]                2016  Diagnostics
Nardi et al. [19]              2016  Thermal transmittance
Djupkep Dizeu et al. [20]      2016  Indoor conditions
Barreira et al. [21]           2016  Moisture
Sfarra et al. [22]             2016  Cultural heritage      Solar heating
Fox et al. [23]                2015  Diagnostics
Albatici et al. [24]           2015  Thermal transmittance
Kylili et al. [25]             2014  Diagnostics            Review
Nardi et al. [26]              2014  Thermal transmittance
Krankenhagen et al. [27]       2014  Cultural heritage      Solar heating
Paoletti et al. [28]           2013  Cultural heritage
Dall'O' et al. [29]            2013  Energy audit

The breadth of applications gives a fair idea of the current importance of IRT in buildings diagnostics and evaluations. A further step ahead relates to 3D imaging. 3D imaging and displays are becoming more and more important. For a general review of the topic, the interested reader is referred to References [1,30]: a comprehensive handbook and a recent extensive tutorial. The great interest in 3D technologies also flourished in cultural heritage and buildings studies [31–34]. All this naturally leads to 3D thermography, which is usually realized by combining 3D geometric data and two-dimensional (2D) thermographic data [35]; different setups are available according to the different 3D geometric acquisition systems and data fusion approaches. Thus, for example, 3D geometry and 2D thermal images can be simply compared [36], infrared images can be mapped to 3D point clouds [37,38], integrated at different times in a Building Information Model (BIM) [39] or associated with a high-quality color laser scanner for cultural heritage monitoring and documentation [40]. The idea for the present work stems from the observation that several economical yet reasonably well-performing thermal cameras are on the market. They are integrated in smartphones (e.g., CAT S60 and CAT S61), can be added to them as external modules (e.g., FLIR One and Seek Thermal), or are available as stand-alone devices (e.g., FLIR C2). As their prices are typically below $1000, they help to increase the spread of IRT applications in many fields, such as, for example, biomedicine [41,42], agriculture [43], buildings inspection [44], cultural heritage diagnostics [45] and mass human temperature screening [46]. In this work, we propose a simple and cost-effective system to perform 3D aerial thermography of buildings. Particular attention is devoted to the choice of instruments and software for reconstruction.
The article is structured in the following way: we describe the proposed system, with details on the choice of instruments, calibration and reconstruction software. The system was validated in a virtual environment. Finally, we show some experimental results.

2. Materials and Methods

2.1. A Cost-Effective System

This paper is devoted to proposing a simple and cost-effective system for aerial 3D thermography of buildings. To this aim, some key features of the system can be defined:

1. Thermal and geometric data should be recorded by the same device and in a single measurement process.
2. This device should be commercially available and cost-effective.

3. The reconstruction software should not require images taken by more than one recording device.
4. The reconstruction software should be as simple as possible, without many parameters to tune.

2.2. Choice of Recording Device

Recent developments in sensor technology led to the production of miniaturized bolometers with LWIR (Long Wavelength Infrared band: 8–14 µm) sensitivity that are commercially available as camera cores or mounted in compact thermal cameras. Two thermal cameras were chosen from the FLIR family of compact ones with visible light imaging: FLIR C2 and FLIR Duo R; their respective features are summarized in Table 2. FLIR C2 was presented as the "world's first full-featured, pocket-sized thermal camera designed for building industry experts and contractors" [47], while FLIR Duo R was presented as "the world's first compact, lightweight, radiometric thermal and visible light imager designed for drone applications" [48]. Currently, the FLIR Duo R camera is no longer on the market and a new version is available (FLIR Duo Pro R).
Figure 2 shows the two recording devices. The important feature is the simultaneous acquisition of a dual visible-thermal image dataset. The larger field of view and sensor size of the Duo R make it more suitable for thermal acquisition of large-scale objects, and the higher resolution of its visible sensor allows for capturing a reliable dataset for 3D reconstruction.

Table 2. Comparison of selected recording devices.

Imaging Specifications   FLIR C2              FLIR Duo R
IR sensor                80 × 60 pixels       160 × 120 pixels
Thermal sensitivity      <0.10 °C             <0.050 °C (*)
Field of view            41° × 31°            57° × 44°
Spectral range           7.5–14 µm            7.5–13.5 µm
Accuracy                 ±2 °C                ±5 °C
Digital camera           640 × 480 pixels     1920 × 1080 pixels
Operating temp. range    −10 to +50 °C        0 to +50 °C
Weight (incl. battery)   0.13 kg              0.084 kg
Size                     125 × 80 × 24 mm³    59 × 41 × 29.6 mm³

(*) nominal sensitivity of the Lepton core sensor https://www.flir.com/products/lepton/.
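To make the field-of-view figures in Table 2 concrete, one can estimate the footprint on the facade of a single thermal pixel at a given flight distance. This is an illustrative calculation, not from the paper, assuming a simple pinhole model and a frontal, flat target:

```python
import math

def pixel_footprint_m(fov_deg, n_pixels, distance_m):
    """Approximate size on the target covered by one pixel, for a sensor
    with n_pixels across a field of view of fov_deg, viewed frontally
    from distance_m (pinhole model, flat target)."""
    extent = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return extent / n_pixels

# FLIR Duo R thermal sensor (57 deg across 160 pixels) at 10 m distance:
duo_r = pixel_footprint_m(57.0, 160, 10.0)
# FLIR C2 thermal sensor (41 deg across 80 pixels) at the same distance:
c2 = pixel_footprint_m(41.0, 80, 10.0)
print(f"Duo R: {duo_r:.3f} m/pixel, C2: {c2:.3f} m/pixel")
```

At 10 m the Duo R resolves roughly 7 cm per thermal pixel against roughly 9 cm for the C2, which supports the choice of the Duo R for large-scale objects.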

Figure 2. FLIR C2 (left) and FLIR Duo R (right) [47,48]. Pictures are not to scale.

2.3. Calibration Procedure of Visible-Thermal Sensors

The calibration boils down to the solution of a geometric problem: the estimation of the relative spatial position of the imaging sensors (visible and thermal). The key aspect here is that visible and thermal images are acquired in a single measurement process, therefore the calibration target must be suitable for the two imaging modalities, namely the reflective- and the emissive-based. A multi-step procedure is adopted only in the processing phase.


2.3.1. The Calibration Passive Target

Calibration targets can be of different types depending on (i) the markers used (corners, circles, etc.), (ii) their arrangement (structured or unstructured) and (iii) the working principle (active, passive) [49]. In this paper, a simple target (shown in Figure 3) was chosen to be cost-effective and easy to deploy. According to the previous taxonomy, it is based on squared features (grid pattern), and it is structured and passive (it does not require external energy sources, e.g., lightbulbs).

Figure 3. The calibration target.

The target has been properly designed for the calibration of both the visible and the LWIR thermal sensor. A layer of material with a known emissivity value is covered by a second material, with cropped squares, having very different emissivity, giving rise to a geometric pattern, like a chessboard. The difference in emissivity between the two zones results in two different emitted radiation values, creating distinguishable zones in the thermal images. In particular, the masked layer is made of aluminum paper well laid and painted black (emissivity around 0.3), and the cover is white cardboard (emissivity around 0.9). The use of two separate physical layers, i.e., not of a painted pattern on a single support, allowed limiting the blurring due to thermal diffusion at the interfaces, thus producing a sharper pattern. The problem of specular reflection in the thermal range was addressed by applying a finishing (micro-roughness) to the aluminum to obtain a diffusive surface at LWIR wavelengths. This was simply done by pressing the foil on sandpaper with a coarse grit size.

2.3.2. The Calibration Algorithm

The extraction of salient features from the visible images is straightforward, thanks to the high contrast and sharpness of the pattern and the imaging performance of CCD (Charge-Coupled Device) cameras. The identification of the internal regions was carried out by applying an intensity-based blob detection algorithm [50] to find the square centroid of each grid element. Clearly, besides the limited size of the bolometric sensor, the thermal images are less sharp than the visible ones due to the nature of LWIR imaging itself. Thermal contrast is affected by the contribution of the environment, which causes spurious reflections and makes feature extraction more difficult to achieve. Figure 4 shows an example of the target recorded at the same time in the two different bands.
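The emissivity contrast that makes the chessboard pattern visible in the thermal band can be quantified with a simple graybody estimate. This sketch is illustrative and not part of the paper's processing; the 0.3 and 0.9 values are the emissivities quoted above for the blackened aluminum and the white cardboard:

```python
SIGMA = 5.670374419e-8  # Stefan-Boltzmann constant, W m^-2 K^-4

def emitted_radiation(emissivity, temp_k):
    """Total radiation emitted by a graybody surface (W/m^2)."""
    return emissivity * SIGMA * temp_k**4

# Both layers of the target sit at the same temperature, say 20 C
# (293.15 K), yet the blackened aluminum (eps ~ 0.3) and the white
# cardboard (eps ~ 0.9) emit very different amounts of radiation:
t = 293.15
w_aluminum = emitted_radiation(0.3, t)
w_cardboard = emitted_radiation(0.9, t)
print(f"aluminum: {w_aluminum:.0f} W/m^2, cardboard: {w_cardboard:.0f} W/m^2")
# The ~3x radiometric contrast is what renders the grid pattern in the
# thermal image even though the target is thermally uniform.
```

This is why no heating source is needed: the pattern is radiometric, not thermal.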



Figure 4. The target recorded in the visible (left) and in the thermal bands (right) in a single measurement.

Some specular contributions on the aluminum squares still occur in both the thermal and the visible range, and the problem was treated with a dedicated pipeline in the calibration phase, as detailed in the following.

First, the images are segmented using the Maximally Stable Extremal Regions blob detector (MSER) [50]. For each image, the resulting blob set, which may contain outliers, is refined by imposing a set of shape constraints on each element. The resulting set is then dilated in order to isolate the single connected component containing the calibration pattern. Using the Hough transform, a set of lines is fitted along the edges of the connected component and the homography matrix is estimated. The image perspective is corrected by applying the homography transformation and, if the squares are not fully detected in the initial phase, an auxiliary search routine is applied around the first neighbors of each detected square. This routine applies a local clustering using an adaptive k-means and searches for corrupted squares. Assuming a good estimation of the square size is available, this auxiliary procedure clusters the grey levels around each neighbor into higher and lower intensities with respect to the calibration mask grey level. These clusters are then merged, and the resulting region is checked against a set of shape constraints. If this new region satisfies the constraints, it is assumed to be a new square and added to the detected set. The procedure is iteratively applied until the detected set is full or the maximum number of iterations is reached. If the image is not rejected, a set of horizontal and vertical lines is fitted through the points, and the set of square centroids is substituted with the corresponding line intersections. The last step is to apply the stereo intrinsic and extrinsic estimation proposed in Reference [51]. The proposed calibration procedure allowed mitigating the possible reflections on the calibration target in the visible and thermal bands, which have their own characteristics (see Figure 5) and degrade the effectiveness of a classical geometric calibration to different degrees. Combining state-of-the-art computer vision algorithms, it was possible to recover the corrupted square regions, reducing the number of undetectable images. This made it possible to reduce the calibration time and to perform the calibration in an uncontrolled environment with a passive target, without the need for heating sources.

Figure 5. The reflections affecting the target, in the visible (left) and in the thermal bands (right).



The calibration method has been successfully applied both indoors and outdoors, obtaining comparable estimation errors and a relative rotation between the sensors of zero, as expected. The very small calibration error allows the 3D reconstruction to be effectively performed, as shown later.

2.4. Validation in a Virtual Environment

To validate the reconstruction algorithm, a virtual environment was initially used. The simulator used during the testing for the aerial 3D reconstruction was CoppeliaSim, an open-source robotics simulator with interfaces to multiple programming languages.
The simulation environment (shown in Figure 6) was composed of a teleoperated quadrotor, a multi-texturized model of a building, a model of a stereo vision system and a virtualized GPS.

Figure 6. The virtual environment with the texture for the visible (left) and for the thermal band (right).

The vision system has been configured to mimic the behavior of a realistic thermographic camera setup, specifically by setting the nominal field of view, sensor size and frame rate of the FLIR Duo, together with the results from the geometric calibration for the relative positioning of the visible and thermal sensors. Finally, to simulate the appearance of the building in the reflective and emissive imaging modes, we applied two specific textures which are then "seen" either by the visible or by the thermal simulated sensor, respectively. The simulator has been interfaced to MATLAB, from where we computed and controlled the drone trajectory and the image acquisition. Figure 7 shows the 3D reconstruction using the images acquired by the simulated drone.
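The trajectory streamed to the simulated drone can be as simple as an orbit around the building with the camera yawed toward its center. The paper drives this from MATLAB; purely as an illustrative sketch (waypoint generation only, no simulator calls, with made-up geometry values), an equivalent computation is:

```python
import math

def orbit_waypoints(center, radius, altitude, n_points):
    """Waypoints on a circle around (cx, cy), each paired with the yaw
    angle (radians) that keeps the camera facing the building center."""
    cx, cy = center
    waypoints = []
    for i in range(n_points):
        theta = 2.0 * math.pi * i / n_points
        x = cx + radius * math.cos(theta)
        y = cy + radius * math.sin(theta)
        yaw = math.atan2(cy - y, cx - x)  # look toward the center
        waypoints.append((x, y, altitude, yaw))
    return waypoints

# 36 viewpoints on a 15 m orbit, 10 m altitude, building centered at (0, 0):
path = orbit_waypoints((0.0, 0.0), 15.0, 10.0, 36)
print(len(path), path[0])
```

Evenly spaced viewpoints with large overlap between consecutive images are what the Structure from Motion reconstruction described next relies on.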

Figure 7. Superposition of the dual visible-thermal three-dimensional (3D) reconstruction of the virtual environment.

J. Imaging 2020, 6, 76 8 of 13

2.5. 3D Reconstruction Pipeline

The Structure from Motion (SFM) technique was chosen as the reconstruction algorithm. SFM [52–59] is a photogrammetric technique able to reconstruct a sparse 3D model of a static target from several 2D images of the same object taken from different points of view. To get a dense 3D reconstruction, the resulting sparse model must be further elaborated using a multi-view stereo (MVS) [59] algorithm. The SFM and MVS methods used for the reconstruction were provided by the open-source projects OpenMVG (Open Multiple View Geometry) [60] and CMVS (Clustering Views for Multi-View Stereo), respectively. The proposed reconstruction methodology uses the visible images to build the full 3D object reconstruction by SFM and MVS. Then, thanks to the geometric calibration of the dual visible-thermal sensor, for each 3D point of the reconstructed model, the corresponding radiometric thermal value is computed. This is accomplished by keeping track, for each 3D point, of the pair of images from which it has been triangulated and then projecting it back into the image plane of one of them. Once in the image plane, its coordinates (in pixels) are transformed from the visible image frame into the related thermal frame (through the homogeneous transformation estimated with the geometric calibration presented in Section 2.3), where the radiometric value can be obtained. By performing this reprojection for each 3D point, it is possible to derive the final radiometric 3D thermal model. Clearly, due to the different sensor resolutions, the thermal mapping is no longer bijective. In our approach, the thermal images are not subjected to any supersampling or other interpolation technique, and multiple pixels of the visible images are simply mapped into a single thermal pixel and, consequently, to the same temperature. The advantage of this approach is that the temperatures mapped in the point cloud are the real values recorded by the thermal sensor.
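The per-point lookup described above can be sketched as follows: back-project a triangulated 3D point into the visible image with that camera's projection matrix, transfer the pixel to the thermal frame through the calibrated homogeneous transformation, and read the nearest thermal pixel without interpolation. The matrices below are illustrative placeholders; in the actual pipeline they come from SFM and the Section 2.3 calibration.

```python
def matvec(M, v):
    """Multiply a matrix (list of rows) by a vector."""
    return [sum(M[i][j] * v[j] for j in range(len(v))) for i in range(len(M))]

def project(P, X):
    """Project a homogeneous 3D point X = (x, y, z, 1) with a 3x4 matrix P;
    return the pixel (u, v) after perspective division."""
    x = matvec(P, X)
    return x[0] / x[2], x[1] / x[2]

def visible_to_thermal(uv, H):
    """Map a visible-image pixel into the thermal frame with a 3x3
    homogeneous transformation H (here a placeholder for the calibration)."""
    u, v, w = matvec(H, [uv[0], uv[1], 1.0])
    return u / w, v / w

def thermal_value(point3d, P_vis, H, thermal_img):
    """Radiometric value for one triangulated 3D point: back-project into the
    visible image, transfer to the thermal frame, read the nearest pixel.
    No interpolation: several visible pixels may land on the same thermal pixel."""
    u, v = project(P_vis, [point3d[0], point3d[1], point3d[2], 1.0])
    ut, vt = visible_to_thermal((u, v), H)
    return thermal_img[int(round(vt))][int(round(ut))]
```

Because the thermal sensor has fewer pixels, distinct visible pixels collapse onto the same thermal cell, which is exactly the non-bijective mapping discussed above.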
In order to build the visible 3D model, the first step is to extract the conjugated features for each image. The discriminative capabilities of these features heavily affect the performance and the quality of the overall Structure from Motion. In our method, we adopted A-KAZE (Accelerated KAZE), a fast multi-scale feature detection and description method based on nonlinear scale spaces [61].
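A-KAZE produces binary descriptors that are compared with the Hamming distance. A minimal brute-force matcher with a ratio test (a generic illustration of binary-descriptor matching, not the exact matching strategy of our pipeline) looks like:

```python
def hamming(d1: bytes, d2: bytes) -> int:
    """Hamming distance between two binary descriptors (e.g., A-KAZE MLDB)."""
    return sum(bin(a ^ b).count("1") for a, b in zip(d1, d2))

def match_descriptors(query, train, ratio=0.8):
    """Brute-force matching with a ratio test: keep a match only when the best
    distance is clearly smaller than the second best, which discards ambiguous
    correspondences before the SFM geometry estimation."""
    matches = []
    for qi, qd in enumerate(query):
        dists = sorted((hamming(qd, td), ti) for ti, td in enumerate(train))
        if len(dists) >= 2 and dists[0][0] < ratio * dists[1][0]:
            matches.append((qi, dists[0][1], dists[0][0]))
    return matches
```

The discriminative power of the descriptors shows up here directly: the more distinctive they are, the more matches survive the ratio test.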

2.6. Mission Planning and Drone Control

The trajectory for the image acquisition is planned before the mission using Pyfplanner, an open-source software tool developed by the authors in Python and available online at https://gitlab.com/npiccinelli/pyfplanner. The aim of the software is to provide a sequence of commands to later be sent to the Unmanned Aerial Vehicle (UAV) through the open-source ground station software Mission Planner (https://ardupilot.org/planner/index.html). Table 3 lists the available commands.

Table 3. Mission planning and Unmanned Aerial Vehicle (UAV) control command list.

UAV Command   Parameters                        Description
HOME          (latitude, longitude, altitude)   Define the home position for the UAV controller
WAYPOINT      (latitude, longitude, altitude)   Define a new waypoint in the current flight plan
CAMROT        (yaw, pitch, roll)                Set a camera rotation relative to the forward direction
SHOT          none                              Trigger the shot command to the FLIR Duo R
CYAW          (yaw, pitch, roll)                Define the forward direction of the camera with respect to the initial rotation
LOITER        (latitude, longitude, altitude)   Keep the UAV in the commanded position
LAND          none                              Start the landing procedure
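Using the Table 3 vocabulary, a façade sweep can be assembled as a plain command sequence. The helper below is a hypothetical sketch of such a generator, not the actual Pyfplanner implementation:

```python
def facade_mission(home, waypoints, cam_pitch=0.0):
    """Build a command sequence in the Table 3 vocabulary: set home, then for
    each waypoint fly there, orient the camera, and trigger the FLIR Duo R.
    `home` and each waypoint are (latitude, longitude, altitude) tuples."""
    cmds = [("HOME", home)]
    for wp in waypoints:
        cmds.append(("WAYPOINT", wp))
        cmds.append(("CAMROT", (0.0, cam_pitch, 0.0)))
        cmds.append(("SHOT", None))
    cmds.append(("LAND", None))
    return cmds
```

A ground station such as Mission Planner would then translate each symbolic command into the corresponding autopilot message.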

The trajectory generated by the software lies on the plane along the vertical direction of the line connecting the initial and the final position. The positions are defined in geographic coordinates. The maximum height of the trajectory is defined in meters with respect to a ground offset in order to avoid an undesired hovering effect. If the camera field of view (FOV) and the acquisition distance are known in advance, it is possible to derive the waypoint distance based on the desired overlap percentage between neighboring images. Otherwise, the software allows setting the vertical and horizontal traverse steps manually. In the case of a visible-thermal stereo system with different FOVs (e.g., the FLIR Duo R), to guarantee full coverage by both sensors, the FOV used to plan the trajectory should be the smaller one.
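The waypoint spacing implied by a desired overlap follows from simple footprint geometry: with acquisition distance d and field of view θ, the image footprint is 2·d·tan(θ/2) and the step between shots is footprint × (1 − overlap). A sketch (the FOV numbers are illustrative, not FLIR specifications):

```python
import math

def waypoint_step(fov_deg, distance_m, overlap):
    """Spacing between neighboring shots for a given image overlap fraction.
    footprint = 2 * d * tan(FOV / 2); step = footprint * (1 - overlap)."""
    footprint = 2.0 * distance_m * math.tan(math.radians(fov_deg) / 2.0)
    return footprint * (1.0 - overlap)

# With a dual visible-thermal camera, plan with the smaller of the two FOVs
# so that both sensors keep full coverage of the facade.
step = waypoint_step(min(57.0, 45.0), 10.0, 0.80)
```

The same function applies to both the horizontal and vertical traverse steps, using the horizontal and vertical FOV respectively.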

3. Results and Discussion

The goal of this work was to propose an effective and simple-to-use workflow for 3D thermography of buildings by exploiting dual visible-thermal sensors mounted on Unmanned Aerial Vehicles. The method has been validated using only the FLIR Duo R model because, even though the FLIR C2 is also a dual visible-thermal camera, the limited size of its visible sensor does not provide enough spatial resolution for aerial 3D reconstruction. In fact, to obtain a dense 3D reconstruction, the imaging resolution must be high enough to capture the texture of the reconstructed surface. At a typical distance of 5 m from the building, the FLIR C2 cannot guarantee a good enough reconstruction; to be more specific, the size of the image cell at the object plane, at a distance of 5 m, is about 8 mm for the FLIR C2 and 4 mm for the Duo R, corresponding to a minimum resolved detail of 16 and 8 mm, respectively, according to Nyquist. Moreover, as the FLIR Duo R has been specifically designed to be carried by a drone, it has a ready-to-use interface with the MAVLink drone communication system.

3.1. Experimental Setup

The proposed solution was experimentally validated in a noncontrolled environment by performing the outdoor 3D reconstruction of a building under restoration. The image acquisition was done near the city of Verona (Italy) in mid-April 2019, with cloudy conditions and atmospheric temperature between 9 and 13 °C. The UAV used was a custom-made quadrotor controlled through a Pixhawk running ArduPilot, an open-source project based on the Arduino framework. The quadrotor was also provided with an electronic gimbal to control the rotation of the vision system. The UAV flight plan was made using Pyfplanner and uploaded to the UAV controller using Mission Planner.
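The image-cell figures quoted above follow from pinhole geometry: the cell size at the object plane is the image footprint divided by the pixel count, and by Nyquist the minimum resolved detail spans two cells. A sketch with generic parameters (not the exact FLIR C2 or Duo R specifications):

```python
import math

def image_cell_mm(h_fov_deg, distance_m, width_px):
    """Size of one image cell at the object plane, in millimetres:
    footprint / pixel count, with footprint = 2 * d * tan(FOV / 2)."""
    footprint_m = 2.0 * distance_m * math.tan(math.radians(h_fov_deg) / 2.0)
    return 1000.0 * footprint_m / width_px

def min_resolved_detail_mm(h_fov_deg, distance_m, width_px):
    """Nyquist criterion: the smallest resolvable detail spans two image cells."""
    return 2.0 * image_cell_mm(h_fov_deg, distance_m, width_px)
```

Plugging in each camera's visible-sensor FOV and resolution reproduces the kind of comparison made above between the C2 and the Duo R.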
The trajectory was designed to provide a sequence of vertical and horizontal images with 80% overlap, a façade distance of 10 m for safety reasons, and a maximum height of 8 m. The acquired dataset is composed of 37 pairs of visible and thermal images (an example is shown in Figure 8). The whole measurement process took 5 min. The 3D reconstruction and the thermal mapping ran on an i7 8700 with 32 GB of RAM and took about 15 min to accomplish the dense reconstruction.

Figure 8. Images from the dataset. Picture of the building in the visible band (left) and same view captured in the thermal band (middle and right). The thermal range is between 3 and 14 °C.

3.2. 3D Reconstruction

The resulting 3D reconstruction is shown in Figure 9, and even without a quantitative evaluation of the mapping accuracy, it can be seen how the proposed method is able to map the thermal information over the 3D reconstruction with acceptable accuracy.


Figure 9. Different views of the building from the aerial 3D reconstruction in the visible band (left column) and thermal band (right column). The thermal range of the 3D reconstruction was adjusted between 11 and 13.5 °C.

4. Conclusions

3D thermography can be an unparalleled tool for building diagnostics. When used from drones, it allows safe inspection of parts that are difficult to reach or that would be difficult to examine in any other way (such as roofs). Applications range from structural or maintenance diagnostics to the investigation of large archaeological sites and energy audits. 3D thermography, generally associated with a 3D representation of the building in the visible band, has the key feature of allowing an accurate location of the thermal map.

In this article, we have proposed a simple system to realize aerial 3D thermography of buildings. The system consists of a single device, which takes 2D images simultaneously in the visible and long infrared bands. After calibration, it is possible to reconstruct the 3D model in the visible band with SFM techniques and then add the thermal information. The system has been validated during a real measurement campaign from a drone on a civil building.

Laser scanning can provide a great amount of data, in the form of a point cloud dataset, but the instrumentation is costly and requires a highly skilled operator. LiDAR (Light Detection and Ranging) instruments can also provide 3D data with high spatial precision but, once again, at a high cost. Although the obtained results cannot compete with those provided by these more sophisticated instruments, we consider the performance of the proposed simple and cost-effective system very promising for the continual monitoring of historical buildings and 3D objects, e.g., statues.

Author Contributions: Conceptualization, C.D., R.M. and D.A.; methodology, R.M., N.G. and N.P.; software, N.G. and N.P.; investigation, N.G. and N.P.; writing—original draft preparation, D.A. and T.d.R.; writing—review and editing, C.D., R.M. and T.d.R.; supervision, C.D., R.M. and D.A.; funding acquisition, R.M. and C.D. All authors have read and agreed to the published version of the manuscript.

Funding: This research was partially funded by POR FSE 2014–2020 Regione del Veneto, Project Title: Sviluppo di tecnologie basate su immagini termiche per il monitoraggio indoor and outdoor, Project Code: 1695-29-2216-2016, and by MIUR project "Department of Excellence" 2018–2020.

Conflicts of Interest: The authors declare no conflict of interest. The funders had no role in the design of the study; in the collection, analyses, or interpretation of data; in the writing of the manuscript, or in the decision to publish the results.

References

1. Song, Z. Handbook of 3D machine vision: Optical metrology and imaging. In Optics and Optoelectronics; CRC Press: Boca Raton, FL, USA, 2013; ISBN 9781439872192.
2. Vollmer, M.; Möllmann, K.-P. Infrared Thermal Imaging: Fundamentals, Research and Applications, 2nd ed.; Wiley-VCH Verlag GmbH & Co. KGaA: Weinheim, Germany, 2018; ISBN 9783527693320.
3. Rakha, T.; Gorodetsky, A. Review of Unmanned Aerial System (UAS) applications in the built environment: Towards automated building inspection procedures using drones. Autom. Constr. 2018, 93, 252–264. [CrossRef]
4. Garrido, I.; Lagüela, S.; Otero, R.; Arias, P. Thermographic methodologies used in infrastructure inspection: A review—Post-processing procedures. Appl. Energy 2020, 266, 114857. [CrossRef]
5. Huang, Y.; Shih, P.; Hsu, K.-T.; Chiang, C.-H. To identify the defects illustrated on building facades by employing infrared thermography under shadow. NDT E Int. 2020, 111, 102240. [CrossRef]
6. Teni, M.; Krstić, H.; Kosiński, P. Review and comparison of current experimental approaches for in-situ measurements of building walls thermal transmittance. Energy Build. 2019, 203, 109417. [CrossRef]
7. Bienvenido-Huertas, D.; Moyano, J.; Marín, D.; Fresco-Contreras, R. Review of in situ methods for assessing the thermal transmittance of walls. Renew. Sustain. Energy Rev. 2019, 102, 356–371. [CrossRef]
8. Soares, N.; Martins, C.; Gonçalves, M.; Santos, P.; da Silva, L.S.; Costa, J.J. Laboratory and in-situ non-destructive methods to evaluate the thermal transmittance and behavior of walls, windows, and construction elements with innovative materials: A review. Energy Build. 2019, 182, 88–110. [CrossRef]
9. Glavaš, H.; Hadzima-Nyarko, M.; Buljan, I.H.; Barić, T. Locating Hidden Elements in Walls of Cultural Heritage Buildings by Using Infrared Thermography. Buildings 2019, 9, 32. [CrossRef]
10. Royuela-del-Val, A.; Padilla-Marcos, M.Á.; Meiss, A.; Casaseca-de-la-Higuera, P.; Feijó-Muñoz, J. Air infiltration monitoring using thermography and neural networks. Energy Build. 2019, 191, 187–199. [CrossRef]
11. Nardi, I.; Lucchi, E.; de Rubeis, T.; Ambrosini, D. Quantification of heat energy losses through the building envelope: A state-of-the-art analysis with critical and comprehensive review on infrared thermography. Build. Environ. 2018, 146, 190–205. [CrossRef]
12. Kirimtat, A.; Krejcar, O. A review of infrared thermography for the investigation of building envelopes: Advances and prospects. Energy Build. 2018, 176, 390–406. [CrossRef]
13. Baldinelli, G.; Bianchi, F.; Rotili, A.; Costarelli, D.; Seracini, M.; Vinti, G.; Asdrubali, F.; Evangelisti, L. A model for the improvement of thermal bridges quantitative assessment by infrared thermography. Appl. Energy 2018, 211, 854–864. [CrossRef]
14. Lucchi, E. Applications of the infrared thermography in the energy audit of buildings: A review. Renew. Sust. Energy Rev. 2018, 82, 3077–3090. [CrossRef]
15. Lerma, C.; Barreira, E.; Almeida, R.M.S.F. A discussion concerning active infrared thermography in the evaluation of buildings air infiltration. Energy Build. 2018, 168, 56–66. [CrossRef]
16. O'Grady, M.; Lechowska, A.A.; Harte, A.M. Quantification of heat losses through building envelope thermal bridges influenced by wind velocity using the outdoor infrared thermography technique. Appl. Energy 2017, 208, 1038–1052. [CrossRef]
17. Barreira, E.; Almeida, R.M.S.F.; Moreira, M. An infrared thermography passive approach to assess the effect of leakage points in buildings. Energy Build. 2017, 140, 224–235. [CrossRef]
18. Fox, M.; Goodhew, S.; De Wilde, P. Building defect detection: External versus internal thermography. Build. Environ. 2016, 105, 317–331. [CrossRef]
19. Nardi, I.; Paoletti, D.; Ambrosini, D.; de Rubeis, T.; Sfarra, S. U-value assessment by infrared thermography: A comparison of different calculation methods in a Guarded Hot Box. Energy Build. 2016, 122, 211–221. [CrossRef]

20. Djupkep Dizeu, F.; Maldague, X.; Bendada, A. Mapping of the Indoor Conditions by Infrared Thermography. J. Imaging 2016, 2, 10. [CrossRef]
21. Barreira, E.; Almeida, R.M.S.F.; Delgado, J.M.P.Q. Infrared thermography for assessing moisture related phenomena in building components. Constr. Build. Mater. 2016, 110, 251–269. [CrossRef]
22. Sfarra, S.; Marcucci, E.; Ambrosini, D.; Paoletti, D. Infrared exploration of the architectural heritage: From passive infrared thermography to hybrid infrared thermography (HIRT) approach. Mater. Constr. 2016, 66, e094. [CrossRef]
23. Fox, M.; Coley, D.; Goodhew, S.; De Wilde, P. Time-lapse thermography for building defect detection. Energy Build. 2015, 92, 95–106. [CrossRef]
24. Albatici, R.; Tonelli, A.M.; Chiogna, M. A comprehensive experimental approach for the validation of quantitative infrared thermography in the evaluation of building thermal transmittance. Appl. Energy 2015, 141, 218–228. [CrossRef]
25. Kylili, A.; Fokaides, P.A.; Christou, P.; Kalogirou, S.A. Infrared thermography (IRT) applications for building diagnostics: A review. Appl. Energy 2014, 134, 531–549. [CrossRef]
26. Nardi, I.; Sfarra, S.; Ambrosini, D. Quantitative thermography for the estimation of the U-value: State of the art and a case study. J. Phys. Conf. Ser. 2014, 547, 012016. [CrossRef]
27. Krankenhagen, R.; Maierhofer, C. Pulse phase thermography for characterising large historical building façades after solar heating and shadow cast—A case study. Quant. Infrared Thermogr. J. 2014, 11, 10–28. [CrossRef]
28. Paoletti, D.; Ambrosini, D.; Sfarra, S.; Bisegna, F. Preventive thermographic diagnosis of historical buildings for consolidation. J. Cult. Herit. 2013, 14, 116–121. [CrossRef]
29. Dall'O', G.; Sarto, L.; Panza, A. Infrared Screening of Residential Buildings for Energy Audit Purposes: Results of a Field Test. Energies 2013, 6, 3859–3878. [CrossRef]
30. Martínez-Corral, M.; Javidi, B. Fundamentals of 3D imaging and displays: A tutorial on integral imaging, light-field, and plenoptic systems. Adv. Opt. Photonics 2018, 10, 512. [CrossRef]
31. Remondino, F.; El-Hakim, S. Image-based 3D modelling: A review. Photogramm. Rec. 2006, 21, 269–291. [CrossRef]
32. Santos, P.; Ritz, M.; Fuhrmann, C.; Fellner, D. 3D mass digitization: A milestone for archaeological documentation. Virtual Archaeol. Rev. 2017, 8, 1. [CrossRef]
33. Aicardi, I.; Chiabrando, F.; Maria Lingua, A.; Noardo, F. Recent trends in cultural heritage 3D survey: The photogrammetric computer vision approach. J. Cult. Herit. 2018, 32, 257–266. [CrossRef]
34. Markiewicz, J.; Tobiasz, A.; Kot, P.; Muradov, M.; Shaw, A.; Al-Shammaa, A. Review of Surveying Devices for Structural Health Monitoring of Cultural Heritage Buildings. In Proceedings of the 2019 12th International Conference on Developments in eSystems Engineering (DeSE), Kazan, Russia, 7–10 October 2019; pp. 597–601.
35. Chernov, G.; Chernov, V.; Barboza Flores, M. 3D dynamic thermography system for biomedical applications. In Application of Infrared to Biomedical Sciences; Ng, E.Y.K., Etehadtavakol, M., Eds.; Springer: Singapore, 2017; pp. 517–545.
36. Nardi, I.; Sfarra, S.; Ambrosini, D.; Dominici, D.; Rosciano, E. Complementarity of terrestrial laser scanning and IR thermography for the diagnosis of cultural heritage: The case of Pacentro Castle. In Proceedings of the 13th International Workshop on Advanced Infrared Technology & Applications, Pisa, Italy, 29 September–2 October 2015; pp. 263–266, ISBN 9788879580250.
37. Wang, C.; Cho, Y.K.; Gai, M. As-Is 3D Thermal Modeling for Existing Building Envelopes Using a Hybrid LIDAR System. J. Comput. Civ. Eng. 2013, 27, 645–656. [CrossRef]
38. González-Aguilera, D.; Rodriguez-Gonzalvez, P.; Armesto, J.; Lagüela, S. Novel approach to 3D thermography and energy efficiency evaluation. Energy Build. 2012, 54, 436–443. [CrossRef]
39. Natephra, W.; Motamedi, A.; Yabuki, N.; Fukuda, T. Integrating 4D thermal information with BIM for building envelope thermal performance analysis and thermal comfort evaluation in naturally ventilated environments. Build. Environ. 2017, 124, 194–208. [CrossRef]
40. Ceccarelli, S.; Guarneri, M.; Ferri de Collibus, M.; Francucci, M.; Ciaffi, M.; Danielis, A. Laser Scanners for High-Quality 3D and IR Imaging in Cultural Heritage Monitoring and Documentation. J. Imaging 2018, 4, 130. [CrossRef]
41. Kirimtat, A.; Krejcar, O.; Selamat, A.; Herrera-Viedma, E. FLIR vs SEEK thermal cameras in biomedicine: Comparative diagnosis through infrared thermography. BMC Bioinform. 2020, 21, 88. [CrossRef]

42. Jaspers, M.E.H.; Carrière, M.E.; Meij-de Vries, A.; Klaessens, J.H.G.M.; van Zuijlen, P.P.M. The FLIR ONE thermal imager for the assessment of burn wounds: Reliability and validity study. Burns 2017, 43, 1516–1523. [CrossRef]
43. García-Tejero, I.; Ortega-Arévalo, C.; Iglesias-Contreras, M.; Moreno, J.; Souza, L.; Tavira, S.; Durán-Zuazo, V. Assessing the Crop-Water Status in Almond (Prunus dulcis Mill.) Trees via Thermal Imaging Camera Connected to Smartphone. Sensors 2018, 18, 1050. [CrossRef]
44. Yang, M.-D.; Su, T.-C.; Lin, H.-Y. Fusion of Infrared Thermal Image and Visible Image for 3D Thermal Model Reconstruction Using Smartphone Sensors. Sensors 2018, 18, 2003. [CrossRef]
45. Daffara, C.; Marchioro, G.; Ambrosini, D. Smartphone diagnostics for cultural heritage. In SPIE 11058, Optics for Arts, Architecture, and Archaeology VII; Liang, H., Groves, R., Targowski, P., Eds.; SPIE: Munich, Germany, 2019; pp. 260–270. [CrossRef]
46. Somboonkaew, A.; Vuttivong, S.; Prempree, P.; Amarit, R.; Chanhorm, S.; Chaitavon, K.; Porntheeraphat, S.; Sumriddetchkajorn, S. Temperature-compensated infrared-based low-cost mobile platform module for mass human temperature screening. Appl. Opt. 2020, 59, E112. [CrossRef]
47. FLIR C2 Datasheet. Available online: https://flir.netx.net/file/asset/7595/original (accessed on 20 June 2020).
48. FLIR Duo R Datasheet. Available online: https://flir.netx.net/file/asset/10905/original (accessed on 22 June 2020).
49. Rangel, J.; Soldan, S.; Kroll, A. 3D Thermal Imaging: Fusion of Thermography and Depth Cameras. In Proceedings of the 2014 International Conference on Quantitative InfraRed Thermography, QIRT Council, Bordeaux, France, 7–11 July 2014.
50. Matas, J.; Chum, O.; Urban, M.; Pajdla, T. Robust wide-baseline stereo from maximally stable extremal regions. Image Vis. Comput. 2004, 22, 761–767. [CrossRef]
51. Zhang, Z. Flexible camera calibration by viewing a plane from unknown orientations. In Proceedings of the 7th IEEE International Conference on Computer Vision, Kerkyra, Greece, 20–27 September 1999; Volume 1, pp. 666–673.
52. Bolles, R.C.; Baker, H.H.; Marimont, D.H. Epipolar-plane image analysis: An approach to determining structure from motion. Int. J. Comput. Vis. 1987, 1, 7–55. [CrossRef]
53. Ullman, S. The interpretation of structure from motion. Proc. R. Soc. Lond. B 1979, 203, 405–426. [CrossRef] [PubMed]
54. Koenderink, J.J.; van Doorn, A.J. Affine structure from motion. J. Opt. Soc. Am. A 1991, 8, 377. [CrossRef] [PubMed]
55. Westoby, M.J.; Brasington, J.; Glasser, N.F.; Hambrey, M.J.; Reynolds, J.M. 'Structure-from-Motion' photogrammetry: A low-cost, effective tool for geoscience applications. Geomorphology 2012, 179, 300–314. [CrossRef]
56. Schonberger, J.L.; Frahm, J.-M. Structure-from-Motion Revisited. In Proceedings of the 2016 IEEE Conference on Computer Vision and Pattern Recognition (CVPR), Las Vegas, NV, USA, 27–30 June 2016; pp. 4104–4113.
57. Özyeşil, O.; Voroninski, V.; Basri, R.; Singer, A. A survey of structure from motion. Acta Numer. 2017, 26, 305–364. [CrossRef]
58. Saputra, M.R.U.; Markham, A.; Trigoni, N. Visual SLAM and Structure from Motion in Dynamic Environments: A Survey. ACM Comput. Surv. 2018, 51, 1–36. [CrossRef]
59. Hartley, R.; Zisserman, A. Multiple View Geometry in Computer Vision, 2nd ed.; Cambridge University Press: Cambridge, UK, 2004; ISBN 9780521540513.
60. Moulon, P.; Monasse, P.; Perrot, R.; Marlet, R. OpenMVG: Open multiple view geometry. In International Workshop on Reproducible Research in Pattern Recognition; Springer: Cham, Switzerland, 2016; pp. 60–74.
61. Alcantarilla, P.F.; Nuevo, J.; Bartoli, A. Fast explicit diffusion for accelerated features in nonlinear scale spaces. IEEE Trans. Patt. Anal. Mach. Intell. 2011, 34, 1281–1298.

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).