
PHOTOGRAMMETRY AS A SURVEYING TECHNIQUE

APPLIED TO HERITAGE CONSTRUCTIONS RECORDING - ADVANTAGES AND LIMITATIONS

VOLUME II

João Ricardo Neff Valadares Gomes Covas

Scientific Dissertation for obtaining the Master's Degree in

ARCHITECTURE

Supervision

Assistant Professor Luis Miguel Cotrim Mateus

Assistant Professor Victor Manuel Mota Ferreira

Jury Composition

President: Assistant Professor Jorge Manuel Tavares Ribeiro

Member: Assistant Professor Cristina Delgado Henriques

FINAL DOCUMENT

Lisbon, FAUL, December 2018


INDEX OF CONTENTS

Index of Contents ...... i
Index of Figures ...... v
Index of Tables ...... vii
ANNEX 1 – ADDITIONAL TEXTS – STATE OF THE ART ...... 1
1.1. The Evolution of Photography and Photogrammetry ...... 1
1.1.1. Genesis ...... 1
1.1.2. The Four Phases of Photogrammetry ...... 2
1.2. Light is the Essence ...... 9
1.2.1. Visible Light and the Electromagnetic Spectrum ...... 10
1.2.2. Types of Materials and Types of Reflection ...... 11
1.2.3. Lighting, Atmospheric, and Material Conditions to Record Scenes ...... 13
1.3. Camera Controls ...... 17
1.3.1. The Pinhole Camera Model ...... 18
1.3.1.1. Intentional Refraction in Lenses ...... 20
1.3.1.2. The Effects of Focal Length on Ground Sampling Distance ...... 22
1.3.1.3. The Effects of Focal Length on Perspective ...... 24
1.3.1.4. The Field of View and Crop Factor ...... 26
1.3.2. Aperture ...... 27
1.3.2.1. Range of Focus or Depth of Field ...... 28
1.3.2.2. Infinity Focus and Hyperfocal Distance ...... 29
1.3.3. The Shutter ...... 31
1.3.3.1. Global Shutter vs Rolling Shutter ...... 32
1.3.4. Viewfinder and Image Focusing ...... 33
1.3.4.1. Viewfinder to Frame and Focus in Photogrammetry ...... 35
1.3.5. ...... 36
1.3.6. Dynamic Range, Scene Dynamic Range, and Camera Dynamic Range ...... 36
1.3.6.1. Dynamic Range in Photogrammetry ...... 38
1.4. Types of Photographic Cameras ...... 41
1.4.1. Compact Cameras ...... 41
1.4.2. Bridge Compact Cameras ...... 41


1.4.3. Digital Single-Lens-Reflex Camera and Single-Lens-Reflex Camera ...... 43
1.4.4. Mirrorless Interchangeable Lens Camera ...... 43
1.4.5. Smartphones ...... 44
1.4.6. Action Cameras or Sports Cameras ...... 44
1.4.7. Types of Cameras and Platforms in Photogrammetry ...... 45
1.5. Types of Lenses ...... 47
1.5.1. Prime Lenses vs Zoom Lenses ...... 47
1.5.2. The Five Types of Lenses ...... 48
1.5.3. Macro Lenses ...... 48
1.5.4. Wide-Angle Lenses ...... 49
1.5.5. Fisheye Lenses ...... 49
1.5.6. Normal Lenses ...... 51
1.5.7. Telephoto Lenses ...... 51
1.6. Brief Introduction to Imaging Sensors ...... 53
1.6.1. Imaging Sensors and Outputs ...... 53
1.6.2. Sensors of Digital Photographic Cameras ...... 55
1.6.3. Color Separation Methods ...... 55
1.6.4. Formats of Imaging Sensors ...... 56
1.7. Bibliography ...... 59
ANNEX 2 – DATA OF THE CASE STUDIES ...... 65
2.1. Equipment ...... 65
2.2. Photographic Recording of the Wall in Paço D'Arcos and Calibration ...... 67
2.3. Photographic Recording of the Medallions of FAUL ...... 69
2.4. Photographic Calibration of GoPro ...... 71
2.5. Photographic Recording of the Castle of S. Jorge in Lisbon ...... 73
2.6. Photographic Recording of the Castle of Sesimbra ...... 75
2.7. Photographic Recording of the Convento dos Capuchos in Sintra ...... 97
2.8. Photographic Recording of the Igreja de Stº André in Mafra ...... 103
2.9. Terrestrial Laser Scanning of the Castle of the Convent of Christ ...... 111
2.10. GPS Recording of the Castle of Tomar ...... 117
2.11. First Photographic Recording of the Castle of Tomar ...... 129
2.12. Second Photographic Recording of the Castle of Tomar ...... 133
2.13. Python Scripts – SkyRemover Buildings ...... 197


2.14. Python Scripts – SkyRemover Vegetation ...... 199
2.15. Python Scripts – Contour Lines ...... 201
2.16. Python Scripts – Process Plan ...... 203
2.17. Python Scripts – Process Section Cuts ...... 207


INDEX OF FIGURES

Figure 1: The four phases of photogrammetry (Schenk, 2005)...... 3

Figure 2: Example of graphical photogrammetry...... 4

Figure 3: Classification of sensing systems (Schenk, 2005)...... 7

Figure 5: Representation of a portion of the electromagnetic spectrum, from ultraviolet to infrared (Konecny, 2003)...... 9

Figure 6: The ways light can behave when interacting with objects...... 12

Figure 7: Soft light vs hard light. Soft light is preferred in photogrammetry (Langford, 2002)...... 14

Figure 8: The human eye (Konecny, 2003)...... 18

Figure 9: The pinhole camera model (Langford, 2002)...... 19

Figure 10: Example of a lens with several optic elements and the refraction effect represented in dashed lines (Docci & Maestri, 2005)...... 21

Figure 11: The effect on the FOV by changing the focal length. Camera position 1 uses short focal length while camera position 2 uses a longer focal length (Linder, 2005)...... 23

Figure 12: The effects of changing the focal length. Left image: cropped fisheye image captured with a Samyang 8mm fisheye lens. Right image: captured with a 25mm focal length...... 24

Figure 13: The effect on perspective by varying the focal length. See figure 5 for reference of camera positions (Linder, 2005)...... 25

Figure 14: The FOV and crop factor while using the same camera from the same POV...... 26

Figure 15: The effect of adjusting aperture. The left image was acquired with a longer focal length while the right image was acquired with a shorter focal length, resulting in a blurrier background in the left image...... 29

Figure 16: Fisheye image taken from the Castle of Sesimbra. Every object is in focus regardless of camera settings...... 30

Figure 4: The effects of using global shutter vs rolling shutter. Top image used global shutter and bottom image used rolling shutter...... 32

Figure 5: Optical viewfinder of a SLR camera. Pentaprism represented next to position A (Langford, 2002) ...... 34

Figure 17: An example in which scene dynamic range is far greater than the camera dynamic range. Left image: camera dynamic range adjusted for the interior space. Right image: Clipping of highlights and shadows...... 37


Figure 18: Types of cameras. From top left to bottom right: Point-and-shoot camera (Nikon Coolpix A300); Bridge compact camera (Nikon Coolpix L340); DSLR camera (Nikon D3100); MILC camera (Nikon Z7); Smartphone (Huawei Mate 10 Lite); Sports camera (GoPro)...... 42

Figure 19: Various ways to photograph. From the top left to the bottom right: terrestrial photogrammetry using a telescopic pole, handheld, and a tripod; lastly, aerial photogrammetry using a UAV (Fiorillo, Jiménez Fernández-Palacios, Remondino, & Barba, 2013)...... 46

Figure 21: Comparison of the FOV of a full frame fisheye lens (from a Samyang 8mm lens) with a wide-angle lens (represented within the red limits - Nikkor 18mm)...... 48

Figure 22: Example of a fisheye image. Observe the curved horizon line...... 50

Figure 23: From left to right. Samyang 8mm fisheye lens; Wide-angle lens AF Nikkor 20mm f/2.8D; and telephoto lens AF-P DX Nikkor 70-300mm f/4.5-6.3G ED VR...... 52

Figure 6: From left to right: imaging sensor with Bayer filter (Langford, 2002); Foveon X3 colour separation method (http://www.foveon.com/) ...... 56

Figure 7: Standard sizes of imaging sensors...... 57

Figure 14: Plan view of the terrestrial laser scanning of the Castle of Tomar...... 112

Figure 15: Plan view of the GPS recording with the position of the GPS points...... 118

Figure 16: First image of the pre-planning of the second recording of The Castle of Tomar. .. 134

Figure 17: Second image of the pre-planning of the second recording of The Castle of Tomar...... 135

Figure 18: Pix4D report from the first photogrammetric processing (spans 9 pages)...... 168

Figure 19: Pix4D report from the second photogrammetric processing (spans 14 pages)...... 182


INDEX OF TABLES

Table 2: Computers ...... 65

Table 3: Photographic equipment ...... 65

Table 4: Accessories ...... 66

Table 5: More survey equipment ...... 66

Table 6: Index table of the collected data. From the photographic recording of a wall in Paço D'Arcos...... 68

Table 7: Table with sample images of the collected data. Photographic recording of a wall in Paço D'Arcos ...... 68

Table 8: Index table of the collected data. From the photographic recording of the medallions of Faculty of Architecture...... 70

Table 9: Table with sample images of the collected data. From the photographic recording of the medallions of Faculty of Architecture...... 70

Table 10: Index table of the collected data. Photographic calibration of GoPro camera...... 72

Table 11: Table with sample images of the collected data. Photographic calibration of GoPro camera...... 72

Table 12: Index table of the collected data. From the photographic recording of The Castle of S. Jorge in Lisbon, Portugal...... 74

Table 13: Table with sample images of the collected data. From the photographic recording of The Castle of S. Jorge in Lisbon, Portugal...... 74

Table 14: Index table of the collected data. From the photographic recording of The Castle of Sesimbra, Portugal...... 76

Table 15: Table with detailed information about the collected data. From the photographic recording of The Castle of Sesimbra, Portugal...... 77

Table 16: Table with sample images of the collected data. From the photographic recording of The Castle of Sesimbra, Portugal...... 81

Table 17: Table with general information of the collected data. From the photographic recording of the Convento dos Capuchos in Sintra, Portugal...... 98

Table 18: Table with detailed information of the collected data. From the photographic recording of the Convento dos Capuchos in Sintra, Portugal...... 98

Table 19: Table with sample images of the collected data. From the photographic recording of the Convento dos Capuchos in Sintra, Portugal...... 99


Table 20: Index table of the collected data. From the photographic recording of the Igreja de Stº André in Mafra, Portugal...... 104

Table 21: Table with detailed information of the collected data. From the photographic recording of the Igreja de Stº André in Mafra, Portugal...... 105

Table 22: Table with sample images of the collected data. From the photographic recording of the Igreja de Stº André in Mafra, Portugal...... 106

Table 23: Index table of the collected data. Terrestrial laser scanning of The Castle of Tomar...... 112

Table 24: Table with detailed information of the collected data. Terrestrial laser scanning of The Castle of Tomar...... 113

Table 25: Table with the transformation matrix from the terrestrial laser scanning of The Castle of Tomar...... 114

Table 26: Index table of the collected data. GPS recording of The Castle of Tomar...... 118

Table 27: Table with sample images of the collected data. GPS recording of The Castle of Tomar...... 119

Table 28: Table with coordinates information from the GPS recording of The Castle of Tomar...... 126

Table 29: Index table and charts of the collected data. From the first photographic recording of The Castle of Tomar...... 130

Table 30: Table with sample images from the collected data. From the first photographic recording of The Castle of Tomar...... 131

Table 31: Index table and charts of the collected data. From the second photographic recording of The Castle of Tomar...... 136

Table 32: Table with detailed information of the collected data. From the second photographic recording of The Castle of Tomar...... 137

Table 33: Table with sample images of the collected data. From the second photographic recording of The Castle of Tomar...... 142

Table 34: Table with detailed information concerning the first photogrammetric processing with Pix4D software...... 166

Table 35: Table with detailed information concerning the exports from the first photogrammetric processing with Pix4D software...... 167

Table 36: Table with detailed information concerning the second photogrammetric processing with Pix4D software...... 177


Table 37: Table with detailed information concerning the exports from the second photogrammetric processing with Pix4D software...... 178


ANNEX 1 – ADDITIONAL TEXTS – STATE OF THE ART

1.1. THE EVOLUTION OF PHOTOGRAPHY AND PHOTOGRAMMETRY

Photography and photogrammetry share common roots, so this section presents only a general overview of their historical evolution. Progress in photographic equipment, in equipment for photographic analysis, and in the underlying mathematics and geometry occurred mainly in the last two centuries, and the material, although interesting, is extensive. This evolution resulted from the contributions of many researchers, artists, professionals, and entrepreneurs whose main objective was to improve the efficacy and efficiency of photogrammetry and its representation techniques. Therefore, only the main contributions are briefly mentioned.

1.1.1. GENESIS

Although the term "photogrammetry" was adopted and promoted through one of Albrecht Meydenbauer's publications, in issue no. 49 (6 December 1867) of the "Wochenblatt des Architektenvereins zu Berlin" (Weekly Magazine of the Association of Architects in Berlin), the concept had already been anticipated in previous centuries. One of the first contributions came from Leonardo da Vinci in 1480 (Konecny, 2003; Mikhail, Bethel, & McGlone, 2001), when he stated that perspective is the projection of pyramidal lines from all objects, passing through a plane of transparent glass onto a surface, where the interpretation of the markings is done. Over the following centuries, scientists studied the mathematical laws governing projective geometry in order to create true-perspective drawings and to determine points in space from images. In 1883, the relationship between projective geometry and photogrammetry was formally established. Establishing this relationship was made possible by the study of photography, invented by Joseph Nicéphore Niépce in 1826, although impractical at the time because an exposure could take up to eight hours. Years later, in 1837, Louis Jacques Mandé Daguerre discovered a faster photographic method by using other materials to preserve the image, such as a sheet of silver-plated copper polished to a mirror finish and subjected to subsequent treatments. He named this method the "Daguerreotype". A few years later, in 1840, Dominique François Jean Arago, a French geodesist, suggested and attempted to promote the use of photogrammetry as a new surveying method based on the Daguerreotype (Albertz, 2007; Duerer, 1977).

As previously mentioned, the idea of applying photogrammetry as a survey technique predates Meydenbauer's 1867 publication promoting the term "photogrammetry". Other terms had been proposed earlier by Aimé Laussedat, a French military engineer who experimented with topographic mapping from imagery and is considered the Father of Photogrammetry, for he was the first to complete a topographic map using terrestrial photographs. Laussedat called the technique "Iconométrie", "Métrophotographie", "Photométrographie", or "Photographométrie". It was during one of his graphical restitutions that Meydenbauer, visited by the geographical explorer Dr Otto Kersten, was offered the term "Photogrammetrie" instead of "Photometrographie". After the publication of the paper "Die Photogrammetrie", the name was accepted worldwide, even though the editors of the journal noted that the new term, while better, was not completely satisfying (Albertz, 2007).

For the purpose of this dissertation, which is essentially the survey of cultural heritage buildings that contain, in almost all cases, hard-to-reach areas, narrow spaces, and other peculiar situations that complicate surveying operations, it is crucial to quote:

«Furthermore, Meydenbauer saw that architectural objects could only be appropriately covered by the use of wide-angle lenses. […] In these tests the method proved to be a success, and Meydenbauer could now provide evidence that photogrammetry is suited for architectural surveys and for topographic data acquisition as well.» (Albertz, 2007)

«[…] the first wide-angle lens used for mapping – 105° Pantoskop lens.» (Duerer, 1977)

Moreover, the current thesis demonstrates the essential need for ultra-wide-angle lenses, with fields of view of up to 180°, to acquire information on architectural objects.

1.1.2. THE FOUR PHASES OF PHOTOGRAMMETRY

From here on, for simplicity of exposition, the evolution of photogrammetry is segmented into four phases, as suggested by Albertz and Wiedemann (Albertz & Wiedemann, 1996). Each phase corresponds to a cycle of approximately 50 years. The first phase, Graphical Photogrammetry, goes from 1850 to 1900; the second phase, Analogue Photogrammetry, runs from 1900 to the present; the third phase, Analytical Photogrammetry, runs from 1960 to the present; and the last phase, Digital Photogrammetry, is currently favoured and increasing in popularity. See Figure 1.

Figure 1: The four phases of photogrammetry (Schenk, 2005).

It is important to note that survey methods changed little in the 100 years leading up to 1900, although surveying instruments with higher levels of accuracy and precision were introduced and delivered better outputs. The absence of new or alternative surveying methods was reflected in the overall poor state of topographic mapping. Topographic mapping was, in fact, a pressing need for the development of countries, since it provided information regarding land usage, internal communications, and so on (Albertz & Wiedemann, 1996; Albertz, 2007).

Starting with the first phase, graphical photogrammetry, as the name suggests, takes advantage of the geometric laws underlying the formation of images, making it possible to produce graphical restitutions. See Figure 2. In this type of photogrammetry, large-format, high-quality images are desired to increase precision and accuracy when deriving measurements. The Royal Prussian Photogrammetric Institution, founded by Meydenbauer, applied this survey technique mostly to architectural constructions. Major contributors include: i) Aimé Laussedat, who investigated the use of kites and balloons to acquire aerial imagery and was the first person to capture an aerial image from a balloon, thus opening new doors for the use of photography in other situations, such as military purposes; however, owing to the arduousness of capturing enough photographs to cover the intended survey area, Laussedat decided to cease further experiments; ii) Gaspard Félix Tournachon (Nadar), who successfully recorded aerial photographs at 80 metres of height from a balloon and was requested to provide aerial photographs for military purposes (nowadays, a nadir image is synonymous with vertical or zenith images); iii) Paolo Ignazio Pietro Porro, who focused on improving image quality by combining several lens elements, developed ways of removing lens distortions, and invented the tacheometer; iv) Franz Stolze, who discovered the floating mark, one of the key elements in the advance to analogue photogrammetry, for it allows measurements to be performed from stereoscopic imagery. The floating mark is a point located in three-dimensional space and is used as a reference in a stereoscopic project (Duerer, 1977).

Regarding analogue photogrammetry, this phase is predominantly characterized by the introduction of numerous optical and mechanical plotters, developed by many historically notable contributors, to perform three-dimensional geometric reconstructions from two overlapping photographs (stereophotogrammetry or stereo vision). Specifically, the floating mark is controlled under stereoscopic vision, thus allowing contour lines or structural lines to be drawn. The reason for developing numerous plotters relates to the need to increase precision, decrease the time required to perform the drawings, and improve cost-effectiveness and user-friendliness, because operating these machines demanded numerous mathematical calculations and some level of willingness. The main purpose of such devices was topographic mapping.
The analogue phase was noted not only for the use of stereoscopic vision but also for the application of aerial photogrammetry, following the Wright brothers' achievement of aeroplane flight. Important contributors include: i) Édouard Deville, who invented the first stereoscopic-plotting instrument; ii) Theodor Scheimpflug, for introducing the concept of radial triangulation and for being the first to successfully map using aerial photographs; iii) Sebastian Finsterwalder, who, in 1899, began contributing mathematical operations, such as relative and absolute orientation, that are essential for analytical photogrammetry and for double-image photogrammetry; iv) Dr Carl Pulfrich, who designed the first stereocomparator and is referred to as the father of stereo-photogrammetry, even though Dr Henry George Fourcade developed a similar instrument during roughly the same period; v) Professor Reinhard Hugershoff, not only for inventing surveying and mapping instruments that accept vertical, oblique, and convergent imagery, but also for inventing a ring mount to correct drift when taking aerial photographs; vi) Otto von Gruber, who in 1924 introduced mathematical equations that remain essential to photogrammetry today, namely the six model points for clearing the parallax effect; however, they were not employed at the time because they required a prolonged amount of time to calculate; vii) Earl Church, also one of the main contributors, as his solutions for analytical photogrammetry, such as space resection, orientation, intersection, and rectification, influenced the photogrammetry of today (Duerer, 1977).

Figure 2: Example of graphical photogrammetry.

Other crucial developments during the analogue phase are prominent. For example, the invention of the gimbal to hold the photographic camera and reduce its oscillations during flights, a technology presently used on drones. Nevertheless, one of the greatest contributors to photogrammetry was Sherman Mills Fairchild, an entrepreneur who developed a rotary-blade shutter placed between the lens elements so as to sharpen and increase the quality of the images. Lastly, Dimitri Rebikoff demonstrated that photogrammetry can be operated in aquatic environments and invented the electronic flash.

Concerning analytical photogrammetry, this phase is marked by the advent of computers.
This technology made it possible to replace the expensive optical and mechanical elements of the analogue plotters and to greatly decrease the time needed to compute the mathematical equations. Consequently, analytical photogrammetry is a hybrid between analogue and digital photogrammetry. The main particularities of this phase include the development of several analytical plotters, the use of analytical triangulation, orthophoto projectors, and digital outputs such as DEMs (Digital Elevation Models) and digital maps. Of the several researchers and developers, apart from the inventors of the computer, particular emphasis is given to Duane Brown. Brown discovered new mathematical approaches for camera calibration and introduced bundle adjustment (BA), a method for calculating a solution by jointly considering the several mathematical equations involved in the photogrammetric process, such as the exterior and interior orientation parameters of the photographic camera, survey points, radial lens distortions, and self-calibration. Self-calibration overcomes conventional camera calibration, since every lens or photographic camera, even from the same manufacturer, contains its own small errors. In other words, it makes it possible to use regular photographic cameras instead of purpose-built photogrammetric cameras. Additionally, Brown contributed other solutions, including decentring distortion and principal point calibration (Duerer, 1977).

In the last phase, digital photogrammetry refers to a method where the input data is given in digital form instead of analogue, resulting in a fully digital photogrammetric workflow. The images are loaded onto the computer in digital form, either by collecting them with a digital photographic camera (mounted or not on a platform) or by digitizing analogue images. As stated by Konecny: «Digital photogrammetry combined with image processing techniques finally became the new tool to partially or fully automate point measurement, coordinate transformation, image matching […]» (Konecny, 2003). Digital images are referred to as soft-copy (digital images or digitized film) and analogue images as hard-copy (original film or printed digital images). See Figure 3.

Definitely, the photogrammetric techniques of today are, to some major extent, dissimilar to those of past centuries. Although the mathematical and geometrical principles underlying photogrammetry remain very much the same, the differences lie in how the outputs are produced: via software rather than via mechanical devices operated by professional photogrammetrists (Mikhail, Bethel, & McGlone, 2001). Digital photogrammetry also covers a wider domain than before, as images can be acquired not only by passive techniques, such as photography, but also by active techniques, such as radar imaging.
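Brown's distortion and self-calibration work still underlies modern calibration software. The following is a minimal, illustrative Python sketch of the Brown–Conrady model (radial terms k1, k2, k3 and decentring terms p1, p2); the coefficient values in the demo calls are arbitrary and not taken from any camera used in this work:

```python
def brown_conrady(x, y, k1, k2, k3, p1, p2):
    """Apply Brown-Conrady radial (k1, k2, k3) and decentring/tangential
    (p1, p2) distortion to a normalized image coordinate (x, y)."""
    r2 = x * x + y * y  # squared radial distance from the principal point
    radial = 1 + k1 * r2 + k2 * r2**2 + k3 * r2**3
    x_d = x * radial + 2 * p1 * x * y + p2 * (r2 + 2 * x * x)
    y_d = y * radial + p1 * (r2 + 2 * y * y) + 2 * p2 * x * y
    return x_d, y_d

# A point on the optical axis is never displaced, while an off-axis
# point drifts outwards when k1 > 0 (barrel-like distortion).
print(brown_conrady(0.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0))
print(brown_conrady(1.0, 0.0, 0.1, 0.0, 0.0, 0.0, 0.0))
```

In self-calibration, coefficients such as these are estimated together with the interior orientation during bundle adjustment, which is what allows ordinary consumer cameras to replace purpose-built metric cameras.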
That is, with the availability of low-cost, high-quality imaging systems and ever-increasing computing hardware resources, particularly in the past few years (Ferreira, 2011), 64-bit computer systems are widely used today, and abundant amounts of RAM make it possible to compute extremely large photographic datasets. The digital outputs are innumerable and include Digital Elevation Models (DEM), mesh models, TIN models, orthophotos, videos, CAD drawings, virtual reality (VR), and so on, in addition to the possibility of using such outputs in game engines or sharing them on the internet.

For the time being, digital photographic cameras, in addition to smartphones, have practically, if not completely, replaced analogue photographic and professional metric photogrammetric equipment. Furthermore, it may be stated that, owing to the high level of automatic processing, powerful Computer Vision algorithms, and advances in optics, electronics, and imaging, highly skilled photogrammetric specialists are no longer as necessary to exploit images to the maximum, although the underlying photogrammetric principles remain the same (Mikhail, Bethel, & McGlone, 2001). Indeed, current software advances allow the computation of non-calibrated photographs. Digital photogrammetry is a field of study interconnected with Computer Vision, Remote Sensing, and Geospatial Information Systems. Although many developments have been researched and new milestones reached, other technologies, such as LIDAR (laser scanning), acquire information with a high level of accuracy and precision and are likewise preferred for recording structures.

Figure 3: Classification of sensing systems (Schenk, 2005).


1.2. LIGHT IS THE ESSENCE

Light is the raw material for creating images: with no light, no image could ever be produced. Photographing is, in fact, the manipulation of light. If the user intends to acquire high-quality images, the principles that govern the behaviour of light must be studied. This subject belongs to physics, and once the basic principles of the interaction of light with the environment are understood, the user is equipped with the knowledge to achieve better results. This information relates directly to the operating mechanisms of photographic cameras, which allow light to be manipulated. In other words, a photographic camera is an instrument for interacting with light within a controlled environment, and consequently for collecting particular outputs. Visible light corresponds only to a rather narrow interval of the electromagnetic spectrum, from 400 up to 700 nanometres; the following information therefore concerns not only visible light but the electromagnetic spectrum as well.

For the previously stated reasons, and in order to execute a photogrammetric project with photographic equipment, it is fundamental to master the use of photographic cameras thoroughly, to the point that their limitations are known. Therefore, the concept of light is discussed first, answering elementary questions such as: what is light; what are its basic behavioural characteristics; and how does it interact with different materials and their associated phenomena. Subsequently, the purpose of creating a pinhole, and the use of a lens and chamber to produce images, is discussed. Lastly, the fundamental mechanisms for manipulating light, such as focal length, focusing, and aperture, are discussed, along with related concepts and the resulting image creation.

As aforementioned, light is the fundamental element of photography and vision. Without light, the surrounding environment is imperceptible. Light allows us to interact with external events.

Figure 4: Representation of a portion of the electromagnetic spectrum, from ultraviolet to infrared (Konecny, 2003).


In particular, light transports potential information, valuable as input to be interpreted by specific sensors, such as the human eye or the photosites on the sensor of a photographic camera. The output from the interaction of light and a sensor is an image, an "illusion" or "partial reality", since it is partial information captured from a specific range of the electromagnetic spectrum. Furthermore, this first output generated by the sensor is interpreted in a posterior phase to derive other types of information; there is a reading of a reading. It is crucial to manipulate not only the light travelling inside the body of the photographic camera but also, when possible, the light on the object.

1.2.1. VISIBLE LIGHT AND THE ELECTROMAGNETIC SPECTRUM The behaviour of light is summarized into four main characteristics (Langford, 2002): i) Light propagates as a wave. Different wavelengths are interpreted as different colours. See Figure 4. ii) Light propagates in a linear trajectory if travelling through a uniform medium. The linear trajectory is responsible for the formation of shadows. iii) Light propagates at a great speed, approximately at 300,000,000 m/s and it is known as the “speed of light”. The speed of light is affected by the substance it travels in. The denser the substance the slower the light travels. iv) Light also behaves as a particle, referred to as “photon”. For these reasons, changes occur to the materials when struck by light. For example, the photographic sensor takes advantage of this behaviour to register diverse information for the creation of images, such as light intensity, and red, green, and blue colours. Moreover, the harder the light, the more photons occupy per unit of volume. Other aspects concerning light must be considered to further comprehend the inner workings of a photographic camera. Regarding point i): As light propagates in the form of waves, and waves range from several hundreds of meters down to microns of length, wavelengths are arranged into what is referred as “The Electromagnetic Spectrum”. For example, radio waves have a low frequency, and gamma rays or x-rays have a high frequency, allowing the later ones to penetrate through diverse materials. Therefore, depending on the size of the wavelength, different behaviours are expected, both visibly and non-visibly. The electromagnetic spectrum, for intelligibility purposes, is segmented into several bands, each having a specific name and include a specific range of


wavelengths. Of all the known wavelengths, the human eye perceives merely a narrow band between 400nm and 700nm, referred to as visible light. Visible light is further segmented into channels corresponding to different colours: Red, Green, and Blue. See Figure 4. These three colours are the values captured by most digital photographic sensors, and the human eye is presumed to contain three types of cone-shaped receptors, each of which captures a specific channel. Interestingly, another peculiar characteristic of the behaviour of light is that when all the wavelengths of the visible band are approximately uniform in intensity, the human brain interprets the light as white (no colour, in fact). If one wavelength is prominent relative to the others, the light is interpreted as a different colour. Diverse combinations of wavelengths, or the absence of specific wavelengths, result in specific effects (Langford, 2002).

Concerning point ii): other behaviours are noted. For example, the harder the light, when projected directly on an object, the higher the contrast of the shadows and the better defined their contours. Conversely, if the light source is weak, or projects light diffusely, as when light reflects off other materials, the shadows are soft, displaying a range of grey tones.

Relative to points iii) and iv): the interaction of light, which can originate from incoherent sources (such as sunlight or incandescent lamps) or coherent ones (laser light, which is monochromatic, directional, and bright), with materials causes several possible outcomes depending on the properties of the materials, such as texture, colour, and shape, and on the properties of the incident light, such as angle of incidence, colour, and intensity (Vosselman & Maas, 2014).
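The segmentation of the visible band into colour channels can be illustrated with a short sketch. The band edges used below are approximate conventions assumed for illustration only, not exact physical boundaries.

```python
# Illustrative sketch: classify a wavelength (in nanometres) into the broad
# channels of the visible band discussed above. The band edges are
# approximate conventions assumed for this example, not exact boundaries.

def classify_wavelength(nm: float) -> str:
    if nm < 400:
        return "non-visible (ultraviolet or shorter)"
    if nm <= 500:
        return "blue channel"
    if nm <= 600:
        return "green channel"
    if nm <= 700:
        return "red channel"
    return "non-visible (infrared or longer)"

print(classify_wavelength(450))  # blue channel
print(classify_wavelength(550))  # green channel
print(classify_wavelength(650))  # red channel
```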

1.2.2. TYPES OF MATERIALS AND TYPES OF REFLECTION

Materials can be sorted into two main categories: a) opaque materials; and b) transparent or translucent materials. This distinction serves to explain the different effects clearly. In reality, every material reflects part of the incident light and absorbs the remaining part, which is transformed into heat, for instance. The quantity of absorbed light is strictly related to the opaqueness of the material: as a rule of thumb, the more opaque the material, the more absorption occurs. In other words, the darker the object, the warmer it becomes when compared to an object of lighter colour under the same conditions. For example, Langford (2002, p. 44) explains that a blue material absorbs all wavelengths but the blue one, which is reflected, and thus the object is perceived as blue. However, if the light is coloured, the material reflects a slightly different blue. Specifically, the output colour results from the interaction of the properties of the material with those of the


light. This is also the reason the sky displays different colours from sunrise to sunset; in this case, the main reason for the colours differing at different times of day is the angle of the incident light on the atmosphere. Another key aspect to bear in mind is the texture or type of finish of the object and its effect on light reflection. A rough or matte surface reflects light in a dispersive pattern, irrespective of the incident angle. This behaviour is referred to as "diffuse reflection" or "Lambertian reflection". Conversely, "specular reflection" occurs when a highly reflective surface reflects light mostly in one direction, depending on the incident angle. Examples of specular surfaces are mirrors, polished surfaces, and metallic paint. Regarding transparent or translucent materials, such as clear glass and textured glass respectively, most or part of the light travels through the material and exits in a different direction, depending on the shape of the object. This phenomenon is referred to as "refraction". In the case of translucent materials, light tends to diffuse more. As implicitly noted thus far, texture, colour, shape, and transparency or translucency all contribute to particular effects. Yet a key concept related to transparent and translucent materials must be discussed thoroughly: refraction. As previously mentioned, light travels at a constant speed through a substance, and if the density of the substance changes, so does the speed of light. The refraction phenomenon occurs most notably when light hits and penetrates the object at an oblique angle. While travelling through a medium of different density, the speed decreases or increases and the direction is adjusted according to that change of speed. The resulting effect is a distortion: for example, when a pencil is submerged in a glass filled with water, the submerged section of the pencil

Figure 5: The ways light can behave when interacting with objects.


appears disjointed. Refraction of the overall direction does not occur if the incident light is perpendicular to the surface of the object and the object's faces are parallel; in that case, "transmission" occurs. In short, refraction is a phenomenon in which light changes its direction of propagation when travelling from one substance to another, or through a substance of varying density. Another crucial aspect to consider is the distance of the light source to the object being lit. In general, the closer the object is to the light source, the more illuminated it is, and, knowing that light travels in a straight line, the intensity of light hitting a surface can be calculated. As a rule of thumb, a surface receives 4 times as much light whenever its distance to the light source is reduced by half. In other words, doubling the distance reduces light intensity by a factor of 4; that is, light intensity is inversely proportional to the square of the distance. This phenomenon must be considered inside enclosed spaces, because photographing identical objects at different distances from the light source produces distinct visual effects. If the intent is to collect images of objects with the same light intensity, either the objects must be placed at the same distance from the light source or more light sources must be installed. However, this effect is not noticeable in exterior spaces illuminated by natural light, for objects in exterior spaces are essentially at the same distance from the sun. Specifically, the distance from planet Earth to the sun is such that all relative distances between objects on Earth are irrelevant. Consequently, photographing exterior spaces results in images where surfaces receive roughly the same light intensity, except for local conditions such as weather, shadows, etc.
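The inverse-square relation described above can be sketched in a few lines; the distances below are illustrative.

```python
# Minimal sketch of the inverse-square law: the intensity of light on a
# surface is inversely proportional to the square of its distance to the
# light source. Distances are in arbitrary but consistent units.

def relative_intensity(distance: float, reference_distance: float = 1.0) -> float:
    """Intensity at `distance`, relative to the intensity at `reference_distance`."""
    return (reference_distance / distance) ** 2

# Doubling the distance cuts the intensity to one quarter:
print(relative_intensity(2.0))  # 0.25
# Halving the distance quadruples it:
print(relative_intensity(0.5))  # 4.0
```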

1.2.3. LIGHTING, ATMOSPHERIC, AND MATERIAL CONDITIONS TO RECORD SCENES

Knowledge of the effects of light on materials, and of the quality of the materials themselves, gives the user or surveyor the opportunity to analyse in advance the objects to be photographed, and to determine whether the resulting images will be within acceptable limits for photogrammetric processing or must be discarded due to low quality. For that reason, three aspects should be considered when analysing a scene: i) the atmospheric conditions; ii) the lighting conditions; and iii) the types of objects and materials (D'Ayala & Smars, 2003; Mateus, 2012). Regarding the atmospheric conditions, the ideal weather for recording sessions is a cloudy day, because objects and materials are evenly lit and most details, if not all, are visible and acquirable by the recording equipment. Per contra, the most undesirable conditions correspond to foggy and rainy days, and even days with intense sunlight. Such conditions not only may contribute to lower-quality images, because undesirable reflections are


more likely to happen, but also may have negative effects on the recording instruments, not to mention a negative impact on the surveyor. For instance, due to high temperatures, the operator may require more breaks, or the instruments may malfunction electronically. Regarding the lighting conditions, images must have enough quality to allow as many details as possible to be distinguished. Recording scenes on very sunny days, with a high amplitude between shadows and highlights, should be avoided, because the dynamic range of photographic cameras is lower than the dynamic range of the scene; in other words, details may not be acquired in very dark or very bright areas. In addition, evenly lit scenes have a higher probability of reflecting light diffusely, while scenes struck with hard light can produce undesired reflections, such as specular reflections. Even though, for human perception, 3D information is more visually appealing when the results present high contrast between dark and lit areas, for photogrammetric purposes homogeneous lighting is desired. In the case of interior spaces with no natural light, the lighting conditions may be overcome with the use of artificial lights. Regarding the conditions of the objects or materials, there is a myriad of outcomes as to how light behaves, owing to how texture and colour combine with the transparency, translucency, or opaqueness of the materials themselves. Specifically, due to the shape of the objects, light hitting at different angles produces several optical effects. The effects interpreted by the human brain provide the notion, or illusion, of depth and of other characteristics such as three-dimensionality, edges, solidity, and transparency. Keeping track of all possible lighting outcomes is an endless exercise; therefore, one should adopt a method of exclusion.
That is, observe the scene and identify objects or materials that are difficult to capture with the surveying instruments: opaque, transparent, and translucent materials with strong refraction and specular reflection, such as polished marble, mirrors, glass, and metal. In addition to reflection conditions, image matching

Figure 6: Soft light vs hard light (Langford, 2002). Soft light is preferred in photogrammetry.


algorithms perform better with objects of complex geometry and rich texture than with objects of simple geometry and uniform texture.


1.3. CAMERA CONTROLS

Manipulating a photographic camera is a complex but not complicated task. Metaphorically, it is similar to juggling: the various mechanisms of the camera are manipulated to obtain specific effects while countering the undesired (but not necessarily negative) effects that each mechanism introduces. Which effects are unwanted is relative, depending on the purpose of the photograph. For example, photographing objects travelling at high speed requires fast shutter speed values to obtain a sharp image of the object, unless the purpose is the opposite. However, fast shutter speeds allow a lower quantity of light into the body of the camera, therefore requiring other mechanisms to be adjusted to compensate for the lack of light, which in turn may result in other undesired effects. These mechanisms have been researched and developed over more than 200 years of photographic experiments, in which a wide variety of photographic cameras were invented, perfected, and even fell into disuse through obsolescence. Nevertheless, a photographic camera is essentially a closed box with a pinhole, a lens, and a sensor to capture images. In the context of photogrammetry, the operator or surveyor must consider the following (Linder, 2005):

i) the possibility to frame the scene;
ii) the possibility to focus on a specific point (distance setting);
iii) a shutter to determine the time of exposure of the sensor;
iv) an aperture to define light intensity and range of focus;
v) the ability to save the pictures digitally or physically (storage);
vi) measurement of exposure (histogram);
vii) the resolution and size of the imaging sensor (the higher and the bigger the better, respectively);
viii) energy supply (batteries);
ix) accessories, such as a tripod, remote release, and adaptors for flash and lenses.

A photographic camera is a machine that controls light in very specific ways through mechanisms such as aperture, shutter speed, ISO value, and focusing. For the very same reason, it is important to stress that, for photogrammetric purposes, any change made to these mechanisms results in a change of the camera geometry, because such changes alter, even if slightly, the optical beam path. In other words, it is the same as creating another "photogrammetric" model with its own geometric properties. The


reversal of the geometric properties to the previous ones proves to be an almost impossible task. Therefore, to keep the geometry constant, it is advised to use as few photographic cameras as possible and to change the settings only when necessary (Wohlfeil, Strackenbrock, & Kossyk, 2013). In addition, Wohlfeil et al. recommend high image overlap, up to 80% or 90%, and, if the camera geometry changes, the acquisition of oblique images of the object, even if redundant, to increase the geometric integrity of the photogrammetric reconstruction.

1.3.1 THE PINHOLE CAMERA MODEL

As mentioned above, the creation of an image requires a film or imaging sensor that allows the recording of light. However, the sole use of a sensor or film in an open space is insufficient, because the light travelling to the sensor results in no image at all; the sensor is merely lit. The pinhole camera model is the fundamental geometric model to explain the formation of an image (in a perspective model). Three major conditions must be met: i) a pinhole to control the light travelling into ii) a dark chamber, where it hits iii) the film or imaging sensor. The reason these three elements are essential is that, without them, photons travelling from every point of the subject to be recorded would hit all of the photosites on the imaging sensor. This can be thought of as the superimposition of an infinite number of images of the subject on the imaging sensor, and the resulting "image" is what is commonly perceived when turning on a light and observing a lit wall (Langford, 2002; Mikhail, Bethel, & Chris, 2001). For a clear understanding of the formation of an image, the human eye can be used as an analogy. In an over-simplified deconstruction, the human eye is an enclosed sphere, behaving as a dark room, containing the retina at one side to receive light and a small orifice at the opposite

Figure 7: The human eye (Konecny, 2003).


side through which light travels in, referred to as the pupil and functioning as a pinhole. In addition, directly behind the pupil, a lens refracts the light. See Figure 7. Regarding the orifice or pinhole, and bearing in mind that light travels in a straight line, the purpose of the pinhole is to control the light rays coming from the exterior of the dark chamber, that is, to limit the total number of photons. See Figure 8. The rays of light passing through the pinhole cast a circular shape on the sensor, referred to as the "circle of confusion". If no lens is used, the size of the circle of confusion is always greater than or equal to the size of the pinhole. The size of the circle of confusion is directly related to the size of the pinhole and to the distance of the sensor to the pinhole. As a rule, the smaller the pinhole, the smaller the circle of confusion, and the sharper the image. Conversely, if the pinhole is of considerable size, several circles of confusion overlap and a defocused image is generated, similar to a smudged painting. Interestingly, the use of a pinhole results in the projection of an inverted image, both horizontally and vertically, as the pinhole acts as a centre of symmetry. The dark chamber, in turn, allows a better-defined image to be generated, as light rays from other objects in the scene do not hit the sensor. In other words, the sensor in the dark chamber receives solely the light travelling from the scene of interest, and is thus fully dedicated to interpreting the intensity values of that scene without interference from light of no interest. As a rule, as the distance from the pinhole to the imaging sensor increases, the circle of confusion increases, and the resultant image becomes more "confusing", or defocused.

Figure 8: The pinhole camera model (Langford, 2002).


Other effects are noted as well, particularly when adopting extreme solutions. As mentioned, as the size of the pinhole decreases, the image becomes sharper, because the incoming rays of light are limited. However, the created image is darker, for less light enters the dark chamber. A method to overcome this setback is to increase the duration for which the sensor is exposed to the light rays. The inverse logic applies: as the pinhole diameter increases, the sensor is struck by a greater quantity of light rays and light intensity, but at the cost of lower-quality results. In sum, the smaller the diameter of the pinhole, the sharper the acquired image. Taking this logic to the extreme, however, an interesting effect occurs: "diffraction", not to be confused with refraction. Continuously decreasing the size of the pinhole inevitably leads, at some point, to poorer outputs due to diffraction. Diffraction occurs when rays of light change direction as they pass adjacent to opaque edges, and it is more prevalent with irregular edges. See Figure 5 - diffraction. A balance is therefore required between the size of the pinhole, for sharp details, and the phenomenon of diffraction, which blurs details.
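The growth of the circle of confusion described above can be sketched with the usual similar-triangles approximation (diffraction ignored); the function and the numbers below are illustrative assumptions, not values from the text.

```python
# Sketch of the pinhole geometry: the blur circle ("circle of confusion")
# cast by a point at distance o in front of a pinhole of diameter d,
# on a sensor placed at distance i behind the pinhole. By similar
# triangles the blur diameter is d * (o + i) / o, so it is never smaller
# than the pinhole itself. All values in millimetres.

def blur_circle_diameter(d: float, o: float, i: float) -> float:
    return d * (o + i) / o

# A 0.5 mm pinhole, object 5 m away, sensor 50 mm behind the pinhole:
print(blur_circle_diameter(0.5, 5000.0, 50.0))   # 0.505 (>= pinhole diameter)
# Moving the sensor farther back enlarges the blur, as stated above:
print(blur_circle_diameter(0.5, 5000.0, 100.0))  # 0.51
```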

1.3.1.1. INTENTIONAL REFRACTION IN LENSES

Thus far, the creation of an image through the pinhole camera model has been presented. Yet the purpose of using lenses must be brought to light. The reason for installing a lens in the pinhole is to benefit from a controlled refraction effect that results in higher image quality. Exploiting refraction to the maximum, via the use of several lenses, allows photographing in focus at different distances from the subject. Consider the setup presented previously: an object, a pinhole, a dark chamber, and a sensor inside the dark chamber. As noted, the objective of the pinhole is to limit the size of the conic projection, or quantity of rays of light, in order to produce sharper images by decreasing the size of the circle of confusion. The size of the resulting image is directly related to the distance of the sensor to the pinhole, also known as the "focal length". The closer the sensor is to the pinhole, the smaller the subject is represented, due to an increase of the field of view (FOV), and vice versa. In addition, if the distance of the sensor to the pinhole equals the distance of the object to the pinhole, the resultant image is approximately the same size as the object.


Considering such effects, lenses are utilized to control not only the size of the circle of confusion, so as to converge, theoretically, the light rays to a single point, but also the captured details and the size of the represented object. The latter is crucial to allow photographing objects at different distances and delivering images where the objects are portrayed at the same size, thanks to the intentional manipulation of refraction by the lenses. In other words, instead of letting the cones of light rays expand continuously, lenses converge the light rays at a specific distance, referred to as the "focal point". Optimal results are achieved by placing the sensor at this distance. Displacing the sensor beyond the convergence point produces images that are inverted again, and the output is unsharp. In short, lenses inflect the light rays to a specific distance inside the dark chamber. The inflexion is controlled by the shape of the lens, or by a combination of several lenses, and the calculated distance is designated the "focal length". A photographic lens is more than a single piece of glass. In reality, a lens is an assembly of several optical elements whose main objective is to reduce optical defects or aberrations, in order to create images with high spatial resolution by controlling the light entering the body of the photographic camera. The optical elements are made of glass of various qualities, shaped into specific volumes to obtain specific refraction and diffraction properties. On average, a lens has between 5 and 8 elements, and the assembly of the lens elements demands highly accurate manufacturing technology. See Figure 9. Slight misalignments lead to poor outputs, and thus photographic cameras and lenses should be handled with care to prevent any mishaps. Even with the correct placement of the lens elements, a few minor issues arise.
For example, some dispersion of light occurs when light

Figure 9: Example of a lens with several optic elements and the refraction effect represented in dashed lines (Docci & Maestri, 2005).


travels through each and every lens element, even with surface finishes that counteract such undesired effects.
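The converging behaviour described above is often summarized by the thin-lens equation, 1/f = 1/u + 1/v. This is a single-element simplification of the multi-element assemblies discussed in this section, sketched here with illustrative numbers.

```python
# Thin-lens equation sketch (a single-element simplification of the
# multi-element lens assemblies discussed above): 1/f = 1/u + 1/v, where
# f is the focal length, u the object distance, and v the image distance
# behind the lens at which the point appears in focus. All values in mm.

def image_distance(f: float, u: float) -> float:
    return 1.0 / (1.0 / f - 1.0 / u)

# A 50 mm lens focused on an object 5 m (5000 mm) away: the sharp image
# forms slightly beyond the focal point.
print(image_distance(50.0, 5000.0))  # ~50.505 mm
# For a very distant object, the image distance approaches the focal length:
print(round(image_distance(50.0, 10_000_000.0), 2))  # 50.0
```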

1.3.1.2. THE EFFECTS OF FOCAL LENGTH ON GROUND SAMPLING DISTANCE

Focal length, or principal distance, is the distance along the optical axis from the perspective centre to the image plane containing a sharp representation of the reality being captured (Mikhail, Bethel, & Chris, 2001). Every lens has its focal length values engraved around the cylindrical ring of the lens body, usually in mm. A few peculiarities are noted. A lens providing shorter focal lengths is considered more powerful, for it requires higher convergence power to produce an image of quality equivalent to one created with a longer focal length lens. Once again, note that the image details are physically smaller as the focal length decreases, if the same megapixel (MP) count of the imaging sensor, point of view (POV), and position are maintained. By way of explanation, the higher the refraction index, the more convergence occurs, the closer the sensor is to the pinhole, the wider the FOV, and the physically smaller the subjects and details are on the sensor (Linder, 2005). This leads to an important issue in photogrammetry, which can be simultaneously an advantage or a disadvantage: the impact of resolution (that is, the number of MP) on the spatial resolution of the final image. In other words, as the subject becomes smaller on the final image, each pixel captures a greater area of the subject, known as the Ground Sampling Distance (GSD). For instance, an image with a GSD of 2mm means that each pixel captures features 2mm in length. If the purpose is to capture with the same GSD using a shorter focal length lens, the sensor density must increase to capture the same area as an image produced with a longer focal length. The previous statement holds if the user photographs with different focal lengths from the same POV and position. Another method to overcome this issue is to photograph closer to the subject, so that the same area of the subject is captured on the same area of the sensor.
See Figure 10. In other words, the GSD is approximately the same. However, other notable optical effects occur, which are discussed in other sections. For instance, fisheye lenses, the central topic of the current thesis, produce images with varying GSD, from high GSD values at the margins of the image to low GSD values closer to the centre of the image. When utilizing photogrammetric techniques in an architectural context, the use of varying focal length values proves beneficial, as the distance of the camera to a subject varies depending on the size of the rooms and on accessibility.


It must be noted that the GSD is a numerical quantity relating the area of the object surface to the area of a pixel of the image. This concept should not be confused with spatial and pixel resolution. Spatial resolution defines the sharpness of an image (or the quality of an image) relative to the total number of MP of the image, that is, its pixel resolution. The calculated GSD of an image can be within the tolerance levels defined to produce documentation at a particular drawing scale; however, if the spatial resolution is poor, the reprojection of points from the images into the virtual space will be of poor quality as well, since the recognition of homologous features in the images is less accurate. The causes of poor spatial resolution can be several, ranging from inappropriate camera settings, the lens itself, lighting conditions, or the impossibility of photographing the objects at the appropriate distance, to image post-processing in the office. The GSD is a valuable method to determine the average distance the operator or surveyor must keep relative to the object of interest, that is, coverage. For instance, by defining a typical

Figure 10: The effect on the FOV of changing the focal length. Camera position 1 uses a short focal length, while camera position 2 uses a longer focal length (Linder, 2005).


drawing scale, the maximum distance of a given photographic camera to the subject of interest can be calculated.
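A common first-order approximation for the GSD, assumed here for a frontal (perpendicular) view and the central area of the image, relates photosite size, camera-to-subject distance, and focal length; the numbers below are illustrative.

```python
# First-order GSD approximation for a frontal view and the central image
# area (fisheye margins, discussed elsewhere, deviate from this):
#   GSD = photosite_size * distance / focal_length
# All lengths in millimetres; the example values are illustrative.

def gsd(photosite_size: float, distance: float, focal_length: float) -> float:
    return photosite_size * distance / focal_length

def max_distance(photosite_size: float, focal_length: float, target_gsd: float) -> float:
    """Maximum camera-to-subject distance that still achieves target_gsd."""
    return target_gsd * focal_length / photosite_size

# 0.005 mm (5 micron) photosites, 50 mm lens, subject 20 m away:
print(gsd(0.005, 20000.0, 50.0))  # 2.0 mm per pixel
# Distance that keeps the GSD at or below 1 mm with the same setup:
print(max_distance(0.005, 50.0, 1.0))  # 10000.0 mm, i.e. 10 m
```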

1.3.1.3. THE EFFECTS OF FOCAL LENGTH ON PERSPECTIVE

Adjusting the focal length inevitably affects the FOV. Maintaining the same camera format and settings but changing the lens to a longer or shorter focal length results in two interesting effects: i) the FOV adjusts to include more or less of the object for the same area of the sensor; and ii) the perspective effects are accentuated. As observed in Figure 11, not only are the facial features of the portrayed person exaggerated, but some of them, such as the ears, have not been captured at all (because the camera position changed to be closer to the subject). In addition, the image looks as if it had been projected onto a curved surface. Note that the left image was originally larger and has been cropped to better compare the distortion effects in detail with the image on the right. Utilizing a longer focal length captures more information about the subject, but at the cost of including fewer features from the scene. In the right image, it is observed that the FOV narrowed and the information looks "flattened" (perspective effect). The reason for the "flatness" is mostly related to the ratio of the focal length to the distance of the camera to the subject, including the depth of the subject. More concisely, longer focal length lenses take as input light rays that are more parallel to each other, thus resulting in an image that appears "flat", or proportionate along the horizontal and vertical axes. If short focal length lenses are utilized, the FOV increases and the lens captures light rays travelling greater distances from this increased field of view (at steeper angles). For that reason, a planar wall appears curved in fisheye images: a flat wall appears squeezed near the margins of the image. In photogrammetric terms, the GSD value increases the farther the photosite is from the centre of the sensor, due to the corresponding increase of

Figure 11: The effects of changing the focal length. Left: a cropped fisheye image captured with a Samyang 8mm fisheye lens. Right: an image captured with a 25mm focal length.


the FOV. In theory, the central point of the imaging sensor, or image, is the only point with no distortion and the lowest GSD value. For instance, using over-simplified calculations, consider photographing a plane wall from a distance of 5 meters: a light ray meeting the wall at an angle of 89° travels a distance very close to 5 meters; however, if the light ray meets the wall at an angle of 30°, the travelled distance corresponds to 10 meters. If, for an angle of 89°, the respective photosite captures 1cm of the wall, then, for an angle of 30°, the same photosite captures approximately 2cm of the wall. At the extreme, the distance grows towards infinity, because at an angle of incidence of 1° the light ray is essentially parallel to the wall. See Figure 12.

Figure 12: The effect on perspective of varying the focal length. See Figure 10 for reference of the camera positions (Linder, 2005).
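The over-simplified wall example above can be reproduced numerically. The sketch below assumes the angle is measured between the light ray and the plane of the wall.

```python
# Sketch of the wall example: a ray that meets a plane wall at grazing
# angle theta (measured between the ray and the wall plane), with the
# camera d metres from the wall, travels d / sin(theta). The stretch of
# wall covered by one photosite grows in the same proportion.

import math

def ray_path_length(d: float, theta_deg: float) -> float:
    return d / math.sin(math.radians(theta_deg))

print(round(ray_path_length(5.0, 89.0), 3))  # 5.001 m, almost perpendicular
print(round(ray_path_length(5.0, 30.0), 3))  # 10.0 m, twice the distance
print(round(ray_path_length(5.0, 1.0), 1))   # 286.5 m, nearly parallel to the wall
```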


1.3.1.4. THE FIELD OF VIEW AND CROP FACTOR

Even though the concept of focal length has been discussed in the previous paragraphs, it is still necessary to clarify the relation between focal length and the FOV relative to the size of the imaging sensor in use. This issue is known as the "Crop Factor" (Langford, 2002). In essence, and as previously mentioned, a specific focal length corresponds to a particular FOV and casts a particular image circle on the imaging sensor. This correspondence does not hold across cameras containing imaging sensors of different sizes. In particular, smaller imaging sensors use a smaller area of the image circle cast by the lens, and the captured FOV is narrower. This means that a short focal length lens used on a small imaging sensor has the equivalent FOV of a longer focal length lens used on a larger imaging sensor. For example, photographic cameras using APS-C sensors (approximately 24 x 16mm) have a 1.5x crop factor, meaning that the FOV of a 50mm lens on an APS-C sensor is equivalent to the FOV of a 75mm lens on a full frame sensor. Conversely, using the same 50mm lens on a full frame imaging sensor generates images with a wider FOV. Note that the crop factor is calculated relative to a full frame imaging sensor. See Figure 13. Interestingly, photography companies, such as Nikon, knowing that a large area of the image circle is not captured by smaller imaging sensors, design lenses that cast a smaller image circle but with higher-quality optical elements. Such lenses are labelled DX, and when mounted on a camera with a full-frame imaging sensor, known as an FX camera,

Figure 13: The FOV and crop factor while using the same camera from the same POV. https://www.nikonusa.com


the resultant image displays heavy vignetting, that is, dark corners, because the imaging sensor is larger than the image circle. When analysing photogrammetry literature, omitting the crop factor, or not mentioning the equivalent FOV on a full frame imaging sensor, may lead to incorrect conclusions about the applicability of a given lens.
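The crop-factor arithmetic above can be condensed into a one-line sketch; the crop factor values are nominal.

```python
# Sketch of the crop-factor relation: the full-frame-equivalent focal
# length is the actual focal length multiplied by the sensor's crop factor.

def equivalent_focal_length(focal_length_mm: float, crop_factor: float) -> float:
    return focal_length_mm * crop_factor

# A 50 mm lens on a 1.5x APS-C body frames like a 75 mm lens on full frame:
print(equivalent_focal_length(50.0, 1.5))  # 75.0
```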

1.3.2. APERTURE

Several markings are observable around the cylindrical body of the lens. Until this moment, focal length and focus have been mentioned, represented in mm, and in meters or feet, respectively. In the case of aperture, the values are represented as ƒ-numbers. At first this may seem unusual, but once the underlying logic is brought to light, the aperture is as important as the other operating mechanisms of the photographic camera (Langford, 2002; Mikhail, Bethel, & Chris, 2001). While the focal length allows the user to control aspects of the resulting image such as the FOV, the aperture allows the user to control the quantity of light travelling into the body of the photographic camera and the sharpness of the image. This is achieved manually, by rotating the aperture ring, or automatically, with the automatic exposure feature of the photographic camera. Observing the lens, an orifice is visible between the lens elements, formed by a series of smooth thin blades arranged into a roughly circular aperture. The lower the ƒ value, the larger the diameter of the orifice. Furthermore, ƒ values are given in click stops on a specific scale, unlike focal length values, which can be adjusted progressively with no "stops". For example:

ƒ/2; ƒ/2.8; ƒ/4; ƒ/5.6; ƒ/8; ƒ/11; etc.

As a general rule, adjusting the ƒ value to the next click stop halves the light uniformly. Aperture values are not only internationally recognized but are also a format-independent way of indicating the light intensity entering the body of the photographic camera. This is due to how ƒ values are calculated, that is, as the ratio between two variables:

ƒ-number = focal length ÷ aperture diameter


For example, a value of ƒ/4 means the diameter of the aperture is one-fourth of the focal length. As previously mentioned, luminosity is halved for each "stop". Halving the diameter itself (that is, doubling the ƒ number) corresponds to two stops, so luminosity is decreased by four times, since the light admitted is proportional to the area of the opening. It is crucial to note that the maximum and minimum aperture values vary from lens to lens and with the format of the photographic camera. In essence, small lenses or small camera formats have the disadvantage of not reaching the highest ƒ values, whereas bigger lenses can use smaller relative diameters to further decrease luminosity. For example, lenses fabricated for small camera formats may stop down to ƒ/22, while lenses fabricated for large camera formats may reach ƒ/45. This limitation is due not only to the physical size of the lens but also to the diffraction phenomenon: the smaller the diameter, the harder it becomes to eliminate negative optical effects, and the higher the price of the lens. The same logic applies to the other extreme of the aperture scale. The smaller the ƒ value (the larger the diameter), the harder it is to eliminate optical aberrations due to the high quantity of light rays travelling inside the photographic camera. The advantage of using low ƒ values is to benefit from higher luminosity and therefore faster usable shutter speeds. The price of a lens may double for offering only one additional ƒ click stop, even when all other characteristics are equal, so the user must weigh the cost-benefit of purchasing such high-quality lenses.
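The ƒ-number arithmetic above can be made concrete with a short Python sketch. The function names are hypothetical and the 50 mm / 12.5 mm example values are illustrative:

```python
import math

def f_number(focal_mm, diameter_mm):
    """f-number N = focal length / aperture diameter."""
    return focal_mm / diameter_mm

def relative_light(n_from, n_to):
    """Light admitted at n_to relative to n_from (proportional to 1/N^2)."""
    return (n_from / n_to) ** 2

def stops_between(n_from, n_to):
    """Number of full stops between two f-numbers."""
    return math.log2((n_to / n_from) ** 2)

f_number(50, 12.5)       # 4.0  : a 50 mm lens with a 12.5 mm opening is f/4
relative_light(4, 5.6)   # ~0.51: one stop down roughly halves the light
relative_light(4, 8)     # 0.25 : halving the diameter quarters the light (two stops)
stops_between(4, 8)      # 2.0
```

The marked scale values (2, 2.8, 4, 5.6, ...) are powers of the square root of 2 precisely so that each click stop halves the admitted light.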

1.3.2.1. RANGE OF FOCUS OR DEPTH OF FIELD It has been noted that light intensity is one effect of adjusting the aperture; other effects, however, are related to focusing, namely the depth of field, or range of focus. The effects of the pinhole and of the lens on producing a sharp image have also been explained, and the aperture follows a similar logic (Langford, 2002; Mikhail, Bethel, & Chris, 2001). The aperture greatly impacts the sharpness of the resulting image, for, in addition to controlling the luminosity level, it determines the range within which the objects being photographed are in focus. For example, if an object at a medium distance is to be focused and objects in the foreground and background are not relevant for the photogrammetric processing, the operator can set the ƒ value to the minimum so that only the first object is in focus. If, however, all objects are of interest, the ƒ value is set to the maximum. In other words, the range of focus increases but at the


Figure 14: The effect of adjusting aperture. The left image was shot with a longer focal length, while the image on the right was acquired with a shorter focal length, resulting in a blurrier background in the left image.

cost of lower exposure. This is due to limiting the number of light rays travelling inside the photographic camera and the subsequently smaller footprint of the circles of confusion on the sensor. See Figure 14.

As a rule, to increase the range of focus, higher ƒ values must be utilized. In short, the range of focus determines the distances at which subjects are within an acceptable focusing threshold. It is called an "acceptable focusing threshold" because sharpness decreases smoothly and gradually as subjects move farther from the focusing point. Three behaviours are noticed: i) Even with all camera settings adjusted and locked for taking a photograph, the total range of focus varies with the distance between the object and the photographic camera. That is, the closer the object is to the camera, the shorter the total range of focus; inversely, the more distant the object, the longer the range of focus. ii) The range of focus is affected by the focal length in use: the longer the focal length, the shorter the range of focus, and vice versa. iii) The higher the ƒ value, the longer the range of focus, and hence the more of the scene is rendered sharp.

1.3.2.2. INFINITY FOCUS AND HYPERFOCAL DISTANCE An important point remains: the range of focus is indeed a range, within which subjects become increasingly defocused up to the point where they are considered out of focus. The transition is smooth, as implicitly referred. A surface is considered out of focus relative to the limited resolving power of the human eye. In general, an image is accepted as "in focus" even when small


circles of confusion are present. The limit at which the eye resolves detail is on the order of 0.25mm, or one-fourth of a millimetre (Langford, 2002) - although Docci and Maestri (Docci & Maestri, 2005) state that the human eye can distinguish lines at intervals of 0.1mm - but for graphical representation a tolerance of 0.2 or 0.3mm is acceptable. For practical purposes, two major concepts related to focusing must be addressed in order to exploit the full range of focus (Langford, 2002; Mikhail, Bethel, & Chris, 2001): i) Infinity focus is when the lens is set to focus at infinity. On the body of the lens, next to the focusing ring, several markings are displayed in metres or feet, from ∞ down to 1.5 metres for instance - the latter value varies from lens to lens. Adjusting the focus ring to the symbol ∞, the camera focuses at infinity; light rays arriving from very distant objects are effectively parallel. In practical terms, a large share of the range of focus is wasted, because it lies beyond the most distant subject, and only the portion between that subject and the photographic camera is used. ii) The hyperfocal distance is particularly valuable in the context of photogrammetry. With the lens first set to focus at infinity, the nearest object still in acceptable focus marks the hyperfocal distance; refocusing the lens at that distance extends the total range of focus, which then stretches from half the hyperfocal distance to infinity. Specifically, two-thirds of the range of focus

Figure 15: Fisheye image taken from the Castle of Sesimbra. Virtually every object is in focus, owing to the exceptionally short focal length.


is located from the hyperfocal point to infinity and the remaining third lies between the hyperfocal point and the photographic camera. Fisheye lenses, due to their exceptionally short focal lengths, can benefit from a range of focus extending from infinity down to 30cm from the camera body. See Figure 15.
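The hyperfocal distance can be computed directly from the quantities already introduced (focal length, ƒ-number, circle of confusion). A minimal sketch, assuming a 0.03 mm circle of confusion and hypothetical function names:

```python
def hyperfocal(f, N, c):
    """Hyperfocal distance H = f^2 / (N * c) + f, all distances in mm.

    Focusing at H renders everything from H/2 to infinity acceptably sharp."""
    return f * f / (N * c) + f

hyperfocal(50, 8, 0.03) / 1000   # ~10.5 m; the near limit of sharpness is ~5.2 m
hyperfocal(8, 8, 0.03) / 1000    # ~0.27 m for a short, fisheye-like focal length
```

The second example illustrates why very short focal lengths keep nearly everything in focus, consistent with the 30 cm figure quoted above for fisheye lenses.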

1.3.3. THE SHUTTER Another setting of utmost importance in photographic cameras is the shutter. This component controls for how long light is allowed to travel inside the camera. Camera shutters can be installed in two places: i) in the middle of the lens, or ii) inside the body of the photographic camera, in front of the imaging sensor. The first type of shutter is composed of several opaque thin blades that open quickly once the shutter button is pressed. The second type is installed at the focal plane and is more convenient for photographic cameras with interchangeable lenses, leading to a synergistic combination with SLR viewfinders. Furthermore, the same shutter serves several lenses, reducing the monetary cost and the need to fabricate lenses with integrated shutters. A scale of numbers indicates the shutter speed. Common shutter speeds range from 1 second down to 1/500 of a second. Similar to aperture values, shutter speeds are also given in click stops:

[...] 1/250 1/125 1/60 1/30 1/15 1/8 1/4 1/2 [...]

Shutter speed values can go up to 30 seconds or more and down to 1/8000 of a second. For each click stop, light intensity is doubled or halved. For this reason, aperture and shutter speed can be adjusted together so that the resulting light intensity remains the same. For example, photographing with a shutter speed of 1/500 at ƒ/4 provides the same luminosity as photographing with a shutter speed of 1/250 at ƒ/5.6. However, the visual effects are noticeably distinct. Shutter speed is a great asset when deciding whether moving objects should be frozen or not: fast shutter speeds are exceptionally suitable for capturing sharp images of moving objects. Per contra, if static objects are


meant to be captured, as in most photogrammetric situations, lower shutter speeds can be applied in order to provide the opportunity of increasing the ƒ value and thus the range of focus.
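The equivalence between shutter/aperture pairs noted above can be verified with the exposure-value formula EV = log2(N²/t), at fixed ISO. A short sketch (the function name is hypothetical):

```python
import math

def exposure_value(N, t):
    """EV = log2(N^2 / t); equal EV means equal exposure at fixed ISO."""
    return math.log2(N * N / t)

# 1/500 s at f/4 and 1/250 s at f/5.6 admit almost exactly the same light
ev_a = exposure_value(4.0, 1 / 500)   # ~12.97
ev_b = exposure_value(5.6, 1 / 250)   # ~12.94
```

The small residual difference comes from the marked value ƒ/5.6 being a rounding of 4·√2 ≈ 5.657.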

1.3.3.1. GLOBAL SHUTTER VS ROLLING SHUTTER Two types of shutter capturing techniques exist: i) global shutter and ii) rolling shutter. The difference resides in the way the electrical signals are read from the imaging sensor. With a rolling shutter, the sensor is exposed in a progressive motion, meaning that the recording of an image is not instantaneous as it is with a global shutter. The disadvantage of the former is clearly perceptible when photographing subjects travelling at high speed, which are represented disproportionately and/or skewed. Many cameras rely on a mechanical shutter. However, recent developments have produced photographic cameras that do not require mechanical shutters, taking advantage of the properties of CCD and CMOS imaging sensors. Cameras using electronic shutters are consequently silent, because there are no moving parts, and consume less battery; in addition, shutter speeds can reach 1/32000 of a second or more. The rolling shutter effect may still occur if photosites are read sequentially instead of globally. Many cameras on the market use hybrid shutters, switching to the electronic shutter when silent mode is selected. Electronic shutters are regularly installed in compact cameras, mirrorless interchangeable lens cameras, and smartphones.

Figure 16: The effects of using global shutter vs rolling shutter. Top image used global shutter and bottom image used rolling shutter. https://www.premiumbeat.com


The issues related to global vs rolling shutter are of utmost importance when applying fisheye lenses, given that rolling shutter mechanisms introduce lower quality reconstructions, in addition to the fact that fisheye images per se have a higher propensity to generate geometrically inaccurate outputs without the use of control points (Strecha et al., 2015). Thus, global shutters are preferred, particularly with fisheye lenses.
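The skew introduced by a rolling shutter can be estimated from the sensor readout time: rows exposed later see the subject in a later position. A minimal sketch, where the subject speed and the 1/30 s readout time are assumed values and the function name is hypothetical:

```python
def rolling_shutter_skew(subject_speed_px_per_s, readout_time_s):
    """Horizontal skew (in pixels) accumulated between the first and last row.

    With a rolling shutter, rows are exposed sequentially over readout_time_s,
    so a horizontally moving subject shifts by this amount across the frame."""
    return subject_speed_px_per_s * readout_time_s

# A subject crossing the frame at 2000 px/s, frame read out in 1/30 s
rolling_shutter_skew(2000, 1 / 30)   # ~67 px of skew: clearly visible distortion
```

A global shutter corresponds to a readout time of effectively zero, which is why it produces geometrically consistent frames of fast-moving scenes.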

1.3.4. VIEWFINDER AND IMAGE FOCUSING Another key element of any photographic camera is the viewfinder, a mechanism allowing the user to frame the subject, observe the FOV, and select a specific focus. In particular, the viewfinder offers the opportunity to observe the scene and make the necessary adjustments before photographing, obviating a trial-and-error approach by presenting the viewer with a preview of the resulting output. However, the viewfinder has not always been integrated into photographic cameras. During the early developments of photographic cameras, a glass sheet placed at the rear of the camera allowed the photographer to see and focus the image formed by the lens before inserting the photographic film. Other approaches were researched. In the photographic cameras of today, 3 methods of offering a preview to the user are utilized: i) Most common in the declining disposable or point-and-shoot photographic cameras of today is the direct viewfinder, installed next to the lens in a separate position and offering a direct view of the scene, but misaligned relative to the point of view of the lens. A parallax error results from this offset. The illusion is given that the scene being photographed is the same as the one observed through the viewfinder, because the viewfinder shares the same width-height proportions. The resulting misalignment becomes evident when analysing the outputs. Concerning the focusing of the image, photographic cameras utilizing this type of viewfinder are often equipped with an autofocus (AF) system, reducing the need for manual adjustments. For this reason, these photographic cameras are referred to as "point-and-shoot" and are suitable for most situations. This type of viewfinder is close to obsolescence. ii) The optical viewfinder. For accurate and precise focusing of objects, this second type of viewfinder is recommended, and it is utilized in SLR and DSLR cameras.
As the name implies, these cameras use a mirror to reflect the incoming light into a pentaprism, which reflects the image once more towards the viewfinder, where the user observes the scene - thus the nomenclature "reflex". In other words, it is as if the user is viewing through the lens but at a


Figure 17: Optical viewfinder of an SLR camera. Pentaprism represented next to position A (Langford, 2002).

separate location, as a result of intentionally calculated reflections. The reason for reflecting the light to a separate location is that the sensor and the viewfinder cannot share the same optical path at the same time. More specifically, when the shutter button is pressed, the mirror moves out of the light path, allowing light to continue along its original path and illuminate the sensor. This type of viewfinder offers a very reliable preview, typically covering about 95-98% of the frame. The pentaprism not only corrects the light path by reflecting light several times but also flips the image horizontally and vertically so that the user observes the same image as the sensor detects. The advantage of the first and second types of viewfinders, both optical, is that they consume no energy. In DSLR cameras, the viewfinder displays not only the scene but also the camera settings, such as shutter speed, aperture, ISO, flash, etc. iii) Developments in digital imaging technology in recent years have provided users with a third type of viewfinder: the electronic viewfinder (EVF). The electronic viewfinder is a recent and reasonably low-cost invention allowing users to observe in real time the true image through a small display (shaded from ambient light) at the back of the photographic camera. It is similar to an optical viewfinder but displays 100% true coverage of the sensor. Some compact cameras and bridge compact cameras of today include an EVF allowing the user to observe and frame the subject. The EVF technology has two major disadvantages. Firstly, it consumes electrical power.
Secondly, the LCD display, being of a digital nature, is conditioned by a refresh rate and a specific range of (digitally interpreted) colour values, as opposed to SLR optical viewfinders, where the preview image offers greater detail because the information is raw and


non-interpreted. In addition, following moving objects is harder due to the refresh rate of the EVF. For these reasons, professional photographic cameras of today are mostly equipped with optical viewfinders, since these offer better control.

1.3.4.1. VIEWFINDER TO FRAME AND FOCUS IN PHOTOGRAMMETRY The viewfinder is a crucial element in photogrammetry, for it allows the operator to frame the scene and preview the outputs that will be fed into the photogrammetric pipeline during the office phase. A few notable aspects must be considered. While observing through the viewfinder, it is common for photographic cameras to display a set of well-distributed autofocus points in a symmetrical pattern, ranging from 9 up to 39 or more. The user is able to select one, and the camera uses its "Single-Point AF" (Auto-Focus) system. Alternatively, the photographic camera operates with "Dynamic-Area AF", in which the camera automatically switches to surrounding focus points if the subject moves after focus has been locked on the user-selected point. The third option is "Auto-Area AF", where the camera automatically determines the best focus point. These options concern "where" focus is locked. There are also "Focus Modes", which concern "how" focus is acquired. In "AF-S" (single-shot), the photographic camera uses the user-selected focus point and performs the necessary calculations to focus the subject while the button is pressed halfway. In "AF-C" (continuous), the photographic camera focuses continuously while the shutter is pressed halfway. In "AF-A" (automatic), the camera analyses the scene and selects the best focus mode: if the subject is stationary it uses single focus, and if the subject is moving it uses continuous focus. Lastly, in MF (manual focus), the user performs all the necessary operations to focus the subject, including rotating the focusing ring on the lens. For photogrammetric purposes, where dozens, hundreds, or even thousands of images are acquired, single-point AF with AF-S is the preferred approach.
In addition, even if the remaining points are unused for focusing, the user can employ them as guidelines to estimate the percentage of overlap between photographs. Doing so increases confidence not only in the individual quality of each image but also in its suitability for photogrammetric processing.


1.3.5. EXPOSURE The grand majority of photographic cameras today are equipped with an exposure metering system, which quantifies the amount of light per unit area reaching the imaging sensor. Exposure is determined by the combination of shutter speed, lens aperture, and ISO. In other words, exposure determines how light or dark the resulting image will be after photographing. Such measurement is possible due to the digital nature of modern cameras: CCD and CMOS imaging sensors are sensitive to a specific range of brightness, and the software installed in the photographic camera evaluates the scene to determine whether the full brightness range of the imaging sensor is used. The camera notifies the user of the exposure level, through the LCD display or the viewfinder, indicating whether the image is underexposed or overexposed via a series of vertical lines, similar to a ruler, forming a scale that goes from -2 (underexposed) to +2 (overexposed). The optimal point of exposure is at 0, because the tones in the image are then distinguishable and distributed above and below the mid-tone, and there is less noise. For example, an underexposed image will render dark tones as pure black, reducing the interpretability of the image. Such an occurrence is referred to as clipping, which can affect highlights and shadows simultaneously or one at a time. This issue is closely related to dynamic range. Most photographic cameras adjust exposure automatically (AE - Automatic Exposure) when the automatic mode is selected.

1.3.6. DYNAMIC RANGE, SCENE DYNAMIC RANGE, AND CAMERA DYNAMIC RANGE The concept of exposure is distinct from that of dynamic range, even though the two are related. Dynamic range is the maximum difference between the lightest and darkest tones and is subdivided into scene dynamic range (SDR) and camera dynamic range (CDR). A scene with high dynamic range, such as a sunrise or a sunset, contains tones that range from very light to very dark. For example, photographing the sky during a sunrise produces an image with visible details of the sky, while the ground is extremely dark - underexposed. If photographing the ground, the ground details are distinguishable, and the sky tones are mostly white - overexposed. See Figure 18. In this particular case, the dynamic range of the scene is greater than the dynamic range of the photographic camera. That is, the extreme


difference between the black and white tones of the scene exceeds the extreme difference between the black and white tones the sensor of the photographic camera can capture. One solution to this problem is to shoot with different exposures in order to capture the maximum information for the various tones present in the scene. At a later stage, the photographs are merged - with the tonal range compressed via tone mapping - to obtain a final image with high dynamic range, referred to as an HDR image. The advantage of HDR images, or of a high camera dynamic range, is the presence of details and vibrant colours. At times, the dynamic range of the scene is shorter than the dynamic range of the photographic camera. In this case, the exposure can be adjusted to generate diverse outputs. Such room for creative adjustment is referred to as "Exposure Latitude".

Exposure Latitude = Camera Dynamic Range – Scene Dynamic Range
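The latitude relation above is simple stop arithmetic. A minimal sketch, where the dynamic-range values in stops are illustrative assumptions and the function name is hypothetical:

```python
def exposure_latitude(camera_dr_stops, scene_dr_stops):
    """Exposure latitude (in stops) = camera dynamic range - scene dynamic range.

    Positive: room to shift exposure creatively without clipping.
    Negative: clipping is unavoidable in a single frame; bracketing/HDR is needed."""
    return camera_dr_stops - scene_dr_stops

exposure_latitude(12, 9)    # 3 stops of creative latitude
exposure_latitude(12, 15)   # -3: the scene exceeds the sensor's range
```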

The dynamic range histogram is presented on the LCD screen with detailed information, displaying four colours: red, green, blue, and grey. The grey colour results from the overlapping

Figure 18: An example in which scene dynamic range is far greater than the camera dynamic range. Left image: camera dynamic range adjusted for the interior space. Right image: Clipping of highlights and shadows.


of the red, green, and blue colours, informing the user that the intensities of R, G, and B are distributed uniformly over that particular tone range. If, in the histogram, the red colour is displayed above the grey area, it means the scene is radiating a considerable amount of red relative to the other two colours. Specifically, the horizontal axis of the graphic displays all possible brightness values in the photograph, from pure black on the far left to pure white on the far right. The vertical axis represents how many pixels have that particular brightness value: the more pixels within a specific range of brightness, the higher the spike. For example, an image of uniform mid-grey displays a single vertical line exactly in the middle of the histogram.
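The histogram and clipping behaviour described above can be reproduced on raw brightness values. A small pure-Python sketch; the 8-bit value range, the bin count, and the sample pixel list are illustrative assumptions:

```python
def clipping_fraction(pixels, low=0, high=255):
    """Fractions of pixel values clipped to pure black and pure white."""
    n = len(pixels)
    shadows = sum(1 for p in pixels if p <= low) / n
    highlights = sum(1 for p in pixels if p >= high) / n
    return shadows, highlights

def histogram(pixels, bins=8, max_value=256):
    """Coarse luminance histogram: pixel counts per brightness band."""
    counts = [0] * bins
    for p in pixels:
        counts[min(p * bins // max_value, bins - 1)] += 1
    return counts

pixels = [0, 0, 10, 120, 128, 130, 200, 255]
clipping_fraction(pixels)   # (0.25, 0.125): 2 of 8 pure black, 1 of 8 pure white
histogram(pixels)           # [3, 0, 0, 1, 2, 0, 1, 1]
```

A spike in the leftmost or rightmost bin, together with a non-zero clipping fraction, is exactly the warning sign the camera's histogram display conveys.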

1.3.6.1. DYNAMIC RANGE IN PHOTOGRAMMETRY Due to the limited dynamic range of imaging sensors, the surveyor faces the typical challenge of capturing detail as well as possible when transitioning between interior and exterior spaces, or when photographing interior spaces with windows and/or entrances. For this reason, the previously mentioned information is of utmost importance when executing photogrammetric surveys of morphologically complex structures. From the point of view of successfully executing the image correlation algorithms, extremely poor results are likely to be produced with a photographic camera set to fully automatic mode when recording interior spaces. Three methods are considered when photographing interior spaces: i) With the photographic camera set to fully automatic, the surveyor carefully selects the focusing point to coincide with a representative tone of the interior space. The camera adjusts the settings automatically and determines the best exposure. All of the images are collected swiftly, including images of the transition between interior and exterior spaces. In the latter situation, several photographs must be taken to follow the changes in light intensity gradually. ii) In the second method, the settings of the photographic camera are manually configured for the luminosity of the interior space. The photographs are acquired, and a uniform model is reconstructed. However, this method is unreliable when transitioning to another space with contrasting luminosity, for it demands that the user continuously adjust the camera settings. iii) The third method is a combination of the first and second, that is, a semi-automatic approach in which their positive aspects are combined. Cameras manufactured in the past few years, including those in smartphones, offer an automatic solution to acquire HDR images by taking 3 or more photographs with different exposures within a short interval of time.


Depending on the installed software, some cameras merge the acquired images into the final one, saving space on the memory card. If this option is not available, the images will have to be merged on the computer, preferably using an automatic approach, otherwise the task is time-consuming. The major setback is the need to use a tripod so that the set of images is captured from exactly the same POV.
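The merging step can be illustrated with a deliberately naive exposure-fusion sketch in pure Python. The triangle weighting, the function name, and the sample pixel values are assumptions for illustration, not the algorithm used by any particular camera:

```python
def fuse_exposures(exposures, mid=0.5):
    """Naive per-pixel exposure fusion (a sketch, not a production tone mapper).

    exposures: list of bracketed images, each a list of luminances in [0, 1].
    Each output pixel is a weighted average across the brackets, weighting
    values near mid-grey highest so that clipped pixels contribute least."""
    def weight(v):
        return max(1e-6, 1.0 - 2.0 * abs(v - mid))  # triangle weight, peak at mid
    n = len(exposures[0])
    fused = []
    for i in range(n):
        ws = [weight(img[i]) for img in exposures]
        fused.append(sum(w * img[i] for w, img in zip(ws, exposures)) / sum(ws))
    return fused

# Three brackets of the same 3-pixel scene: under-, normally, and over-exposed
under, normal, over = [0.0, 0.2, 0.4], [0.1, 0.5, 0.9], [0.3, 0.8, 1.0]
fuse_exposures([under, normal, over])
```

Pixels clipped in one bracket are recovered from another, which is exactly why the brackets must be taken from the same POV, ideally on a tripod.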


1.4. TYPES OF PHOTOGRAPHIC CAMERAS An essential consideration before initiating a photogrammetric project is the choice of photographic equipment. Presently, a myriad of photographic camera designs is available on the market with varied characteristics, ranging from professional to amateur usage (Langford, 2002). For a clear explanation, cameras are grouped into 6 categories: i) Compact Cameras; ii) Bridge Compact Cameras; iii) Single Lens Reflex Cameras; iv) Mirrorless Interchangeable Lens Cameras; v) Smartphones; vi) Action Cameras or Sports Cameras. It should be noted that the presented information refers mostly to digital cameras, as these represent the current trend of the market.

1.4.1. COMPACT CAMERAS i) Compact cameras are designed for pragmatic purposes and are more often than not known as "point-and-shoot". As the latter description entails, compact cameras are fabricated to be low-cost and user-friendly, requiring little to no technical skill, and for that reason compact cameras are fully automatic. A few models of superior cost do allow some manual control over the settings to produce personalized or quasi-professional outputs; in general, the greater the freedom to manipulate the mechanisms of the camera, the higher the cost. Other notable characteristics are the small and light format, which allows the user to handle the camera with high flexibility, carry it easily in a clothing pocket, and cover typical day-to-day use well enough. However remarkable this small technological device is, compact cameras have some setbacks. The user should avoid photographing objects at close range, avoid illuminating surfaces with flash beyond 3m, and keep the camera steady while pressing the shutter button to allow enough time for the camera to focus. Despite these negative aspects, compact cameras are ideal for a swift collection of photographs and are extremely easy to carry. In general, compact cameras share these features: zoom lenses that include digital zoom, auto exposure, incorporated flash, small imaging sensor formats ranging from 1-inch to APS-C, and photography modes. For these reasons, the resulting images may be pixelated if too much digital zoom is utilized, and images can be noisier due to the very small imaging sensor formats.

1.4.2. BRIDGE COMPACT CAMERAS ii) Bridge Compact Cameras, also known as "Advanced Compact Cameras", may be considered a hybrid between compact cameras and DSLR cameras. The cameras in this category


are notable for a superzoom, non-interchangeable lens mounted on a camera body (often similar to a DSLR's), whose focal length varies from wide-angle to telephoto, hence the nomenclature "bridge". In particular, bridge compact cameras benefit from the typical hand-grip of a DSLR, make use of an electronic viewfinder, and offer the user greater manual control over the settings. The selling price is superior to that of compact cameras. The disadvantage of cameras in this category is that they are neither as handy as a compact camera nor equivalent to a DSLR, which offers far more control and superior image quality. All in all, bridge cameras are recommended for photographers who desire more manual control over the camera settings and a longer zoom range for a relatively low price compared to DSLR cameras.

Figure 19: Types of cameras. From top left to bottom right: Point and shoot camera (Nikon Coolpix A300); Bridge compact camera (Nikon Coolpix L340); DSLR camera (Nikon D3100); MILC camera (Nikon Z7); Smartphone (Huawei Mate 10 Lite); Sports camera (GoPro). Images from URL: https://www.nikonusa.com https://www.huawei.com/en/ https://www.gopro.com


1.4.3. DIGITAL SINGLE-LENS REFLEX CAMERA AND SINGLE-LENS REFLEX CAMERA iii) In essence, Digital Single Lens Reflex (DSLR) cameras are for those who seek to perform both amateur and professional photography, due to their high-quality image output and extreme versatility in manually controlling the camera settings, which allows delivering intensely creative imagery. Originally introduced in the 1990s, DSLR cameras are the successors of SLR cameras, having replaced photographic film with digital imaging sensors, typically of APS-C size, or Full Frame in top-end models. Prices range from a few hundred euros up to a few thousand euros. The major advantages of DSLR cameras are the possibility of changing lenses for different situations, the superior quality of those lenses compared to the ones installed in compact and bridge compact cameras, and the possibility of operating in manual, semi-automatic, or automatic mode. Furthermore, DSLR cameras allow users to frame and focus objects in the scene accurately and to observe the range of focus without lag. In addition, a great variety of lenses and accessories is available on the market, equipping the user to photograph anything and everything. The name "reflex" comes from the use of a mirror and pentaprism to reflect light to the optical viewfinder.

1.4.4. MIRRORLESS INTERCHANGEABLE LENS CAMERA iv) In recent years, a new type of photographic camera has become popular on the market: the Mirrorless Interchangeable Lens Camera (MILC). These digital cameras compete with DSLR cameras, and for this reason share many characteristics with them. For example, MILC cameras utilize interchangeable lenses and offer high control over the camera settings. The differences lie in "minor" details, with the exception of one major difference: body size. MILC cameras are more compact than DSLRs because they use electronic viewfinders instead of a system of mirrors reflecting light into an optical viewfinder. The removal of the mirror system reduces body size and weight significantly while maintaining the sensor size, in APS-C or Full Frame format. One setback is that MILC lenses require their own specific mount, and the camera may feel harder to grip due to the combination of a small body with a relatively large lens. Some DSLR lenses are usable on MILC cameras with appropriate adapters; even so, MILC users have a more limited choice of lenses and accessories compared to DSLR users.


For easy framing, some MILCs are equipped with an EVF to help users handle the camera and frame the scene. Nonetheless, as aforementioned, EVF efficiency does not compare to that of optical viewfinders, due to lag when photographing moving subjects (related to the refresh rate of the screen), although improvements are under development. Despite these few disadvantages, MILCs are well suited to high-speed photography. Owing to the heavy reliance on electronic components, the major problem relates to battery consumption.

1.4.5. SMARTPHONES v) Another type of photographic camera is the one integrated into smartphones. Currently, smartphones are ubiquitous, and their ready availability is so convenient and competitive relative to compact cameras that the latter are declining on the market. Similar to compact cameras, smartphones are small, but they are the equivalent of a pocket-sized computer, offering the possibility of installing multiple applications and executing diverse tasks: applications for post-processing, filters, sharing results on the internet, and many more. Because the device is of small format, the imaging sensor is of small format as well, yet capable of delivering high-resolution outputs, ranging approximately from 10MP up to 20MP. However, due to the sensor size, performance is poor in low-light conditions, with high noise compared to larger format sensors. In addition, unlike DSLR or MILC cameras, smartphones do not support interchangeable lenses, but rather lenses specifically designed to couple to the smartphone's own lens, similar to an extension. Furthermore, smartphones do not offer as many zooming options as other types of photographic cameras. A major setback is running out of battery quickly, although smartphones serve many uses besides photographing. In essence, for point-and-shoot situations smartphones provide satisfactory results, even compared to compact cameras. If the user desires higher quality images, a little research should determine whether, for the price of a high-end smartphone, an entry-level DSLR should be purchased instead.

1.4.6. ACTION CAMERAS OR SPORTS CAMERAS vi) The last category of cameras comprises action cameras, also known as sports cameras. Their main characteristics are a small, adaptable, portable, lightweight, tough, and sturdy body built to endure adverse circumstances, as they are predominantly utilized for filming deeply engaging and high-risk activities that may damage the camera, such as skateboarding, bicycling, F1 racing, skydiving, scuba diving, surfing, and innumerable others. For this reason, adaptability is one of their main characteristics, and thus the camera is frequently mounted on body parts, pets, cars, drones, and other equipment, through the use of ad-hoc designed mounting devices. In fact, other types of cameras with superior performance and functionalities could be mounted instead, as in the case of drones of considerable size used for surveying. However, due to their weight and size, mounting and filming turn out to be challenging and difficult tasks. Furthermore, for most applications fully automatic video capture is a must in order to obtain optimized results anywhere and in any filming or burst-photography session. One feature sometimes desired is high-fps recording, allowing users to slow down videos while maintaining visual fluidity: a few sports cameras record up to 240fps for that effect, while others record at 30fps. Regarding the FOV, sports cameras in general utilize wide-angle and ultra-wide-angle lenses for the main reason of capturing as much of the action as possible. The main brands known for sports cameras are GoPro and Sony.

1.4.7. TYPES OF CAMERAS AND PLATFORMS IN PHOTOGRAMMETRY In the context of photogrammetric recording, every type of photographic camera is usable in a fully automatic pipeline (Argarwal et al., 2010; Snavely, n.d.; Snavely, Seitz, & Szeliski, 2006) due to developments in Computer Vision algorithms that allow the use of unordered sets of images obtained with several cameras, focal lengths, and lens distortions. Nevertheless, such automation does not necessarily mean that every image is within acceptable limits for a digital reconstruction. Depending on the requirements and outputs to produce, particular photographic equipment is selected to efficiently record the scenes, buildings, or objects; therefore, trends are observable. For example, sports cameras are valuable, effective, and practical to mount on lightweight drones (with a gimbal mount support) in order to conduct surveying works of buildings from diverse POV and cover hardly accessible areas (Küng et al., 2012; Vallet, Panissod, Strecha, & Tracol, 2012). Such an application is possible because sports cameras can be equipped with high-density sensors capable of obtaining images with 14MP, for instance. In general, sports cameras contain wide-angle lenses or even ultra-wide-angle lenses that provide additional information, convenient for increasing image correlation. Other types of cameras, such as DSLR and


MILC, can also be mounted on drones that support heavier weight and are, for that purpose, larger in size. Overall, minimizing weight on drones is a crucial factor to increase the time of flight and thus the total number of acquired images. It is also possible to use cameras from helicopters (Vozikis, 2007). See Figure 20. Other aerial platforms are also usable, as indicated by Ferreira (2011): balloon; airship; glider; wing glider; jet; gyroplane; helicopter; coaxial helicopter; quadcopter; and multicopter. Each platform has its own advantages and disadvantages considering factors such as reach, wind resistance, autonomy, manoeuvrability, and load capacity. As for terrestrial applications, professional grade cameras are preferred, such as DSLR instead of compact and bridge cameras. The reason for such preference is the higher quality manufacture of the lenses, and the possibility to change and adapt accessories or other lenses to overcome challenging situations that arise during the recording session. For instance, due to inaccessibility, the operator may require changing from a short focal length lens to a longer focal length lens to photograph from the same POV. Alternatively, in indoor spaces, the operator may require changing from a short focal length lens to a very short focal length lens. Moreover, it is also possible to mount the photographic camera on telescopic poles to record complex geometries (Covas, Ferreira, & Mateus, 2015) or when other surveying techniques are harder to use, such as TLS (Andrews, Bedford, & Bryan, 2013). There are also photographic cameras mounted on aeroplanes and satellites, but for the purposes of the current thesis, we focus on terrestrial and low-altitude aerial photogrammetry. It is in this context that the types of lenses are presented in the next section.

Figure 20: Various ways to photograph. From the top left to the bottom right. Terrestrial photogrammetry with the use of telescopic pole, handheld, and tripod; lastly, aerial photogrammetry with the use of a UAV (Fiorillo, Jiménez Fernández-Palacios, Remondino, & Barba, 2013).

1.5. TYPES OF LENSES Currently, there is an innumerable quantity of lenses available on the market, with distinct designs, materials, and constructions. Photographic lenses can be organized in two separate but interrelated ways: i) by their ability or inability to change focal length; ii) by the FOV associated with the focal length. Both structures are presented below (Langford, 2002).

1.5.1. PRIME LENSES VS ZOOM LENSES Concerning the first structure i), lenses in this group are categorized either as “Prime Lenses” or “Zoom Lenses”. Lenses in the first category have a fixed focal length, as opposed to lenses in the second category. The advantage of prime lenses shows in the superior quality of the images, particularly with high-end lenses. However, for the past few decades, casual photographers have preferred to purchase lenses whose focal length is variable, for it allows higher flexibility during the photographic acquisition, while the image still retains high quality due to the rigorous production of lenses. In the context of photogrammetry, the ideal equipment is a prime lens, because the design of prime lenses benefits from greatly reduced optical aberrations, allowing the creation of more accurate photogrammetric models. Zoom lenses, as mentioned, vary their focal distance, making their usage more convenient because the photographer can remain at one particular POV. Zoom lenses are bigger and bulkier, owing to the necessity of adjusting all of the lens elements forwards or backwards to change the focal length. In general, to use zoom lenses in photogrammetry, the minimum or maximum focal length is selected, since intermediate focal length values are difficult to define accurately. As a precaution, it is


recommended to tape the focal length ring so that it does not accidentally move (Shortis, Bellman, Robson, Johnston, & Johnson, 2006). Zoom lenses can span multiple lens categories, as described in the second structuring method ii): for example, a zoom lens can have a focal length ranging from wide-angle up to telephoto, or it can cover merely one category. Interestingly, there are lenses whose focal length varies so much that they are referred to as “Superzoom lenses”, which also reflects in poorer geometric consistency. Specifically, the optics are complex because, as the focal length is changed, the diameter of the diaphragm has to be automatically adjusted to maintain the aperture value constant.

1.5.2. THE FIVE TYPES OF LENSES Regarding method ii), lenses are divided into 5 categories depending on the FOV they provide: a) “Close-up” or “Macro Lens”; b) “Wide-Angle Lens”; c) “Ultra-Wide-Angle Lens”; d) “Normal Lens”; e) “Telephoto Lens”. It is important to note that the FOV is controlled by the focal length value combined with the size of the sensor.
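The dependence of the FOV on focal length and sensor size follows directly from the pinhole model discussed earlier. As a minimal illustrative sketch (the function name and example values are ours, not from the text):

```python
import math

def field_of_view_deg(focal_length_mm: float, sensor_side_mm: float) -> float:
    """Angle of view along one sensor side, from the pinhole camera model:
    FOV = 2 * atan(sensor_side / (2 * focal_length))."""
    return math.degrees(2 * math.atan(sensor_side_mm / (2 * focal_length_mm)))

# Horizontal FOV of a 24mm lens on an APS-C sensor (~24mm wide):
print(round(field_of_view_deg(24, 24), 1))   # 53.1 degrees
```

The same focal length therefore yields a narrower FOV on a smaller sensor, which is the basis of the crop-factor reasoning developed later in this annex.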

1.5.3. MACRO LENSES In the first category (a), macro lenses are, as a rule, used to create images in which the dimension of the represented object is equal to or greater than the dimension of the object in the scene – the magnification effect. There is no specific focal length for this type of lens, which can vary between 35mm and 180mm, for instance. Frequently, macro lenses are used to capture very small details, anything as small as a caterpillar or a plant. This is possible given the high power of convergence – refraction – of the lenses, which focus objects closer than any other “conventional” lens. Some advantages of macro lenses are related to their optics: images have greater contrast, colour, sharpness, and spatial resolution, with less vignetting and less geometric distortion. All in all, the shorter the focal length, the cheaper the lens and the closer the user has to be to the object. For photogrammetric purposes, macro lenses are suited to photographing small-sized objects, such as instruments, fossils, etc. (Samaan, Héno, Pierrot-Deseilligny, Pascal, & France, 2013; Yanagi & Chikatsu, 2010).

Figure 21: Comparison of the FOV of a full frame fisheye lens (from a Samyang 8mm lens) with a wide-angle lens (represented within the red limits - Nikkor 18mm).
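The 1:1 threshold that defines the macro range can be made concrete with the thin-lens model; this is a simplified sketch (real macro lenses are thick compound systems, and the distance here is measured to the front principal plane):

```python
def magnification(focal_mm: float, object_distance_mm: float) -> float:
    """Thin-lens magnification m = f / (d - f).
    The macro range begins where m >= 1, i.e. where the projected image
    of the object is at least as large as the object itself."""
    return focal_mm / (object_distance_mm - focal_mm)

# A 100mm macro lens focused at 200mm reaches 1:1 reproduction:
print(magnification(100, 200))   # 1.0
# The same lens focused at 300mm gives only half life-size:
print(magnification(100, 300))   # 0.5
```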

1.5.4. WIDE-ANGLE LENSES Concerning the second category (b), wide-angle lenses are characterized by a short focal length that provides a wider area of coverage than the so-called “normal” lenses. Such a FOV is advantageous if the user must be close to the object or needs to capture a larger scene when it is not possible to move farther away from the object. Note that, using the same sensor format and photographing from the same place, the GSD value increases as the FOV increases. A straightforward way to know whether a lens is wide-angle is to compare its focal length with the sensor format. As a rule of thumb, a wide-angle lens has a focal length shorter than or equal to the longer side of the sensor. For example, for an APS-C sensor, which is approximately 24mm by 16mm, a wide-angle lens has a focal length of 24mm or less. Another way of identifying a wide-angle lens is to analyse the produced image and observe whether it has a FOV of roughly 80° or more. For photogrammetry, wide-angle lenses are ideal for general application due to their flexible use, as they allow framing the whole object even when the user is fairly close to it. For instance, wide-angle lenses are good for photographing statues or buildings of considerable size.
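The trade-off between FOV and GSD noted above can be checked numerically with the standard pinhole relation; the function below is a sketch with illustrative values (10 m range, 4 µm photosites), not figures from the thesis:

```python
def ground_sampling_distance(distance_m: float, focal_length_mm: float,
                             pixel_pitch_um: float) -> float:
    """Object-space footprint of one pixel, in metres:
    GSD = distance * pixel_pitch / focal_length (pinhole model)."""
    return distance_m * (pixel_pitch_um * 1e-6) / (focal_length_mm * 1e-3)

# Same camera position (10 m) and 4 um photosites: the shorter focal
# length (wider FOV) yields a larger, i.e. coarser, GSD.
print(ground_sampling_distance(10, 24, 4))  # ~0.0017 m per pixel
print(ground_sampling_distance(10, 18, 4))  # ~0.0022 m per pixel
```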

1.5.5. FISHEYE LENSES Following the previous category are the ultra-wide-angle or fisheye lenses (c), which have even shorter focal lengths and tend to be prime lenses for better performance. This type of lens produces images with unusual distortions, in which the image looks as if it were projected over a spherical object, due to the very high FOV. It is common for fisheye lenses to have fixed focal lengths to better control image quality and optical aberrations, particularly chromatic aberration. Fisheye images can be of two types: “spherical fisheye” or “full-frame fisheye”. The difference between these two lenses lies in the area of the sensor used for projecting the image. In the spherical type, the FOV is equal in every direction, circumscribing the final circular image over a black rectangular background; the black areas correspond to unused pixels. In full-frame fisheye lenses, the widest FOV corresponds to the diagonal axis of the sensor, unlike the spherical fisheye lenses. A “true” fisheye lens is considered one capable of capturing 180° at the widest point. Some brands fabricate exclusive lenses with up to 220° of FOV, making said lenses bulky, heavy, and expensive. A fisheye lens can be recognized as such when the focal length is equal to or shorter than the shortest side of the imaging sensor. For example, a 15mm lens is considered a fisheye lens for an APS-C sensor (24mm x 16mm). Another way to identify one is by its typical coverage, with features represented as extremely curved lines. In the photogrammetric context, the major advantage of ultra-wide-angle lenses is the range of focus: as the focal length is very short, everything from roughly 30cm to infinity is in focus. For surveyors, who use photogrammetry as a surveying method, fisheye lenses allow recording scenes or buildings containing some degree of complexity that are difficult to capture using laser scanning or any other type of lens (Barazzetti, Previtali, & Roncoroni, 2017; Covas et al., 2015). The challenge associated with ultra-wide-angle lenses lies in accurately determining the distortion parameters in order to create an accurate point cloud. Given this, it is noted that the case studies in this thesis make use of fisheye lenses, exploring geometrical differences in the final models and potential applications alongside other surveying instruments.

Figure 22: Example of a fisheye image. Observe the curved horizon line.

1.5.6. NORMAL LENSES Category (d) includes lenses considered “normal” because the resulting images are comfortable for the human eye to observe and mimic its perspective effects. The use of normal lenses proves interesting and useful when the user wishes to illustrate certain elements of the environment or constructions with proportions close to what is perceived by the human eye. Lenses are considered “normal” when the focal length is approximately the length of the diagonal of the imaging sensor. For example, for APS-C imaging sensors, a normal lens has a focal length of 35mm. For full-frame sensors, whose diagonal measures approximately 43mm, normal lenses range between 35mm and 50mm of focal length.
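The rules of thumb given for fisheye, wide-angle, and normal lenses can be gathered into a rough classifier. This is a sketch built on the heuristics above; the boundaries (in particular the tolerance around the sensor diagonal) are indicative assumptions, not industry standards:

```python
import math

def classify_lens(focal_mm: float, sensor_w_mm: float, sensor_h_mm: float) -> str:
    """Rough lens category from the rules of thumb in the text:
    fisheye if f <= shorter sensor side, wide-angle if f <= longer side,
    normal if f is near the sensor diagonal, telephoto otherwise."""
    diagonal = math.hypot(sensor_w_mm, sensor_h_mm)
    if focal_mm <= min(sensor_w_mm, sensor_h_mm):
        return "fisheye / ultra-wide"
    if focal_mm <= max(sensor_w_mm, sensor_h_mm):
        return "wide-angle"
    if focal_mm <= 1.3 * diagonal:   # assumed tolerance around the diagonal
        return "normal"
    return "telephoto"

# APS-C (24 x 16 mm): a 15mm lens behaves as a fisheye, a 35mm as a normal lens.
print(classify_lens(15, 24, 16))   # fisheye / ultra-wide
print(classify_lens(35, 24, 16))   # normal
```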

1.5.7. TELEPHOTO LENSES Last but not least, the final category (e) includes the telephoto lenses, which offer a narrower FOV than all of the lenses previously mentioned. As the focal length increases, the field of view narrows, the range of focus shortens, and everything represented in the image becomes “flattened”. An interesting phenomenon to observe is that the distance between objects in the background and foreground seems shorter. In addition, objects seem closer to the camera, and the unfocused parts of the image are very blurred. A tripod should be used to avoid motion blur at such long focal lengths. Telephoto lenses are intended for shooting from medium to long distances. Note that as the focal length increases it becomes more difficult to capture light, consequently requiring physically larger lens elements and high-quality manufacturing techniques.


Figure 23: From left to right. Samyang 8mm fisheye lens; wide-angle lens AF Nikkor 20mm f/2.8D; and telephoto lens AF-P DX Nikkor 70-300mm f/4.5-6.3G ED VR. Images from URL: https://www.nikonusa.com https://www.samyanglensglobal.com

In fact, all of the lenses included in this category can be further subdivided into three groups: moderate telephoto, from 70mm up to 135mm of focal length; medium telephoto, from 135mm up to 300mm; and super telephoto, above 300mm. The FOV ranges from 30° down to 1°. For photogrammetric purposes, telephoto lenses are ideal for photographing distant objects which are impossible to approach due to safety or preservation conditions.


1.6. BRIEF INTRODUCTION TO IMAGING SENSORS An essential topic concerning photogrammetry relates to the employment of imaging sensors to acquire sharp images, a requirement for generating outputs with the highest levels of accuracy and precision possible. Imaging sensors are one of the contributing factors for successful reconstructions, apart from the underlying set of algorithms operating in the photogrammetry software. In the broadest definition, sensors have the primary function of reading information originated by external stimuli, events, or changes in the physical environment, and translating said information into a medium that is often, though not always, human-comprehensible. In other words, sensors provide data by converting external stimuli, considered input, into internal information, considered output, which is intelligible to a specific system. Currently, sensors are employed in a myriad of ordinary objects, so there is a great variety of sensors capable of capturing the most diverse information such as, but not limited to, movement, temperature, gravity, pressure, humidity, touch, velocity, sound, radiation, angle, displacement, distance, acceleration, and proximity. These stimuli may be captured by electronic or biological sensors. In essence, sensors are an extension of what the human body is capable of perceiving, transforming external information into computer-readable data that is simultaneously and conveniently human-intelligible. In remote sensing and photogrammetry, electronic sensors provide information from the electromagnetic spectrum. Said information is given in the form of photographic images, hyperspectral images, monochromatic images, thermal images, amongst other useful outputs. To achieve such results, remote sensing technology frequently integrates multiple sensors that may be complemented by other devices, therefore increasing the quality of the captured data.

1.6.1. IMAGING SENSORS AND OUTPUTS Sensors are composed of cells, organized in a two-dimensional regular grid, that are photosensitive to a specific range of the electromagnetic spectrum. The photosensitive cells, also known as photosites, are generally square in shape and register an intensity level of the electromagnetic spectrum. The reading of the intensity level is directly related to the reading capacity range of the photosite. After an exposure to a source of electromagnetic radiation, the intensity of said radiation is converted into an electrical charge, which is then


transformed into intelligible information or, technically, numerical values, thus resulting in the digital image (Mikhail, Bethel, & McGlone, 2001). In addition to the variation of the sensor reading range – defined with ISO values in photographic cameras –, sensors can likewise sense intensity values of a specific band of the electromagnetic spectrum. For example, ordinary images are composed of RGB colours, meaning that the creation of said image involved electrical impulses derived from specific ranges of the electromagnetic spectrum, namely red, green, and blue. In this context, depending on what they can capture, sensors can be categorized into “monochromatic sensors”, “panchromatic sensors”, “multispectral sensors”, or “hyperspectral sensors” (Mateus, 2012). Monochromatic sensors are sensitive only to one colour, that is, to a specific wavelength of the electromagnetic spectrum. Most sensors of 3D laser scanning stations provide monochromatic information. Panchromatic sensors are sensitive to an interval of the electromagnetic spectrum, creating images that support one colour scale sensitive to different levels of radiation intensity; an example of a panchromatic image is a grayscale image. Multispectral sensors, as suggested by the designation, contain cells sensitive to multiple intervals of the electromagnetic spectrum. The most commonly implemented channels are RGB, corresponding to the light visible to humans; nonetheless, more channels can be added beyond visible light. Multispectral sensors are built into most digital photographic cameras. Lastly, hyperspectral sensors follow the same logic as multispectral sensors, with the exception that they register several more channels, offering information from the non-visible spectrum. The collection of non-visible information raises a question regarding the practice of collecting such information, as it is non-visible to the human eye.
For such information to be used, the values must be mapped into the human-visible colour range (RGB colours). The resultant image is “unreal”, as the colours have been manipulated so they can be interpreted by a human operator. For example, registering the ultraviolet light reflected by a flower will display its petals in a gradient of colours, as opposed to the typically observed monochromatic shade.


1.6.2. SENSORS OF DIGITAL PHOTOGRAPHIC CAMERAS Different sensors, with their own advantages and disadvantages, are available to capture dissimilar, similar, or equal information. For photographic cameras, there are two main types of sensors, each processing information differently. The first type is the CMOS (“Complementary Metal-Oxide Semiconductor”) and the second is the CCD (“Charge-Coupled Device”). The main difference between the two is how the electrical charges are converted and transmitted. While in CCD sensors the electrical charges are transferred line by line to the border of the sensor to be read, CMOS sensors contain transistors (signal amplifiers) for each and every photosite that allow individual reading (Konecny, 2003). For this reason, energy consumption is considerably lower when compared to CCD sensors. Another advantage is the lower cost of manufacturing CMOS sensors, which in turn allows lower-priced photographic cameras to be placed on the market. Relative to size, CCD sensors are larger because they require a greater surface for the photosites, while CMOS sensors are smaller, as photosites can be densely packed. However, high density affects the quality of the resultant image: the more densely packed the photosites are, the noisier the results. Many digital photographic cameras available on the market make use of CMOS sensors due to their cost-benefit ratio relative to CCD sensors.

1.6.3. COLOR SEPARATION METHODS Another key point concerning imaging sensors is the method for separating the colours. There are three methods to be considered, and each affects the internal physical configuration of the photographic camera. See Figure 24. The first is “3CCD”, in which 3 CCD sensors are physically disaggregated and each sensor registers a specific interval of the electromagnetic spectrum (RGB). This is possible due to a prism that reflects the same image into the three separate sensors. In this method, non-interpolated RGB values are generated for each photosite, since these values are superimposed afterwards to form a pixel. The second method, and the most employed one, consists of the usage of the Bayer Filter Mosaic on a single sensor. This filter aggregates photosites in groups of 2x2 (RGBG) to generate the three final RGB colours. The green colour is repeated to best capture luminance values. There are other, uncommon pattern variations that, instead of repeating the green colour, use a fourth filter colour to detect luminance values. Despite the Bayer filter being the most employed method, its major disadvantage is the structural arrangement of the photosites. That is, for an intelligible output, the values acquired by each group of 2x2 photosites must be combined into 1 pixel through a process termed “demosaicing”. Yet, to obtain an image with the full pixel count, RGB values must be interpolated for the remaining photosites of each group used during the demosaicing process. The end result is an image whose effective spatial resolution is lower than it could be. The third colour acquisition technique, named Foveon X3, explores the innate capacity of light to penetrate through filters. In fact, this method can be understood as the overlapping of three filters into one. The Foveon X3 is only available for CMOS sensors and generates RGB values for each photosite. In essence, most current digital photographic cameras are equipped with CMOS sensors with a Bayer Filter Mosaic.

Figure 24: From left to right. Imaging sensor with Bayer filter (Langford, 2002); Foveon X3 colour separation method (http://www.foveon.com/).
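The 2x2 grouping described for the Bayer mosaic can be sketched as a naive “binning” demosaic, assuming an RGGB layout; real demosaicing algorithms instead interpolate a full RGB triplet for every photosite, so this is only a minimal illustration of the principle:

```python
import numpy as np

def rggb_to_rgb(raw: np.ndarray) -> np.ndarray:
    """Collapse each 2x2 RGGB block of a Bayer mosaic into one RGB pixel.
    Assumed layout per block: R G / G B. The two green samples are averaged,
    reflecting the repeated green channel used for luminance."""
    r = raw[0::2, 0::2]
    g = (raw[0::2, 1::2] + raw[1::2, 0::2]) / 2.0
    b = raw[1::2, 1::2]
    return np.dstack([r, g, b])

# One 2x2 mosaic block (R=200, G=100 and 120, B=50) becomes one RGB pixel
# with the two greens averaged to 110:
raw = np.array([[200.0, 100.0],
                [120.0,  50.0]])
print(rggb_to_rgb(raw).shape)   # (1, 1, 3)
```

Note that this halves the pixel count in each direction, which is why cameras interpolate instead of binning when they report the sensor's full resolution.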

1.6.4. FORMATS OF IMAGING SENSORS During the last few decades, with the shift from analogue to digital technologies, film has been replaced by digital imaging sensors, which are frequently and significantly smaller and come in diverse standard sizes: APS-C Nikon or Sony (1.5x crop factor); APS-C Canon (1.6x crop factor); Four Thirds (2x crop factor); 1’’ (2.7x crop factor); 1/1.7’’ (4.6x crop factor); 1/2.5’’ (6.0x crop factor). See Figure 25. These are the most popular sizes and are termed “crop sensors” because the format is smaller than Full Frame, the format of reference. As a rule, the greater the “crop factor”, the smaller the sensor. Note that larger sensors are expressed in mm while the smaller ones are expressed in inches. In general: DSLR cameras use Full Frame and APS-C sensors; MILC cameras use Full Frame, APS-C, and Four Thirds; compact cameras and smartphones use the smallest formats.
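The crop factors listed above are simply the ratio between the Full Frame diagonal and the diagonal of the smaller sensor; the sketch below verifies the familiar ~1.5x factor (the APS-C dimensions used are illustrative nominal values):

```python
import math

FULL_FRAME_DIAGONAL_MM = math.hypot(36, 24)   # ~43.27 mm, the format of reference

def crop_factor(sensor_w_mm: float, sensor_h_mm: float) -> float:
    """Crop factor = Full Frame diagonal / sensor diagonal."""
    return FULL_FRAME_DIAGONAL_MM / math.hypot(sensor_w_mm, sensor_h_mm)

def full_frame_equivalent(focal_mm: float, sensor_w_mm: float, sensor_h_mm: float) -> float:
    """Focal length giving the same FOV on a Full Frame body."""
    return focal_mm * crop_factor(sensor_w_mm, sensor_h_mm)

# Nikon/Sony APS-C (~23.6 x 15.6 mm) gives the familiar ~1.5x factor,
# so a 35mm lens frames like a ~54mm lens on Full Frame:
print(round(crop_factor(23.6, 15.6), 2))
print(round(full_frame_equivalent(35, 23.6, 15.6)))
```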


Figure 25: Standard sizes of imaging sensors. https://www.cambridgeincolour.com

In terms of monetary value, production costs for full frame cameras and lenses are higher because they require lenses of wider diameter and a body shaped for adequate light exposure. Assuming the same scenery, quantity of pixels, and camera settings except for the sensor format, full frame sensors deliver better results, particularly with low ISO values, because each individual pixel, being larger, senses more light and produces less noise. Consequently, full-frame cameras are more appropriate for low light conditions, even at high ISO values. However, this issue is debatable when comparing a Full Frame sensor with an APS-C sensor of higher pixel count, as more detail can be captured.



1.7. BIBLIOGRAPHY

Albertz, J., Wiedemann, A. (1996). From Analogue To Digital Close-Range Photogrammetry. First Turkish-German Joint Geodetic Days, 245–253. Retrieved from http://www.alwie.net/lit/IstCR.pdf Albertz, J. (2007). A Look Back 140 Years of “Photogrammetry” - Some Remarks on the History of Photogrammetry. Photogrammetric Engineering & Remote Sensing, 504–506. Retrieved from http://www.asprs.org/a/publications/pers/2007journal/may/lookback.pdf Andrews, D. P., Bedford, J., & Bryan, P. G. (2013). A Comparison of Laser Scanning and Structure From Motion As Applied To the Great Barn At Harmondsworth, UK. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences; XXIV International CIPA Symposium, XL(September). Argarwal, S., Furukawa, Y., Snavely, N., Curless, B., Seitz, S. M., & Szeliski, R. (2010). Reconstructing Rome. IEEE Computer Society, 43(6), 40–47. https://doi.org/10.1109/MC.2010.175 Barazzetti, L., Previtali, M., & Roncoroni, F. (2017). Fisheye lenses for 3D modeling: Evaluations and considerations. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences - ISPRS Archives, 42(2W3), 79–84. https://doi.org/10.5194/isprs-archives-XLII-2-W3-79-2017 Covas, J., Ferreira, V., & Mateus, L. (2015). 3D Reconstruction with Fisheye Images Strategies to Survey Complex Heritage Buildings. Proceedings of the 2nd International Congress on Digital Heritage 2015, 1, 123–126. D’Ayala, D., & Smars, P. (2003). Minimum requirement for metric use of non-metric photographic documentation. University of Bath Report, (July). Retrieved from http://scholar.google.com/scholar?hl=en&btnG=Search&q=intitle:Minimum+requirements+for+metric+use+of+non-metric+photographic+documentation#0 Docci, M., & Maestri, D. (2005). Manuale di Rilevamento Architettonico e Urbano (8th ed.). Roma: Laterza. Duerer, A. (1977). Short Chronological History of Photogrammetry. Proceedings of the XIII Congress of the International Society for Photogrammetry, 1–36. Ferreira, V. M. M. (2011). Planeamento Participativo e as Tecnologias de Informação - Promover o Entendimento do Planeamento Local pelos Cidadãos. Universidade Técnica de Lisboa - Instituto Superior Técnico. Fiorillo, F., Jiménez Fernández-Palacios, B., Remondino, F., & Barba, S. (2013). 3D Surveying and


Modelling of the Archaeological Area of Paestum, Italy. Virtual Archaeology Review, 4(8), 55–60. https://doi.org/10.4995/var.2013.4306 Konecny, G. (2003). Geoinformation: Remote Sensing, Photogrammetry and Geographic Information Systems. The Geographical Journal, 1–266. https://doi.org/10.4324/9780203469644 Küng, O., Strecha, C., Beyeler, A., Zufferey, J.-C., Floreano, D., Fua, P., & Gervaix, F. (2012). The Accuracy of Automatic Photogrammetric Techniques on Ultra-Light UAV Imagery. ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XXXVIII-1/C22, 125–130. https://doi.org/10.5194/isprsarchives-XXXVIII-1-C22-125-2011 Langford, M. (2002). Fotografia Básica (originally: Basic Photography) (5th ed.). Lisboa: Dinalivro. Linder, W. (2005). Digital Photogrammetry: A Practical Course. https://doi.org/10.1007/3-540-29153-9 Mateus, L. (2012). Contributos Para o Projecto de Conservação, Restauro e Reabilitação. Uma Metodologia Documental Baseada na Fotogrametria Digital e no Varrimento Laser 3D Terrestres. Faculdade de Arquitectura da Universidade Técnica de Lisboa. Retrieved from http://home.fa.utl.pt/~lmmateus/inv_cons/VOLUME_1_web.pdf & http://home.fa.utl.pt/~lmmateus/inv_cons/VOLUME_2_web.pdf Mikhail, E. M., Bethel, J. S., & McGlone, J. C. (2001). Introduction to Modern Photogrammetry. USA: John Wiley & Sons. Samaan, M., Héno, R., Pierrot-Deseilligny, M., Pascal, B., & France, C. (2013). Close-Range Photogrammetric Tools for Small 3D Archeological Objects. International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XL-5/W2, 2–6. Schenk, T. (2005). Introduction to Photogrammetry. Department of Civil and Environmental Engineering and Geodetic Science, The Ohio State University.
Retrieved from http://gscphoto.ceegs.ohio-state.edu/courses/GeodSci410/docs/GS410_02.pdf Shortis, M. R., Bellman, C. J., Robson, S., Johnston, G. J., & Johnson, G. W. (2006). Stability of Zoom and Fixed Lenses used with Digital SLR Cameras. International Archives of Photogrammetry, Remote Sensing, and Spatial Information Sciences, XXXVI(5), 285–290.


Snavely, N. (n.d.). Bundler: Structure from Motion (SfM) for Unordered Image Collections. Retrieved from http://www.cs.cornell.edu/~snavely/bundler/ Snavely, N., Seitz, S. M., & Szeliski, R. (2006). Photo Tourism: Exploring Photo Collections in 3D. ACM Transactions on Graphics, 25(3), 835. https://doi.org/10.1145/1141911.1141964 Strecha, C., Zoller, R., Rutishauser, S., Brot, B., Schneider-Zapp, K., Chovancova, V., … Glassey, L. (2015). Quality Assessment of 3D Reconstruction Using Fisheye and Perspective Sensors. ISPRS Annals of Photogrammetry, Remote Sensing and Spatial Information Sciences, II- 3/W4, 215–222. https://doi.org/10.5194/isprsannals-II-3-W4-215-2015 Vallet, J., Panissod, F., Strecha, C., & Tracol, M. (2012). Photogrammetric Performance of an Ultra Light Weight Swinglet “UAV.” ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XXXVIII-1/, 253–258. https://doi.org/10.5194/isprsarchives-XXXVIII-1-C22-253-2011 Vosselman, G., & Maas, H. G. (2014). Airborne and Terrestrial Laser Scanning. Vozikis, G. (2007). Using Hybrid Surveying Techniques for Documenting the Largest Ancient Theatre in Greece. XXI International CIPA Symposium. Wohlfeil, J., Strackenbrock, B., & Kossyk, I. (2013). Automated High Resolution 3D Reconstruction of Cultural Heritage Using Multi-Scale Sensor Systems and Semi-Global Matching. ISPRS - International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, XL-4/W4, 37–43. https://doi.org/10.5194/isprsarchives-XL-4- W4-37-2013 Yanagi, H., & Chikatsu, H. (2010). 3D Modeling of Small Objects Using Macro Lens in Digital Very Close Range Photogrammetry. International Archives of Photogrammetry, Remote Sensing and Spatial Information Sciences, XXXVIII, 617–622.


ANNEX 2 – DATA OF THE CASE STUDIES

2.1. EQUIPMENT

Table 1: Computers

COMPUTERS           *1 ASUS K56CB              *2 ASUS K46CM              *3 "HUMMER"
PROCESSOR           Intel® Core™ i7-3537U      Intel® Core™ i7-3517U      Intel® Xeon® E5645
CORES               2                          2                          24
OPERATING SYSTEM    Windows 10 (64-bit)        Windows 10 (64-bit)        Windows 7 (64-bit)
RAM                 8 GB                       8 GB                       96 GB
GPU                 NVIDIA® GeForce® GT 740M   NVIDIA® GeForce® GT 635M   NVIDIA® GeForce® GTX 570

Table 2: Photographic equipment

CAMERA & LENSES    *4 NIKON D3100    *5 NIKON DX AF-S NIKKOR 18-55MM    *6 SAMYANG 8MM F/3.5 UMC FISH-EYE CS II (180° FOV)

NIKON D3100: 14.2 megapixels; image sizes: Large 4608 × 3072, Medium 3456 × 2304, Small 2304 × 1536; format: APS-C; sensor type: CMOS; weight: 505 g; operating range: 0 to 40 °C.

NIKKOR 18-55MM: VR (Vibration Reduction); SWM (Silent Wave Motor); ASP (aspherical lens); A-M (auto-manual focus mode).

SAMYANG 8MM: UMC (Ultra Multi Coating); H-ASP (hybrid aspherical lenses); manual focus adjustment.


Table 3: Accessories

POLES          *1 2 METAL RODS    *2 3 ALUMINIUM RODS    *3 TELESCOPIC PAINTER POLE
TOTAL SIZE     2+ m               2+ m                   2.6 m

REMOTE SHUTTERS & LASER LEVEL    *4 HÄHNEL (HRN 280 PRO FOR NIKON)    *5 YONGNUO (RF-603N)    *6 POWERFIX (MODEL: Z30859)
RANGE                            2 m cable                            100 m wireless          -/-/-

TRIPOD & WAIST ACCESSORY (POLE) & MEASURING TAPE    *8 HAMA (TRAVELLER MINI PRO TRIPOD BLACK)    *9 TANWEER (TOOLS APRON)    *7 MEDID (FIBERGLASS STRAP 50 M)

Table 4: More survey equipment

LASER & GPS    *11 FARO FOCUS 3D S 120    *10 TRIMBLE GNSS R4    *12 TRIMBLE SLATE CONTROLLER


2.2. PHOTOGRAPHIC RECORDING OF THE WALL IN PAÇO D’ARCOS AND CALIBRATION

This section presents information on the photogrammetric survey of the wall in Paço D'Arcos and the corresponding calibration of the fisheye lens (Samyang 8mm). Two tables are provided. The first summarises the total number of captured images; the second shows sample images, organised according to the structure of the first table.
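The 180° field of view of the fisheye lens is only possible because fisheye lenses abandon the rectilinear (pinhole) projection. A minimal sketch of the difference, assuming ideal lens models (the Samyang 8mm is often described as close to a stereographic projection, which is an assumption here, not a claim from the survey itself):

```python
import math

def rectilinear_r(f_mm, theta_deg):
    # Ideal pinhole/rectilinear projection: r = f * tan(theta).
    # Diverges as theta approaches 90 degrees, so a 180-degree FOV is impossible.
    return f_mm * math.tan(math.radians(theta_deg))

def equidistant_r(f_mm, theta_deg):
    # Classic fisheye (equidistant) projection: r = f * theta.
    return f_mm * math.radians(theta_deg)

def stereographic_r(f_mm, theta_deg):
    # Stereographic fisheye projection: r = 2 * f * tan(theta / 2).
    return 2.0 * f_mm * math.tan(math.radians(theta_deg) / 2.0)

f = 8.0  # focal length in mm, matching the Samyang 8mm used in this survey
for theta in (10.0, 45.0, 80.0, 90.0):
    if theta < 90.0:
        print(f"{theta:5.1f} deg  rectilinear r = {rectilinear_r(f, theta):9.3f} mm")
    print(f"{theta:5.1f} deg  equidistant r = {equidistant_r(f, theta):6.3f} mm  "
          f"stereographic r = {stereographic_r(f, theta):6.3f} mm")
```

At 90° off-axis (a 180° field of view) the fisheye radii stay finite (about 12.6 mm and 16.0 mm for f = 8 mm), while the rectilinear radius grows without bound; this is why a rectilinear lens such as the 18mm cannot approach that coverage.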


Table 5: Index table of the collected data. From the photographic recording of a wall in Paço D'Arcos.

PHOTOGRAMMETRIC RECORDING    Nº PHOTOS
8MM                          73
8MM CONTINUATION             40
8MM PERSON                   13
8MM PIVOT                    18
18MM                         30
TOTAL                        174

Table 6: Table with sample images of the collected data. Photographic recording of a wall in Paço D'Arcos

PHOTOGRAMMETRIC RECORDING

8MM

8MM

8MM CONTINUATION

8MM

8MM PERSON

8MM

8MM PIVOT

8MM

18MM

8MM


2.3. PHOTOGRAPHIC RECORDING OF THE MEDALLIONS OF FAUL

This section presents the photographic information from the recording of the medallions of the Faculty of Architecture of the University of Lisbon.


Table 7: Index table of the collected data. From the photographic recording of the medallions of the Faculty of Architecture.

PHOTOGRAMMETRIC RECORDING                   Nº PHOTOS    LENSES (Nº PHOTOS)
COMPARISON 8MM VS 18MM                      16           8MM: 8; 18MM: 8
CLOSE-UP CONVERGENT AROUND THE MEDALLION    18           8MM: 18
GENERAL CONVERGENT AROUND THE MEDALLION     14           8MM: 14
TOTAL                                       48 PHOTOGRAPHS

Table 8: Table with sample images of the collected data. From the photographic recording of the medallions of Faculty of Architecture.

COMPARISON & CALIBRATION OF SAMYANG 8MM FISHEYE LENS

COMPARISON 8MM VS 18MM SAMYANG FISHEYE

8MM

NIKKOR 18-55MM

18MM

CONVERGENT CLOSE-UP CONVERGENT AROUND THE MEDALLION

8MM

GENERAL CONVERGENT AROUND THE MEDALLION

8MM


2.4. PHOTOGRAPHIC CALIBRATION OF GOPRO

This section presents information on the calibration of the GoPro lens used on the DJI Phantom 2 Vision+.
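Calibrating a camera such as the GoPro means estimating its interior orientation, including the lens distortion coefficients. As an illustrative sketch only (this is not the software or the coefficients used in this work; `k1` and `k2` below are made-up values), the widely used two-term radial distortion model maps normalized image coordinates as follows:

```python
import numpy as np

def apply_radial_distortion(points_xy, k1, k2):
    """Apply a two-term radial distortion model to normalized image
    coordinates: x_d = x * (1 + k1*r^2 + k2*r^4). A negative k1 gives the
    barrel distortion typical of short, wide-angle lenses."""
    pts = np.asarray(points_xy, dtype=float)
    r2 = np.sum(pts ** 2, axis=-1, keepdims=True)
    return pts * (1.0 + k1 * r2 + k2 * r2 ** 2)

# Hypothetical coefficients: the principal point is unaffected, while
# off-centre points are pulled inward (barrel distortion).
print(apply_radial_distortion([[0.0, 0.0], [0.5, 0.0]], k1=-0.3, k2=0.05))
```

Calibration software inverts this relationship: it searches for the coefficient values that best explain where known target points actually landed in the images.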


Table 9: Index table of the collected data. Photographic calibration of the GoPro camera.

CALIBRATION OF GOPRO    Nº PHOTOS    FROM WHICH (Nº PHOTOS)
FAUL MEDALLIONS         76           33 + 43
WORKSHOP                103          50 + 53
TOTAL                   179          83 + 96

Table 10: Table with sample images of the collected data. Photographic calibration of GoPro camera. CALIBRATION

FAUL MEDALLIONS GOPRO

NIKON D3100

18MM

WORKSHOP GOPRO

NIKON D3100

18MM


2.5. PHOTOGRAPHIC RECORDING OF THE CASTLE OF S. JORGE IN LISBON

This section presents information on the recording project of the Castle of S. Jorge in two tables with a hierarchical structure. The first table indicates the total number of photographs for the two approaches and the respective lenses; the second illustrates sample images for each approach.


Table 11: Index table of the collected data. From the photographic recording of The Castle of S. Jorge in Lisbon, Portugal.

CHEMINS DE RONDE

IMAGES POSITION            8MM    18MM    50MM
1M EXTENSION - INTERIOR    -      5       -
1M EXTENSION - EXTERIOR    -      17      -
2M EXTENSION - INTERIOR    -      5       -
2M EXTENSION - EXTERIOR    -      7       -
TOTAL                      -      35      -
OVERALL TOTAL: 35 PHOTOGRAPHS

Table 12: Table with sample images of the collected data. From the photographic recording of The Castle of S. Jorge in Lisbon, Portugal. EXTERIOR

1M EXTENSION TOP

18MM

EXTERIOR

18MM

2M EXTENSION TOP

18MM

EXTERIOR

18MM


2.6. PHOTOGRAPHIC RECORDING OF THE CASTLE OF SESIMBRA

This section presents information on the surveying project of the Castle of Sesimbra in three sets of tables with a hierarchical structure. The first table enumerates the approaches used to record the battlements and the citadel and, for each approach, indicates the total number of photographs and the lenses used. The second table indicates, for each approach, the number of photographs acquired from every point of view. The third table illustrates, for each approach, sample images for every point of view.


Table 13: Index table of the collected data. From the photographic recording of The Castle of Sesimbra, Portugal.

THE CASTLE OF SESIMBRA           Nº PHOTOS    8MM LENS    18MM LENS    50MM LENS
BATTLEMENTS 18MM 1ST APPROACH    573          -           573          -
BATTLEMENTS 18MM 2ND APPROACH    322          -           322          -
BATTLEMENTS 18MM 3RD APPROACH    258          -           258          -
BATTLEMENTS 8MM 1ST APPROACH     147          147         -            -
BATTLEMENTS 8MM 2ND APPROACH     199          199         -            -
BATTLEMENTS 8MM 3RD APPROACH     188          188         -            -
CITADEL                          2549         2245        253          51
TOTAL                            4236         2779        1406         51


Table 14: Table with detailed information about the collected data. From the photographic recording of The Castle of Sesimbra, Portugal.

BATTLEMENTS 18MM 1ST APPROACH

AREAS: BATTLEMENTS

SUB-AREA                               18MM
INTERIOR - LOW - DIVERGENT RIGHT       -
INTERIOR - LOW - DIVERGENT LEFT        44
INTERIOR - LOW - PERPENDICULAR         22
INTERIOR - LOW - PERPENDICULAR 2       24
INTERIOR - MEDIUM - DIVERGENT RIGHT    17
INTERIOR - MEDIUM - DIVERGENT LEFT     23
INTERIOR - MEDIUM - PERPENDICULAR      20
INTERIOR - HIGH - DIVERGENT RIGHT      17
INTERIOR - HIGH - DIVERGENT LEFT       -
INTERIOR - HIGH - PERPENDICULAR        -
TOP - DIVERGENT RIGHT                  28
TOP - DIVERGENT LEFT                   34
TOP - NADIR                            42
TOP - NADIR 2                          25
TOP - NADIR BETWEEN MERLONS            8
EXTERIOR - LOW - DIVERGENT RIGHT       34
EXTERIOR - LOW - DIVERGENT LEFT        -
EXTERIOR - LOW - PERPENDICULAR         66
EXTERIOR - LOW - PERPENDICULAR 2       85
EXTERIOR - MEDIUM - DIVERGENT RIGHT    -
EXTERIOR - MEDIUM - DIVERGENT LEFT     7
EXTERIOR - MEDIUM - PERPENDICULAR      -
EXTERIOR - HIGH - DIVERGENT RIGHT      27
EXTERIOR - HIGH - DIVERGENT RIGHT 2    23
EXTERIOR - HIGH - DIVERGENT LEFT       21
EXTERIOR - HIGH - PERPENDICULAR        6
TOTAL                                  573
OVERALL TOTAL: 573 PHOTOGRAPHS

BATTLEMENTS 18MM 2ND APPROACH

AREAS: BATTLEMENTS

SUB-AREA                               18MM
INTERIOR - LOW - DIVERGENT RIGHT       22
INTERIOR - LOW - DIVERGENT LEFT        27
INTERIOR - MEDIUM - DIVERGENT RIGHT    20
INTERIOR - MEDIUM - DIVERGENT LEFT     25
INTERIOR - HIGH - DIVERGENT RIGHT      27
INTERIOR - HIGH - DIVERGENT LEFT       24
TOP - NADIR                            42
TOP - NADIR BETWEEN MERLONS            -
EXTERIOR - LOW - DIVERGENT RIGHT       22
EXTERIOR - LOW - DIVERGENT LEFT        21
EXTERIOR - MEDIUM - DIVERGENT RIGHT    26
EXTERIOR - MEDIUM - DIVERGENT LEFT     21
EXTERIOR - HIGH - DIVERGENT RIGHT      24
EXTERIOR - HIGH - DIVERGENT LEFT       21
TOTAL                                  322
OVERALL TOTAL: 322 PHOTOGRAPHS

BATTLEMENTS 18MM 3RD APPROACH

AREAS: BATTLEMENTS, CHEMINS DE RONDE

SUB-AREA                               18MM
INTERIOR - LOW - DIVERGENT RIGHT       11
INTERIOR - LOW - DIVERGENT LEFT        10
INTERIOR - MEDIUM - DIVERGENT RIGHT    11
INTERIOR - MEDIUM - DIVERGENT LEFT     11
INTERIOR - HIGH - DIVERGENT RIGHT      11
INTERIOR - HIGH - DIVERGENT RIGHT 2    14
INTERIOR - HIGH - DIVERGENT LEFT       11
INTERIOR - HIGH - DIVERGENT LEFT 2     13
TOP - NADIR                            18
TOP - NADIR 2                          24
TOP - NADIR BETWEEN MERLONS            18
EXTERIOR - LOW - DIVERGENT RIGHT       10
EXTERIOR - LOW - DIVERGENT LEFT        10
EXTERIOR - MEDIUM - DIVERGENT RIGHT    10
EXTERIOR - MEDIUM - DIVERGENT LEFT     10
EXTERIOR - HIGH - DIVERGENT RIGHT      11
EXTERIOR - HIGH - DIVERGENT RIGHT 2    11
EXTERIOR - HIGH - DIVERGENT LEFT       10
EXTERIOR - HIGH - DIVERGENT LEFT 2     12
CHEMINS DE RONDE BASE                  9
CHEMINS DE RONDE BASE 2                13
TOTAL                                  258
OVERALL TOTAL: 258 PHOTOGRAPHS

BATTLEMENTS 8MM 1ST APPROACH

AREAS: BATTLEMENTS, CHEMINS DE RONDE

SUB-AREA                               8MM
INTERIOR - LOW - DIVERGENT RIGHT       -
INTERIOR - LOW - DIVERGENT LEFT        -
INTERIOR - LOW - PERPENDICULAR         12
INTERIOR - LOW - PERPENDICULAR 2       13
INTERIOR - MEDIUM - DIVERGENT RIGHT    24
INTERIOR - MEDIUM - DIVERGENT LEFT     22
INTERIOR - MEDIUM - PERPENDICULAR      -
INTERIOR - HIGH - DIVERGENT RIGHT      -
INTERIOR - HIGH - DIVERGENT LEFT       -
INTERIOR - HIGH - PERPENDICULAR        13
TOP - DIVERGENT RIGHT                  -
TOP - DIVERGENT LEFT                   -
TOP - NADIR                            12
EXTERIOR - LOW - DIVERGENT RIGHT       -
EXTERIOR - LOW - DIVERGENT LEFT        -
EXTERIOR - LOW - PERPENDICULAR         -
EXTERIOR - MEDIUM - DIVERGENT RIGHT    -
EXTERIOR - MEDIUM - DIVERGENT LEFT     -
EXTERIOR - MEDIUM - PERPENDICULAR      -
EXTERIOR - HIGH - DIVERGENT RIGHT      13
EXTERIOR - HIGH - DIVERGENT LEFT       13
EXTERIOR - HIGH - PERPENDICULAR        -
CHEMINS DE RONDE BASE                  25
TOTAL                                  147
OVERALL TOTAL: 147 PHOTOGRAPHS

BATTLEMENTS 8MM 2ND APPROACH

AREAS: BATTLEMENTS

SUB-AREA                               8MM
INTERIOR - LOW - PERPENDICULAR         14
INTERIOR - LOW - PERPENDICULAR 2       13
INTERIOR - MEDIUM - PERPENDICULAR      15
INTERIOR - MEDIUM - PERPENDICULAR 2    13
INTERIOR - HIGH - PERPENDICULAR        15
INTERIOR - HIGH - PERPENDICULAR 2      13
TOP - NADIR                            15
TOP - NADIR 2                          14
TOP - NADIR 3                          13
TOP - NADIR 4                          15
TOP - NADIR 5                          12
EXTERIOR - LOW - PERPENDICULAR         22
EXTERIOR - MEDIUM - PERPENDICULAR      14
EXTERIOR - HIGH - PERPENDICULAR        11
TOTAL                                  199
OVERALL TOTAL: 199 PHOTOGRAPHS

BATTLEMENTS 8MM 3RD APPROACH

AREAS: BATTLEMENTS

SUB-AREA                               8MM
INTERIOR - LOW - PERPENDICULAR         14
INTERIOR - MEDIUM - PERPENDICULAR      14
INTERIOR - HIGH - PERPENDICULAR        13
INTERIOR - HIGH - PERPENDICULAR 2      14
TOP - NADIR                            13
TOP - NADIR 2                          14
TOP - NADIR 3                          12
TOP - NADIR 4                          14
TOP - NADIR 5                          14
EXTERIOR - LOW - PERPENDICULAR         -
EXTERIOR - MEDIUM - PERPENDICULAR      12
EXTERIOR - HIGH - PERPENDICULAR        26
EXTERIOR - HIGH - PERPENDICULAR 2      28
TOTAL                                  188
OVERALL TOTAL: 188 PHOTOGRAPHS


CITADEL

AREAS               SUB-AREAS                               8MM     18MM    50MM
EXTERIOR            EXTERIOR FACE OF THE WALLS              140     80      38
                    TRANSITION TO EAST BATTLEMENTS          74      -       -
                    TRANSITION TO SOUTH BATTLEMENTS         48      -       -
                    TRANSITION TO COURTYARD                 43      -       -
COURTYARD           PIVOT                                   114     99      -
                    KEEP                                    -       -       9
                    TOWER                                   -       -       4
                    TRANSITION TO CHEMINS DE RONDE          39      -       -
CHEMINS DE RONDE    BATTLEMENTS - DIVERGENT RIGHT           168     -       -
                    BATTLEMENTS - DIVERGENT LEFT            171     -       -
                    BATTLEMENTS - TOP NADIR                 276     -       -
                    BATTLEMENTS - TOP EXTERIOR PERP.        155     -       -
                    PARAPET - DIVERGENT RIGHT               137     -       -
                    PARAPET - DIVERGENT RIGHT 2             194     -       -
                    PARAPET - DIVERGENT LEFT                184     -       -
                    PARAPET - PERPENDICULAR                 192     -       -
                    WALL TOWERS                             88      -       -
                    GROUND COVERAGE                         49      -       -
                    TRANSITION TO INTERIOR OF THE KEEP      28      18      -
KEEP                GENERAL PIVOT                           43      56      -
                    CLOSE-UP PIVOT                          102     -       -
TOTAL                                                       2245    253     51
OVERALL TOTAL: 2549 PHOTOGRAPHS


Table 15: Table with sample images of the collected data. From the photographic recording of The Castle of Sesimbra, Portugal.

BATTLEMENTS 18MM 1ST APPROACH

BATTLEMENTS INTERIOR - LOW - DIVERGENT RIGHT INTERIOR – LOW DIVERGENT LEFT

18MM

INTERIOR - LOW - PERPENDICULAR

18MM

INTERIOR - LOW - PERPENDICULAR 2

18MM

INTERIOR - MEDIUM -DIVERGENT RIGHT

18MM

INTERIOR - MEDIUM -DIVERGENT LEFT

18MM

INTERIOR - MEDIUM - PERPENDICULAR

18MM


INTERIOR - HIGH -DIVERGENT RIGHT

18MM

INTERIOR - HIGH -DIVERGENT LEFT INTERIOR - HIGH - PERPENDICULAR TOP -DIVERGENT RIGHT

18MM

TOP -DIVERGENT LEFT

18MM

TOP - NADIR

18MM

TOP - NADIR 2

18MM

TOP - NADIR BETWEEN MERLONS

18MM

EXTERIOR - LOW - DIVERGENT RIGHT

18MM

EXTERIOR - LOW - DIVERGENT LEFT


EXTERIOR - LOW - PERPENDICULAR

18MM

EXTERIOR - LOW - PERPENDICULAR 2

18MM

EXTERIOR - MEDIUM -DIVERGENT RIGHT EXTERIOR - MEDIUM -DIVERGENT LEFT

18MM

EXTERIOR - MEDIUM - PERPENDICULAR EXTERIOR - HIGH -DIVERGENT RIGHT

18MM

EXTERIOR - HIGH -DIVERGENT RIGHT 2

18MM

EXTERIOR - HIGH -DIVERGENT LEFT

18MM

EXTERIOR - HIGH - PERPENDICULAR

18MM


BATTLEMENTS 18MM 2ND APPROACH

BATTLEMENTS INTERIOR - LOW - DIVERGENT RIGHT

18MM

INTERIOR - LOW - DIVERGENT LEFT

18MM

INTERIOR - MEDIUM -DIVERGENT RIGHT

18MM

INTERIOR - MEDIUM -DIVERGENT LEFT

18MM

INTERIOR - HIGH -DIVERGENT RIGHT

18MM

INTERIOR - HIGH -DIVERGENT LEFT

18MM

TOP - NADIR

18MM

TOP - NADIR BETWEEN MERLONS


EXTERIOR - LOW - DIVERGENT RIGHT

18MM

EXTERIOR - LOW - DIVERGENT LEFT

18MM

EXTERIOR - MEDIUM -DIVERGENT RIGHT

18MM

EXTERIOR - MEDIUM -DIVERGENT LEFT

18MM

EXTERIOR - HIGH -DIVERGENT RIGHT

18MM

EXTERIOR - HIGH -DIVERGENT LEFT

18MM

BATTLEMENTS 18MM 3RD APPROACH

BATTLEMENTS INTERIOR - LOW - DIVERGENT RIGHT

18MM


INTERIOR - LOW - DIVERGENT LEFT

18MM

INTERIOR - MEDIUM -DIVERGENT RIGHT

18MM

INTERIOR - MEDIUM -DIVERGENT LEFT

18MM

INTERIOR - HIGH -DIVERGENT RIGHT

18MM

INTERIOR - HIGH -DIVERGENT RIGHT 2

18MM

INTERIOR - HIGH -DIVERGENT LEFT

18MM

INTERIOR - HIGH -DIVERGENT LEFT 2

18MM

TOP - NADIR

18MM


TOP - NADIR 2

18MM

TOP - NADIR BETWEEN MERLONS

18MM

EXTERIOR - LOW - DIVERGENT RIGHT

18MM

EXTERIOR - LOW - DIVERGENT LEFT

18MM

EXTERIOR - MEDIUM -DIVERGENT RIGHT

18MM

EXTERIOR - MEDIUM -DIVERGENT LEFT

18MM

EXTERIOR - HIGH -DIVERGENT RIGHT

18MM

EXTERIOR - HIGH -DIVERGENT RIGHT 2

18MM


EXTERIOR - HIGH -DIVERGENT LEFT

18MM

EXTERIOR - HIGH -DIVERGENT LEFT 2

18MM

CHEMINS DE RONDE BASE

18MM

CHEMINS DE RONDE BASE 2

18MM

BATTLEMENTS 8MM 1ST APPROACH

BATTLEMENTS INTERIOR - LOW - DIVERGENT RIGHT INTERIOR - LOW - DIVERGENT LEFT

INTERIOR - LOW - PERPENDICULAR

8MM

INTERIOR - LOW - PERPENDICULAR 2

8MM


INTERIOR - MEDIUM -DIVERGENT RIGHT

8MM

INTERIOR - MEDIUM -DIVERGENT LEFT

8MM

INTERIOR - MEDIUM - PERPENDICULAR INTERIOR - HIGH -DIVERGENT RIGHT INTERIOR - HIGH -DIVERGENT LEFT INTERIOR - HIGH - PERPENDICULAR

8MM

TOP -DIVERGENT RIGHT TOP -DIVERGENT LEFT TOP - NADIR

8MM

EXTERIOR - LOW - DIVERGENT RIGHT EXTERIOR - LOW - DIVERGENT LEFT EXTERIOR - LOW - PERPENDICULAR EXTERIOR - MEDIUM -DIVERGENT RIGHT EXTERIOR - MEDIUM -DIVERGENT LEFT EXTERIOR - MEDIUM - PERPENDICULAR EXTERIOR - HIGH -DIVERGENT RIGHT

8MM

EXTERIOR - HIGH -DIVERGENT LEFT

8MM

EXTERIOR - HIGH - PERPENDICULAR


CHEMINS DE RONDE BASE

8MM

BATTLEMENTS 8MM 2ND APPROACH

Battlements INTERIOR - LOW - PERPENDICULAR

8MM

INTERIOR - LOW - PERPENDICULAR 2

8MM

INTERIOR - MEDIUM - PERPENDICULAR

8MM

INTERIOR - MEDIUM - PERPENDICULAR 2

8MM

INTERIOR - HIGH - PERPENDICULAR

8MM

INTERIOR - HIGH - PERPENDICULAR 2

8MM


TOP - NADIR

8MM

TOP - NADIR 2

8MM

TOP - NADIR 3

8MM

TOP - NADIR 4

8MM

TOP - NADIR 5

8MM

EXTERIOR - LOW - PERPENDICULAR

8MM

EXTERIOR - MEDIUM - PERPENDICULAR

8MM

EXTERIOR - HIGH - PERPENDICULAR

8MM


BATTLEMENTS 8MM 3RD APPROACH

BATTLEMENTS INTERIOR - LOW - PERPENDICULAR

8MM

INTERIOR - MEDIUM - PERPENDICULAR

8MM

INTERIOR - HIGH - PERPENDICULAR

8MM

INTERIOR - HIGH - PERPENDICULAR 2

8MM

TOP - NADIR

8MM

TOP - NADIR 2

8MM

TOP - NADIR 3

8MM


TOP - NADIR 4

8MM

TOP - NADIR 5

8MM

EXTERIOR - LOW - PERPENDICULAR EXTERIOR - MEDIUM - PERPENDICULAR

8MM

EXTERIOR - HIGH - PERPENDICULAR

8MM

EXTERIOR - HIGH - PERPENDICULAR 2

8MM

CITADEL

EXTERIOR EXTERIOR FACE OF THE WALLS

8MM

18MM


50MM

TRANSITION TO EAST BATTLEMENTS

8MM

TRANSITION TO SOUTH BATTLEMENTS

8MM

TRANSITION TO COURTYARD

8MM

COURTYARD PIVOT

8MM

18MM

KEEP

50MM

TOWER

50MM


TRANSITION TO CHEMINS DE RONDE

8MM

CHEMINS DE RONDE BATTLEMENTS - DIVERGENT RIGHT

8MM

BATTLEMENTS - DIVERGENT LEFT

8MM

BATTLEMENTS - TOP NADIR

8MM

BATTLEMENTS - TOP EXTERIOR PERP.

8MM

PARAPET - DIVERGENT RIGHT

8MM

PARAPET - DIVERGENT RIGHT 2

8MM

PARAPET - DIVERGENT LEFT

8MM


PARAPET - PERPENDICULAR

8MM

WALL TOWERS

8MM

GROUND COVERAGE

18MM

TRANSITION TO INTERIOR OF THE KEEP

8MM

KEEP GENERAL PIVOT

8MM

18MM

CLOSE-UP PIVOT

8MM


2.7. PHOTOGRAPHIC RECORDING OF THE CONVENTO DOS CAPUCHOS IN SINTRA

This section presents information on the surveying project of the Convento dos Capuchos in two tables with a hierarchical structure. The first table indicates the total number of photographs for each sub-area or surface of the surveyed rooms; the second illustrates sample images for each area and sub-area or surface.


Table 16: Table with general information of the collected data. From the photographic recording of the Convento dos Capuchos in Sintra, Portugal.

CONVENTO DOS CAPUCHOS    Nº PHOTOS    8MM LENS    18MM LENS    50MM LENS
CONVENTO DOS CAPUCHOS    1699         1699        -            -
TOTAL                    1699         1699        -            -

Table 17: Table with detailed information of the collected data. From the photographic recording of the Convento dos Capuchos in Sintra, Portugal.

CONVENTO DOS CAPUCHOS

AREAS        SUB-AREA                     8MM
ROOMS        CELL                         18
             CHAPTERHOUSE                 18
             CHOIR                        85
             CHURCH                       120
             ENTRANCE                     230
             INFIRMARY                    75
             KITCHEN                      97
             LIBRARY                      114
             REFECTORY                    22
             ROOM 1                       37
             ROOM 2                       35
             SUPERIOR ROOM                56
             VISITOR'S ROOM 1             28
             VISITOR'S ROOM 2             21
             WATER CLOSET (WC)            130
CORRIDORS    DIVERGENT DOWNWARDS RIGHT    280
             DIVERGENT UPWARDS RIGHT      84
             DIVERGENT DOWNWARDS LEFT     192
             DIVERGENT UPWARDS LEFT       31
EXTERIOR     TRANSITION TO EXTERIOR       80
             EXTERIOR                     138
TOTAL                                     1699
OVERALL TOTAL: 1699 PHOTOGRAPHS


Table 18: Table with sample images of the collected data. From the photographic recording of the Convento dos Capuchos in Sintra, Portugal.

GENERAL SURVEY

ROOMS CELL

8MM

CHAPTERHOUSE

8MM

CHOIR

8MM

CHURCH

8MM

ENTRANCE

8MM

INFIRMARY

8MM

KITCHEN


8MM

REFECTORY

8MM

ROOM 1

8MM

ROOM 2

8MM

SUPERIOR ROOM

8MM

VISITOR’S ROOM 1

8MM

VISITOR’S ROOM 2

8MM

WATER CLOSET (WC)


8MM

CORRIDORS DIVERGENT DOWNWARDS RIGHT

8MM

DIVERGENT UPWARDS RIGHT

8MM

DIVERGENT DOWNWARDS LEFT

8MM

DIVERGENT UPWARDS LEFT

8MM

EXTERIOR TRANSITION TO EXTERIOR

8MM

EXTERIOR

8MM


2.8. PHOTOGRAPHIC RECORDING OF THE IGREJA DE STº ANDRÉ IN MAFRA

This section presents information on the recording project of the Igreja de Stº André in three tables with a hierarchical structure. The first table indicates the total number of photographs per lens and per approach. The second subdivides each approach into areas or surfaces and indicates the number of images acquired with each lens. The third illustrates sample images for each area or surface presented in the second table.


Table 19: Index table of the collected data. From the photographic recording of the Igreja de Stº André in Mafra, Portugal.

THE IGREJA DE STº ANDRÉ            Nº PHOTOS    8MM LENS    18MM LENS    50MM LENS
GENERAL SURVEY                     145          -           145          -
COMPLEX ELEMENTS                   173          80          93           -
HARD-TO-REACH AND NARROW SPACES    41           41          -            -
SCALABILITY                        271          126         145          -
UNSYSTEMATIC                       -            -           -            -
TOTAL                              630          247         383          -


Table 20: Table with detailed information of the collected data. From the photographic recording of the Igreja de Stº André in Mafra, Portugal.

GENERAL SURVEY

AREAS    SUB-AREA                    8MM    18MM    50MM
WALLS    SOUTH WALL                  -      31      -
         CONNECTION TO EAST WALL     -      17      -
         EAST WALL                   -      22      -
         CONNECTION TO NORTH WALL    -      11      -
         NORTH WALL                  -      34      -
         CONNECTION TO WEST WALL     -      13      -
         WEST WALL                   -      11      -
         CONNECTION TO SOUTH WALL    -      6       -
TOTAL                                -      145     -
OVERALL TOTAL: 145 PHOTOGRAPHS

COMPLEX ELEMENTS

AREAS     SUB-AREA                                8MM    18MM    50MM
DOOR 1    PERPENDICULAR + PARALLEL TO WALL        12     12      -
          GENERAL                                 11     16      -
          CONVERGENT PIVOT - HORIZONTAL FRAME     10     9       -
          CONVERGENT PIVOT - VERTICAL FRAME       10     13      -
DOOR 2    PERPENDICULAR + PARALLEL TO WALL        9      9       -
          GENERAL                                 11     13      -
          CONVERGENT PIVOT - HORIZONTAL FRAME     8      10      -
          CONVERGENT PIVOT - VERTICAL FRAME       9      11      -
TOTAL                                             80     93      -
OVERALL TOTAL: 173 PHOTOGRAPHS

HARD-TO-REACH SPACES AND NARROW SPACES

AREAS    SUB-AREA        8MM    18MM    50MM
ROOF     2M EXTENSION    41     -       -
TOTAL                    41     -       -
OVERALL TOTAL: 41 PHOTOGRAPHS

SCALABILITY

AREAS       SUB-AREA                            8MM    18MM    50MM
PAVEMENT    PAVEMENT                            74     145     -
            TRANSITION FROM PAVEMENT TO WALL    52     -       -
TOTAL                                           126    145     -
OVERALL TOTAL: 271 PHOTOGRAPHS


Table 21: Table with sample images of the collected data. From the photographic recording of the Igreja de Stº André in Mafra, Portugal.

GENERAL SURVEY

WALLS SOUTH WALL

18MM

CONNECTION TO EAST WALL

18MM

EAST WALL

18MM

CONNECTION TO NORTH WALL

18MM

NORTH WALL

18MM

CONNECTION TO WEST WALL

18MM

WEST WALL

18MM


CONNECTION TO SOUTH WALL

18MM

COMPLEX ELEMENTS

DOOR 1 PERPENDICULAR + PARALLEL TO WALL

8MM

18MM

GENERAL

8MM

18MM

CONVERGENT PIVOT – HORIZONTAL FRAME

8MM

18MM

CONVERGENT PIVOT - VERTICAL FRAME


8MM

18MM

DOOR 2 PERPENDICULAR + PARALLEL TO WALL

8MM

18MM

GENERAL

8MM

18MM

CONVERGENT PIVOT - HORIZONTAL FRAME

8MM

18MM


CONVERGENT PIVOT - VERTICAL FRAME

8MM

18MM

HARD-TO-REACH SPACES AND NARROW SPACES

ROOF 2M EXTENSION

8MM

SCALABILITY

PAVEMENT PAVEMENT

8MM

18MM

TRANSITION FROM PAVEMENT TO WALL

8MM


2.9. TERRESTRIAL LASER SCANNING OF THE CASTLE OF THE CONVENT OF CHRIST

This section presents information on the terrestrial laser scanning of the castle of the Convent of Christ in Tomar.


Table 22: Index table of the collected data. Terrestrial laser scanning of The Castle of Tomar.

AREAS             TOTAL SCANS    21-11-2013    9-4-2015    30-4-2015
CITADEL           1              1             -           -
KNIGHTS           5              -             -           5
MALL/PROMENADE    4              1             2           1
BETWEEN DOORS     3              3             -           -
EXTERIORS         7              1             6           -
TOTAL SCANS       20             6             8           6

Figure 26: Plan view of the terrestrial laser scanning of the Castle of Tomar.


Table 23: Table with detailed information of the collected data. Terrestrial laser scanning of The Castle of Tomar.

TERRESTRIAL LASER SCANNING    Nº POINTS      DATE OF RECORDING
_FARO_SCAN_163.PTX            20,634,914     21-11-2013
_FARO_SCAN_164.PTX            30,627,771     21-11-2013
_FARO_SCAN_165.PTX            23,100,312     21-11-2013
_FARO_SCAN_166.PTX            30,725,500     21-11-2013
_FARO_SCAN_170.PTX            21,326,478     21-11-2013
_FARO_SCAN_171.PTX            16,525,642     21-11-2013
MOST_048.PTX                  19,989,593     9-4-2015
MOST_049.PTX                  18,855,636     9-4-2015
MOST_050.PTX                  23,088,151     9-4-2015
MOST_051.PTX                  23,753,850     9-4-2015
MOST_052.PTX                  24,491,410     9-4-2015
MOST_053.PTX                  19,579,905     9-4-2015
MOST_054.PTX                  19,455,893     9-4-2015
MOST_055.PTX                  18,865,148     9-4-2015
MOST_087.PTX                  28,507,080     30-4-2015
MOST_088.PTX                  28,536,969     30-4-2015
MOST_089.PTX                  25,176,214     30-4-2015
MOST_091.PTX                  31,647,644     30-4-2015
MOST_092.PTX                  26,826,396     30-4-2015
MOST_093.PTX                  27,374,745     30-4-2015
TOTAL                         478,998,251
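The scans listed above were aligned and geo-referenced through rigid-body transformations; Table 24 stores one 4 x 4 homogeneous matrix per scan for each step. A minimal sketch of how such a matrix is applied to scan points, assuming the usual column-vector convention (rotation in the upper-left 3 x 3 block, translation in the last column):

```python
import numpy as np

def transform_points(matrix4x4, points):
    """Apply a 4x4 homogeneous transformation (rotation + translation),
    of the kind stored per scan in the alignment and geo-reference
    tables, to an Nx3 array of points."""
    m = np.asarray(matrix4x4, dtype=float)
    pts = np.asarray(points, dtype=float)
    homogeneous = np.hstack([pts, np.ones((pts.shape[0], 1))])  # Nx4
    return (homogeneous @ m.T)[:, :3]

# Illustrative matrix (not one from the survey): rotate 90 degrees
# about Z and translate by (10, 20, 5).
T = np.array([[0.0, -1.0, 0.0, 10.0],
              [1.0,  0.0, 0.0, 20.0],
              [0.0,  0.0, 1.0,  5.0],
              [0.0,  0.0, 0.0,  1.0]])
print(transform_points(T, [[1.0, 0.0, 0.0]]))  # transformed point (10, 21, 5)
```

Chaining the alignment and geo-reference matrices in this way moves each scan from its local scanner frame into the common, geo-referenced coordinate system.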


Table 24: Table with matrix transformation from the terrestrial laser scanning of The Castle of Tomar. For each scan, the 4 x 4 alignment and geo-reference matrices are given row by row, with rows separated by ";".

LASER SCANNING: MATRIX TRANSFORMATION (ALIGNMENT) AND MATRIX TRANSFORMATION (GEO-REFERENCE)

_FARO_SCAN_163.PTX
  ALIGNMENT:     [0.144742 0.989470 0.000488 1.192860; -0.989469 0.144742 0.000321 -38.012188; 0.000246 -0.000530 1.000000 -15.584333; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [0.951658 -0.307156 -0.000398 85.250712; 0.307155 0.951659 0.000428 75.730189; 0.000246 -0.000530 1.000000 112.239556; 0.000000 0.000000 0.000000 1.000000]
_FARO_SCAN_164.PTX
  ALIGNMENT:     [0.978619 0.205682 -0.000207 -22.599655; -0.205681 0.978619 -0.000168 -32.070133; 0.000168 0.000207 1.000001 -14.178490; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [0.040198 -0.999191 0.000200 83.345026; 0.999191 0.040199 -0.000176 51.281082; 0.000168 0.000207 1.000001 113.645399; 0.000000 0.000000 0.000000 1.000000]
_FARO_SCAN_165.PTX
  ALIGNMENT:     [0.922377 0.386290 -0.000680 -49.046581; -0.386291 0.922377 -0.000365 -24.154634; 0.000484 0.000599 1.000000 -11.665421; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [0.227643 -0.973743 0.000473 79.934436; 0.973744 0.227642 -0.000610 23.886534; 0.000484 0.000599 1.000000 116.158468; 0.000000 0.000000 0.000000 1.000000]
_FARO_SCAN_166.PTX
  ALIGNMENT:     [0.184331 -0.982864 -0.001065 -56.439823; 0.982865 0.184331 -0.000384 -7.230564; 0.000573 -0.000975 1.000000 -9.551448; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.999830 -0.018439 0.000556 64.474275; 0.018439 -0.999829 -0.000986 13.783721; 0.000573 -0.000975 1.000000 118.272441; 0.000000 0.000000 0.000000 1.000000]
_FARO_SCAN_170.PTX
  ALIGNMENT:     [0.245589 -0.969375 0.000545 -54.160759; 0.969373 0.245588 -0.001431 8.078071; 0.001252 0.000880 0.999999 -8.890163; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.996705 -0.081086 0.001321 48.999781; 0.081088 -0.996707 0.000775 13.487167; 0.001252 0.000880 0.999999 118.933726; 0.000000 0.000000 0.000000 1.000000]
_FARO_SCAN_171.PTX
  ALIGNMENT:     [0.012336 0.999924 -0.000247 -31.717541; -0.999924 0.012335 -0.000321 5.068904; -0.000319 0.000250 1.000001 6.941897; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [0.983970 -0.178327 0.000358 48.237581; 0.178328 0.983970 -0.000190 36.118363; -0.000319 0.000250 1.000001 134.765786; 0.000000 0.000000 0.000000 1.000000]
MOST_048.PTX
  ALIGNMENT:     [0.880297 -0.474423 -0.000013 39.039650; 0.474423 0.880298 -0.000132 24.167423; 0.000075 0.000110 1.000001 -7.796500; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.614110 -0.789220 0.000132 17.646480; 0.789219 -0.614111 0.000009 102.717960; 0.000075 0.000110 1.000001 120.027389; 0.000000 0.000000 0.000000 1.000000]
MOST_049.PTX
  ALIGNMENT:     [0.961447 -0.274991 0.000079 -12.885027; 0.274991 0.961447 -0.000103 -19.596922; -0.000049 0.000121 1.000001 -3.812025; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.430937 -0.902381 0.000088 69.430917; 0.902381 -0.430937 0.000095 58.787879; -0.000049 0.000121 1.000001 124.011864; 0.000000 0.000000 0.000000 1.000000]
MOST_050.PTX
  ALIGNMENT:     [0.961861 -0.273540 0.000411 -6.695651; 0.273540 0.961862 0.000007 -8.495797; -0.000399 0.000105 1.000001 -3.376403; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.429575 -0.903032 -0.000075 57.455627; 0.903031 -0.429575 0.000404 63.046452; -0.000399 0.000105 1.000001 124.447486; 0.000000 0.000000 0.000000 1.000000]
MOST_051.PTX
  ALIGNMENT:     [-0.603957 0.797021 0.000294 3.708075; -0.797018 -0.603956 0.000188 0.997452; 0.000329 -0.000122 1.000001 -5.871014; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [0.886299 0.463112 -0.000234 46.365532; -0.463114 0.886301 0.000259 71.727964; 0.000329 -0.000122 1.000001 121.952875; 0.000000 0.000000 0.000000 1.000000]
MOST_052.PTX
  ALIGNMENT:     [-0.725141 0.688600 0.000412 13.809134; -0.688601 -0.725141 0.000271 14.411262; 0.000488 -0.000087 1.000000 -5.776559; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [0.799527 0.600629 -0.000336 31.459688; -0.600629 0.799526 0.000361 79.459514; 0.000488 -0.000087 1.000000 122.047330; 0.000000 0.000000 0.000000 1.000000]
MOST_053.PTX
  ALIGNMENT:     [0.547178 -0.837016 0.000457 15.913668; 0.837014 0.547178 -0.000381 -0.381171; 0.000068 0.000593 1.000001 -10.836979; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.916303 -0.400478 0.000300 45.696709; 0.400478 -0.916305 0.000514 83.992932; 0.000068 0.000593 1.000001 116.986910; 0.000000 0.000000 0.000000 1.000000]
MOST_054.PTX
  ALIGNMENT:     [0.728955 0.684562 0.000358 -42.946335; -0.684560 0.728955 -0.001218 33.605839; -0.001096 0.000642 0.999999 -9.124761; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [0.553906 -0.832577 0.001142 21.963408; 0.832576 0.553908 0.000555 20.303552; -0.001096 0.000642 0.999999 118.699128; 0.000000 0.000000 0.000000 1.000000]
MOST_055.PTX
  ALIGNMENT:     [0.771316 -0.636453 0.000488 -36.176804; 0.636452 0.771315 -0.001780 49.736637; 0.000755 0.001684 0.999998 -9.090312; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.755776 -0.654827 0.001674 4.931976; 0.654828 -0.755777 0.000777 24.298401; 0.000755 0.001684 0.999998 118.733577; 0.000000 0.000000 0.000000 1.000000]
MOST_087.PTX
  ALIGNMENT:     [0.999998 0.001999 0.000043 -0.021721; -0.001999 0.999998 0.000906 -0.002647; -0.000041 -0.000906 1.000000 0.021981; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.164204 -0.986425 -0.000901 47.971528; 0.986425 -0.164204 -0.000108 68.216223; -0.000041 -0.000906 1.000000 127.84587; 0.000000 0.000000 0.000000 1.000000]
MOST_088.PTX
  ALIGNMENT:     [0.999999 0.001471 0.000321 -0.015020; -0.001471 0.999999 -0.000157 -0.006445; -0.000322 0.000156 1.000000 -0.005023; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.164725 -0.986338 0.000101 47.974159; 0.986338 -0.164725 0.000343 68.223462; -0.000322 0.000156 1.000000 127.818866; 0.000000 0.000000 0.000000 1.000000]
MOST_089.PTX
  ALIGNMENT:     [0.999999 0.001196 0.000197 -0.011453; -0.001196 0.999999 -0.000192 -0.005281; -0.000197 0.000192 1.000000 -0.003079; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.164996 -0.986293 0.000157 47.972419; 0.986293 -0.164996 0.000226 68.226786; -0.000197 0.000192 1.000000 127.820810; 0.000000 0.000000 0.000000 1.000000]
MOST_091.PTX
  ALIGNMENT:     [1.000000 0.000333 0.000074 -0.000096; -0.000333 1.000000 -0.000232 -0.004580; -0.000074 0.000232 1.000000 -0.002836; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.165848 -0.986150 0.000216 47.969840; 0.986150 -0.165848 0.000112 68.237868; -0.000074 0.000232 1.000000 127.821053; 0.000000 0.000000 0.000000 1.000000]
MOST_092.PTX
  ALIGNMENT:     [1.000000 0.000570 0.000104 -0.004493; -0.000570 1.000000 0.000005 -0.002172; -0.000104 -0.000005 1.000000 0.001159; 0.000000 0.000000 0.000000 1.000000]
  GEO-REFERENCE: [-0.165614 -0.986190 -0.000022 47.968196; 0.986190 -0.165614 0.000102 68.233132; -0.000104 -0.000005 1.000000 127.82504; 0.000000 0.000000 0.000000 1.000000]
MOST_093.PTX
  ALIGNMENT:     [1.000000 0.000000 0.000000 0.000000; 0.000000 1.000000 0.000000 0.000000; (remaining rows truncated in the source)]
  GEO-REFERENCE: [-0.166176 -0.986095 0 47.965308; (remaining rows truncated in the source)]
0.986095 -0.166176 0 68.237202 MOST_093.PTX 0.000000 0.000000 1.000000 0.000000 0 0 1 127.823889 0.000000 0.000000 0.000000 1.000000 0.000000 0.000000 0 1.000000 Notes: i) “Most_093.ptx” used as reference for the inter-alignment procedure. ii) “Most_093.ptx” contains the original geo-referencing matrix transformation values (last column) to calculate the final geo-referencing matrix transformation values of the other laser scanning point clouds. In other words, multiply the geo- referencing matrix transformation values of “Most_093.ptx” with the matrix values of the other point clouds found in the middle column to obtain the final matrix transformation values.
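The composition described in the note — multiplying the geo-referencing matrix of "Most_093.ptx" by each scan's inter-alignment matrix to obtain that scan's final geo-referencing matrix — is a plain 4×4 matrix product. A minimal sketch with hypothetical values (the matrices below are illustrative, not taken from the table):

```python
def matmul4(a, b):
    """Multiply two 4x4 matrices given as nested lists (row-major)."""
    return [[sum(a[i][k] * b[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

# Geo-referencing matrix of the reference scan (hypothetical values):
# a pure translation by (10, -5, 2).
georef_ref = [[1, 0, 0, 10.0],
              [0, 1, 0, -5.0],
              [0, 0, 1, 2.0],
              [0, 0, 0, 1.0]]

# Inter-alignment matrix of another scan relative to the reference
# (hypothetical values): a 90-degree rotation about Z plus a small shift.
align_other = [[0, -1, 0, 3.0],
               [1, 0, 0, 0.0],
               [0, 0, 1, 0.0],
               [0, 0, 0, 1.0]]

# Final geo-referencing matrix of the other scan: reference geo-referencing
# applied after the relative alignment.
georef_other = matmul4(georef_ref, align_other)
```

The order matters: the relative alignment is applied first, then the reference geo-referencing, so the reference matrix is the left factor.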

115


116

2.10. GPS RECORDING OF THE CASTLE OF TOMAR

This section presents information on the GPS recording of the Castle of Tomar.

117

Table 25: Index table of the collected data. GPS recording of The Castle of Tomar.

AREAS            Nº POINTS
CITADEL          28 (1-28)
KNIGHTS          11 (29-39)
MALL/PROMENADE   4 (40-43)
BETWEEN DOORS    9 (44-52)
EXTERIORS        9 (53-61)
TOTAL            61 POINTS

Figure 27: Plan view of the GPS recording with the positions of the GPS points.

118

Table 26: Table with sample images of the collected data. GPS recording of The Castle of Tomar.

GPS POINTS

CITADEL

1

2

3

4

5

6

7

8

119

9

10

11

12

13

14

15

16

17

120

18

19

20

21

22

23

24

25

26

121

27

28

KNIGHTS

29

30

31

32

33

34

35

122

36 N.D. N.D.

37 N.D. N.D.

38 N.D. N.D.

39 N.D. N.D.

MALL/PROMENADE

40

41

42

43

BETWEEN DOORS

44

123

45

46

47

48

49

50

51

52

EXTERIOR

53

124

54

55

56

57

58

59

60

61

125

Table 27: Table with coordinates information from the GPS recording of The Castle of Tomar. GPS POINTS ORIGINAL COORDINATES (X; Y; Z) TRANSFORMED COORDINATES (X; Y; Z) 1 -24425.2639; -7147.9465; 133.4682 67.5621; 56.9789; 133.4682 2 -24424.6085; -7150.9560; 133.5497 68.2176; 53.9694; 133.5497 3 -24422.9302; -7159.1795; 133.5444 69.8959; 45.7459; 133.5444 4 -24425.1101; -7171.3090; 133.4866 67.7160; 33.6164; 133.4866 5 -24425.3317; -7173.3036; 134.8642 67.4943; 31.6218; 134.8642 6 -24430.3485; -7175.1361; 135.3466 62.4776; 29.7893; 135.3466 7 -24428.0270; -7175.0809; 135.4071 64.7991; 29.8445; 135.4071 8 -24433.2930; -7174.8695; 135.0693 59.5331; 30.0559; 135.0693 9 -24436.4656; -7175.1981; 133.0053 56.3604; 29.7273; 133.0053 10 -24435.3295; -7175.4258; 132.9890 57.4966; 29.4996; 132.9890 11 -24447.7801; -7164.2061; 132.5143 45.0460; 40.7193; 132.5143 12 -24452.6003; -7158.1227; 132.8362 40.2257; 46.8027; 132.8362 13 -24454.0514; -7154.2670; 132.6417 38.7747; 50.6584; 132.6417 14 -24452.8403; -7152.1345; 132.7664 39.9858; 52.7909; 132.7664 15 -24443.2939; -7148.2360; 132.6187 49.5322; 56.6894; 132.6187 16 -24453.9322; -7153.9625; 131.9949 38.8938; 50.9629; 131.9949 17 -24448.5477; -7164.3583; 133.9455 44.2783; 40.5671; 133.9455 18 -24438.4169; -7174.0479; 134.0175 54.4092; 30.8775; 134.0175 19 -24430.4159; -7178.2223; 136.9270 62.4101; 26.7031; 136.9270 20 -24428.4246; -7172.9792; 136.9109 64.4014; 31.9462; 136.9109 21 -24423.2409; -7162.8263; 134.6595 69.5852; 42.0991; 134.6595 22 -24424.5811; -7149.9741; 134.6523 68.2449; 54.9513; 134.6523 23 -24434.4704; -7147.1196; 134.5657 58.3557; 57.8058; 134.5657 24 -24444-8262; -7157.5492; 144.1228 47.9999; 47.3762; 144.1228 25 -24445.4324; -7156.4890; 144.1124 47.3936; 48.4364; 144.1124 26 -24440.3947; -7151.5565; 144.3705 52.4314; 53.3689; 144.3705 27 -24446.0667; -7160.7033; 144.4402 46.7593; 44.2221; 144.4402 28 -24444.4890; -7153.0772; 144.2752 48.3370; 51.8482; 144.2752 29 -24424.5577; -7196.5790; 129.0236 31.7550; 
65.3172; 120.6689 30 -24460.9969; -7139.8974; 120.7043 31.8291; 65.0280; 120.7043 31 -24456.0029; -7142.5582; 120.9624 36.8232; 62.3672; 120.9624 32 -24448.7160; -7138.8724; 122.1389 44.1100; 66.0530; 122.1389 33 -24445.9335; .7141.8274; 122.1908 46.8925; 63.0980; 122.1908 34 -24451.5773; -7135.6096; 125.9704 41.2487; 69.3158; 125.9704 35 -24444.7692; -7139.7038; 126.0138 48.0569; 65.2216; 126.0138 36 -24457.7400; -7132.3277; 125.9991 35.0860; 72.5977; 125.9991 37 -24455.2578; -7133.7192; 126.0133 37.5682; 71.2062; 126.0133 38 -24452.2512; -7134.6810; 126.1094 40.5749; 70.2444; 126.1094 39 -24452.7026; -7135.5116; 126.0716 40.1234; 69.4138; 126.0716 40 -24439.4451; -7193.8552; 117,1876 53.3810; 11.0702; 117.1876 41 -24443.2112; -7182.2417; 117.1502 49.6149; 22.6837; 117.1502 42 -24444.6261; -7181.2178; 117.9077 48.1999; 23.7076; 117.9077 43 -24447.4380; -7179.9392; 117.1368 45.3880; 24.9862; 117.1368 44 -24415.7701; -7183.4788; 115.0088 77.0560; 21.4466; 115.0088 45 -24412.4776; -7178.2823; 114.3752 80.3484; 26.6431; 114.3752 46 -24410.3293; -7170.7503; 113.6333 82.4968; 34.1751; 113.6333 47 -24432.6745; -7185.2175; 124.8335 60.1516; 19.7079; 124.8335 48 -24422.0804; -7187.8315; 124.9506 59.7457; 17.0939; 124.9506 49 -24432.4862; -7193.6925; 124.8525 60.3399; 11.2329; 124.8525

126

50 -24421.9142; -7195.0346; 127.2422 70.9119; 09.8908; 127.2422 51 -24421.9142; -7195.0346; 127.2422 68.4112; 09.2064; 128.3526 52 -24424.5577; -7196.5790; 129.0236 68.2684; 08.3464; 129.0236 53 -24416.3843; -7139.0612; 112.7397 76.4418; 65.8642; 112.7397 54 -24462.1998; -7123.5734; 120.3438 30.6263; 81.3520; 120.3438 55 -24461.8380; -7126.0402; 120.7359 30.9881; 78.8852; 120.7359 56 -24426.1641; -7143.7232; 122.9700 66.6620; 61.2022; 122.9700 57 -24435.2607; -7142.2996; 122.8459 57.5653; 62.6258; 122.8459 58 -24439.3857; -7135.1139; 119.2021 53.4403; 69.8115; 119.2021 59 -24435.1532; -7136.6568; 118.4760 57.6728; 68.2686; 118.4760 60 -24461.7228; -7118.5689; 117.2741 31.1032; 86.3565; 117.2741 61 -24422.4808; -7134.9022; 112.1943 70.3453; 70.0232; 112.1943 Notes: The coordinates were transformed to the first quadrant to reduce the total length of the values due to limited number of input digits of the software.
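The first-quadrant translation described in the note amounts to adding a constant offset to each X and Y value (Z is unchanged). A minimal sketch, with the offset inferred from point 1 of Table 27 (other points may deviate in the last decimal due to rounding in the source table):

```python
# Offset inferred from point 1 of Table 27: transformed = original + offset.
# These constants are derived here for illustration, not given in the text.
OFFSET_X = 24492.8260
OFFSET_Y = 7204.9254

def to_first_quadrant(x, y, z):
    """Translate an original coordinate into the local first-quadrant frame,
    rounded to the 4 decimal places used in the table. Z stays unchanged."""
    return (round(x + OFFSET_X, 4), round(y + OFFSET_Y, 4), z)

# Point 1 from Table 27: (-24425.2639; -7147.9465; 133.4682)
point_1 = to_first_quadrant(-24425.2639, -7147.9465, 133.4682)
```

Shortening the coordinates this way keeps them within the limited number of input digits accepted by the processing software, as the note explains.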

127


128

2.11. FIRST PHOTOGRAPHIC RECORDING OF THE CASTLE OF TOMAR

The following pages present information regarding the first photographic recording of The Castle of Tomar. The information is given in two hierarchically structured tables. The first table is organized according to the areas identified in the general plan of the site and lists the total number of images acquired with each lens focal length. The second table contains sample images of the areas identified in the first table.

129

Table 28: Index table and charts of the collected data. From the first photographic recording of The Castle of Tomar.

THE CASTLE OF TOMAR                    Nº PHOTOS   8MM LENS   18MM LENS   50MM LENS
EXTERIOR                               306         –          306         –
BETWEEN DOORS                          165         –          165         –
BRIDGE 2                               36          –          36          –
MALL                                   91          –          91          –
CITADEL                                130         –          130         –
CITADEL – CHEMINS DE RONDE AND WALL    945         –          945         –
TRANSITION TO CHEMINS DE RONDE         80          –          80          –
UNSYSTEMATIC                           23          –          23          –
TOTAL                                  1776        –          1776        –

(Chart: percentage distribution of the acquired photographs per area.)


130

Table 29: Table with sample images from the collected data. From the first photographic recording of The Castle of Tomar.

PHOTOGRAMMETRIC SURVEYING

EXTERIOR WALL AND BATTER PERPENDICULAR

18MM

BETWEEN DOORS BASE OF THE WALL AND WALL

18MM

BRIDGE 2 TRANSITION

18MM

MALL WALL

18MM

CITADEL PIVOT

18MM

CITADEL – CHEMINS DE RONDE AND WALL PANORAMAS

18MM

131

TRANSITION TO CHEMINS DE RONDE TRANSITION

18MM

UNSYSTEMATIC

18MM

132

2.12. SECOND PHOTOGRAPHIC RECORDING OF THE CASTLE OF TOMAR

The following pages present information regarding the second photographic recording of The Castle of Tomar. The information is given in the form of two images of the general plan of the site, depicting the procedure for acquiring the photographic data, and three sets of hierarchically structured tables. The general plans identify important construction elements, paths to follow, and several areas that are subdivided into sub-areas and/or surfaces. The first table is organized according to the areas identified in the general plan of the site and lists the total number of images acquired with each photographic lens. The second table details the number of images acquired with each lens for each of the identified sub-areas and/or surfaces. Lastly, the third table contains sample images of the sub-areas and/or surfaces acquired with each type of lens.

133

Figure 28: First image of the pre-planning of the second recording of The Castle of Tomar.

134

Figure 29: Second image of the pre-planning of the second recording of The Castle of Tomar.

135

Table 30: Index table and charts of the collected data. From the second photographic recording of The Castle of Tomar.

THE CASTLE OF TOMAR                    Nº PHOTOS   8MM LENS   18MM LENS   50MM LENS
EXTERIOR                               889         351        420         118
BRIDGE 1                               208         118        90          –
BETWEEN DOORS                          738         129        511         98
BRIDGE 2                               298         247        51          –
MALL                                   268         117        128         23
TRANSITION – MALL TO KNIGHTS           32          32         –           –
KNIGHTS                                732         263        455         14
TRANSITION – KNIGHTS TO CITADEL        60          60         –           –
CITADEL                                682         682        –           –
CITADEL – CHEMINS DE RONDE AND WALL    2144        2144       –           –
CITADEL – KEEP                         1410        1410       –           –
UNSYSTEMATIC                           458         405        53          –
TOTAL                                  7919        5958       1708        253
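The column totals of Table 30 can be cross-checked by summing the per-area counts. A short sketch, with the (8 mm, 18 mm, 50 mm) counts transcribed from the table:

```python
# Per-area photograph counts (8 mm, 18 mm, 50 mm), transcribed from Table 30.
counts = {
    "EXTERIOR": (351, 420, 118),
    "BRIDGE 1": (118, 90, 0),
    "BETWEEN DOORS": (129, 511, 98),
    "BRIDGE 2": (247, 51, 0),
    "MALL": (117, 128, 23),
    "TRANSITION - MALL TO KNIGHTS": (32, 0, 0),
    "KNIGHTS": (263, 455, 14),
    "TRANSITION - KNIGHTS TO CITADEL": (60, 0, 0),
    "CITADEL": (682, 0, 0),
    "CITADEL - CHEMINS DE RONDE AND WALL": (2144, 0, 0),
    "CITADEL - KEEP": (1410, 0, 0),
    "UNSYSTEMATIC": (405, 53, 0),
}

# Sum each lens column, then the grand total across all lenses.
per_lens = [sum(c[i] for c in counts.values()) for i in range(3)]
total = sum(per_lens)
```

The sums reproduce the totals row of the table: 5958, 1708, and 253 photographs for the 8 mm, 18 mm, and 50 mm lenses respectively, 7919 in total.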

(Charts: percentage distribution of the acquired photographs per area and per lens.)

136

Table 31: Table with detailed information of the collected data. From the second photographic recording of The Castle of Tomar. EXTERIOR

AREAS SUB-AREA 8MM 18MM 50MM PATH (1) RAMP WALL AND BATTER PERPENDICULAR 237 36 WALL DETAILS 75 TRANSITION - BATTER TO PARAPET 21 PARAPET TOP 90 PARAPET PERPENDICULAR 133 PATH (2) ADJACENT TO WALL BASE OF THE WALL 111 TRANSITION - WALL TO PATH TOP 10 PATH TOP 81 STAIRS DIVERGENT RIGHT 23 STAIRS DIVERGENT LEFT 15 PATH (3) PARKING LOT WALL PERPENDICULAR 50 7 TOTAL 351 420 118 OVERALL TOTAL 889 PHOTOGRAPHS

BRIDGE 1

AREAS SUB-AREA 8MM 18MM 50MM BRIDGE 1 WALL 1 20 WALL 2 13 DOORWAY 44 57 CHEMIN DE RONDE BATTLEMENTS HORIZONTAL 35 BATTLEMENTS TOP 29 TRANSITION PHOTOGRAPHS 10 TOTAL 118 90 OVERALL TOTAL 208 PHOTOGRAPHS

BETWEEN DOORS

AREAS SUB-AREA 8MM 18MM 50MM WALL OF CITADEL BATTER AND WALL CONVERGENT RIGHT 112 BATTER AND WALL CONVERGENT LEFT 108 WALL DETAILS 59 SOUTHERN WALL WALL 46 39 BATTLEMENTS TRANSITION BATTLEMENTS TO BATTER 54 BATTLEMENTS DIVERGENT 178 BATTLEMENTS PERPENDICULAR 67 BATTLEMENTS TOP 75 TOTAL 129 511 98 OVERALL TOTAL 738 PHOTOGRAPHS

BRIDGE 2

AREAS SUB-AREA 8MM 18MM 50MM BRIDGE 2 WALL 1 6 WALL 2 10

137

DOORWAY 51 35 CHEMIN DE RONDE STAIRS DIVERGENT RIGHT 13 STAIRS DIVERGENT LEFT 15 BATTLEMENTS DIVERGENT RIGHT 46 BATTLEMENTS DIVERGENT LEFT 42 BATTLEMENTS TOP 80 TOTAL 247 51 OVERALL TOTAL 298 PHOTOGRAPHS

MALL

AREAS SUB-AREA 8MM 18MM 50MM PROMENADE PATH 1 WALL OF CITADEL 12 128 23 CONNECTION TO PATH 2 26 PATH 2 ADJACENT TO THE WALL 79 TOTAL 117 128 23 OVERALL TOTAL 268 PHOTOGRAPHS

TRANSITION – MALL TO KNIGHTS

AREAS SUB-AREA 8MM 18MM 50MM TRANSITION TRANSITION 32 TOTAL 32 OVERALL TOTAL 32 PHOTOGRAPHS

KNIGHTS

AREAS SUB-AREA 8MM 18MM 50MM GROUND LEVEL WALL 1 18 WALL 1 AND 2 5 WALL 2 11 WALL 2 AND 3 3 WALL 2 AND 4 5 WALL 3 4 WALL 4 18 WALL 4 AND 5 6 WALL 4 AND 6 12 WALL 5 4 WALL 6 28 WALL 6 AND 7 12 WALL 7 37 WALL 8 27 WALL 9 5 CITADEL CASTLE WALL FAR AWAY 9 CITADEL CASTLE WALL CLOSE UP 11 60 PERIMETER WALL 34 STAIRS 13 REMAINING STRUCTURES / STAIRS 57 COVERAGE OF THE FLOOR 29 82 SPIRAL STAIRCASE CLIMBING UP - LOOKING UPWARDS 37

138

CLIMBING UP - DIVERGENT RIGHT 36 CLIMBING UP - DIVERGENT LEFT 33 CLIMBING DOWN 30 TRANSITION TO SPIRAL STAIRCASE 25 CHEMIN DE RONDE DIVERGENT RIGHT 29 DIVERGENT LEFT 33 KEEP KEEP 14 TOTAL 263 455 14 OVERALL TOTAL 732 PHOTOGRAPHS

TRANSITION – KNIGHTS TO CITADEL

AREAS SUB-AREA 8MM 18MM 50MM TRANSITION TRANSITION 60 TOTAL 60 OVERALL TOTAL 60 PHOTOGRAPHS

CITADEL

AREAS SUB-AREA 8MM 18MM 50MM HOUSES CORRIDOR 23 WALL 1 18 WALL 1 AND 2 3 WALL 2 11 WALL 2 AND 3 4 WALL 3 24 WALL 3 AND 10 9 WALL 4 28 WALL 4 AND 5 3 WALL 4 AND 7 3 WALL 5 9 WALL 6 6 WALL 7 10 WALL 8 19 WALL 9 12 WALL 10 11 WALL 10 AND 16 3 WALL 11 31 WALL 11 AND 12 4 WALL 11 AND CORRIDOR 20 WALL 12 16 WALL 12 AND 14 4 WALL 13 17 WALL 13 (2) 20 WALL 13 EXTRA 16 WALL 13 AND 10 3 WALL 14 37 WALL 15 34 WALL 16 5 WALL 17 7

139

TRANSITION TRANSITION TO CITADEL 13 GROUND LEVEL KEEP - LOWER FAÇADE 55 WALL - DIVERGENT RIGHT 17 WALL - DIVERGENT LEFT 34 TUB 12 WALL TOWER WALL TOWER PARAPET 26 WALL TOWER PARAPET TOP EXTERIOR 17 TRANSITION TO WALL TOWER 98 TOTAL 682 OVERALL TOTAL 682 PHOTOGRAPHS

CITADEL – CHEMINS DE RONDE AND WALL

AREAS SUB-AREA 8MM 18MM 50MM CHEMIN DE RONDE PART 1 BATTLEMENTS DIVERGENT RIGHT 115 PART 1 BATTLEMENTS DIVERGENT LEFT 131 PART 1 BATTLEMENTS TOP 96 PART 1 BATTLEMENTS TOP EXTERIOR 46 PART 1 BATTLEMENTS TOP EXTERIOR 2 77 PART 1 BATTLEMENTS ARROW SLITS 48 PART 1 PARAPET 123 PART 2 BATTLEMENTS DIVERGENT RIGHT 88 PART 2 BATTLEMENTS DIVERGENT LEFT 120 PART 2 BATTLEMENTS TOP 110 PART 2 BATTLEMENTS TOP EXTERIOR 82 PART 2 BATTLEMENTS TOP EXTERIOR 2 113 PART 2 BATTLEMENTS ARROW SLITS 44 PART 2 PARAPET 96 TOWER BATTLEMENTS 97 TOWER BATTLEMENTS TOP 44 TOWER BATTLEMENTS TOP EXTERIOR 18 TOWER BATTLEMENTS TOP EXTERIOR 2 36 TOWER BATTLEMENTS ARROW SLITS 27 WALL INTERIORS WALL INTERIOR 1 108 WALL INTERIOR 2 66 WALL INTERIOR 3 102 TOWER INTERIOR 173 TRANSITION – STAIRS CLIMBING UP DIVERGENT LEFT 70 CLIMBING UP DIVERGENT RIGHT 23 CLIMBING DOWN 45 TRANSITION - TOWER LOOKING AT THE KEEP 43 TOTAL 2144 OVERALL TOTAL 2144 PHOTOGRAPHS

CITADEL KEEP

CITADEL KEEP SUB-AREA 8MM 18MM 50MM FLOOR -1 GENERAL 21 HORIZONTAL 55 UPWARDS 37

140

TRANSITION FLOOR -1 TO 0 CLIMBING LADDER 137 FLOOR 0 HORIZONTAL 38 UPWARDS 38 WINDOWS AND DOORS 99 TRANSITION FLOOR 0 TO EXTERIOR CLIMBING LADDER 31 TRANSITION FLOOR 1 TO EXTERIOR TRANSITION 39 FLOOR 1 HORIZONTAL 63 UPWARDS 49 TRANSITION FLOOR 1 TO FLOOR 2 CLIMBING STAIRS DIVERGENT LEFT 35 CLIMBING STAIRS DIVERGENT RIGHT 33 FLOOR 2 HORIZONTAL 56 UPWARDS 72 TRANSITION FLOOR 2 TO 3 CLIMBING STAIRS DIVERGENT LEFT 24 CLIMBING STAIRS DIVERGENT RIGHT 19 TRANSITION FLOOR 3 TO ROOFTOP CLIMBING UP WOODEN LADDER 119 FLOOR 3 / ROOFTOP LADDER 51 STONE 20 BATTLEMENTS DIVERGENT RIGHT 61 BATTLEMENTS DIVERGENT LEFT 68 BATTLEMENTS TOP 145 BATTLEMENTS TOP EXTERIOR 63 BATTLEMENTS ARROW SLITS 37 TOTAL 1410 OVERALL TOTAL 1410 PHOTOGRAPHS

141

Table 32: Table with sample images of the collected data. From the second photographic recording of The Castle of Tomar. EXTERIOR

PATH (1) RAMP WALL AND BATTER PERPENDICULAR

18MM

50MM

WALL DETAILS

50MM

TRANSITION - BATTER TO PARAPET

8MM

PARAPET TOP

8MM

PARAPET PERPENDICULAR

18MM

PATH (2) ADJACENT TO WALL BASE OF THE WALL

8MM

TRANSITION - WALL TO PATH TOP

142

8MM

PATH TOP

8MM

STAIRS DIVERGENT RIGHT

8MM

STAIRS DIVERGENT LEFT

8MM

PATH (3) PARKING LOT WALL PERPENDICULAR

18MM

50MM

BRIDGE 1

BRIDGE 1

WALL 1

8MM

WALL 2

143

8MM

DOORWAY

8MM

18MM

BETWEEN DOORS

WALL OF CITADEL BATTER AND WALL CONVERGENT RIGHT

18MM

BATTER AND WALL CONVERGENT LEFT

18MM

WALL DETAILS

50MM

SOUTHERN WALL WALL

18MM

144

50MM

BATTLEMENTS TRANSITION BATTLEMENTS TO BATTER

8MM

BATTLEMENTS DIVERGENT

18MM

BATTLEMENTS PERPENDICULAR

18MM

BATTLEMENTS TOP

8MM

BRIDGE 2

BRIDGE 2 WALL 1

18MM

WALL 2

18MM

DOORWAY

145

8MM

18MM

CHEMINS DE RONDE STAIRS DIVERGENT RIGHT

8MM

STAIRS DIVERGENT LEFT

8MM

BATTLEMENTS DIVERGENT RIGHT

8MM

BATTLEMENTS DIVERGENT LEFT

8MM

BATTLEMENTS TOP

8MM

MALL

PROMENADE PATH 1 WALL OF CITADEL

146

8MM

18MM

50MM

CONNECTION TO PATH 2

8MM

PATH 2 ADJACENT TO THE WALL

8MM

TRANSITION – MALL TO KNIGHTS

TRANSITION TRANSITION

8MM

KNIGHTS

GROUND LEVEL WALL 1

147

18MM

WALL 1 AND 2

18MM

WALL 2

18MM

WALL 2 AND 3

18MM

WALL 2 AND 4

18MM

WALL 3

18MM

WALL 4

18MM

WALL 4 AND 5

148

18MM

WALL 4 AND 6

18MM

WALL 5

18MM

WALL 6

18MM

WALL 6 AND 7

18MM

WALL 7

18MM

WALL 8

18MM

WALL 9

149

18MM

CITADEL CASTLE WALL FAR AWAY

18MM

CITADEL CASTLE WALL CLOSE UP

8MM

18MM

PERIMETER WALL

18MM

STAIRS

18MM

REMAINING STRUCTURES / STAIRS

18MM

COVERAGE OF THE FLOOR

8MM

150

18MM

SPIRAL STAIRCASE CLIMBING UP - LOOKING UPWARDS

8MM

CLIMBING UP - DIVERGENT RIGHT

8MM

CLIMBING UP - DIVERGENT LEFT

8MM

CLIMBING DOWN

8MM

TRANSITION TO SPIRAL STAIRCASE

8MM

CHEMIN DE RONDE DIVERGENT RIGHT

8MM

151

DIVERGENT LEFT

8MM

KEEP KEEP

50MM

TRANSITION – KNIGHTS TO CITADEL

TRANSITION TRANSITION

8MM

CITADEL

HOUSES CORRIDOR

8MM

WALL 1

8MM

WALL 1 AND 2

8MM

152

WALL 2

8MM

WALL 2 AND 3

8MM

WALL 3

8MM

WALL 3 AND 10

8MM

WALL 4

8MM

WALL 4 AND 5

8MM

WALL 4 AND 7

8MM

153

WALL 5

8MM

WALL 6

8MM

WALL 7

8MM

WALL 8

8MM

WALL 9

8MM

WALL 10

8MM

WALL 10 AND 16

8MM

154

WALL 11

8MM

WALL 11 AND 12

8MM

WALL 11 AND CORRIDOR

8MM

WALL 12

8MM

WALL 12 AND 14

8MM

WALL 13

8MM

WALL 13 (2)

8MM

155

WALL 13 EXTRA

8MM

WALL 13 AND 10

8MM

WALL 14

8MM

WALL 15

8MM

WALL 16

8MM

WALL 17

8MM

TRANSITION TRANSITION TO CITADEL

8MM

156

GROUND LEVEL KEEP - LOWER FAÇADE

8MM

WALL - DIVERGENT RIGHT

8MM

WALL - DIVERGENT LEFT

8MM

TUB

8MM

WALL TOWER WALL TOWER PARAPET

8MM

WALL TOWER PARAPET TOP EXTERIOR

8MM

TRANSITION TO WALL TOWER

8MM

157

CITADEL – CHEMINS DE RONDE AND WALL

CHEMINS DE RONDE PART 1 BATTLEMENTS DIVERGENT RIGHT

8MM

PART 1 BATTLEMENTS DIVERGENT LEFT

8MM

PART 1 BATTLEMENTS TOP

8MM

PART 1 BATTLEMENTS TOP EXTERIOR

8MM

PART 1 BATTLEMENTS TOP EXTERIOR 2

8MM

PART 1 BATTLEMENTS ARROW SLITS

8MM

PART 1 PARAPET

8MM

158

PART 2 BATTLEMENTS DIVERGENT RIGHT

8MM

PART 2 BATTLEMENTS DIVERGENT LEFT

8MM

PART 2 BATTLEMENTS TOP

8MM

PART 2 BATTLEMENTS TOP EXTERIOR

8MM

PART 2 BATTLEMENTS TOP EXTERIOR 2

8MM

PART 2 BATTLEMENTS ARROW SLITS

8MM

PART 2 PARAPET

8MM

159

TOWER BATTLEMENTS

8MM

TOWER BATTLEMENTS TOP

8MM

TOWER BATTLEMENTS TOP EXTERIOR

8MM

TOWER BATTLEMENTS TOP EXTERIOR 2

8MM

WALL INTERIORS WALL INTERIOR 1

8MM

WALL INTERIOR 2

8MM

WALL INTERIOR 3

8MM

160

TOWER INTERIOR

8MM

TRANSITION - STAIRS CLIMBING UP DIVERGENT LEFT

8MM

CLIMBING UP DIVERGENT RIGHT

8MM

CLIMBING DOWN

8MM

TRANSITION - TOWER LOOKING AT THE KEEP

8MM

CITADEL KEEP

FLOOR -1 GENERAL

8MM

HORIZONTAL

8MM

161

UPWARDS

8MM

TRANSITION FLOOR -1 TO 0 CLIMBING LADDER

8MM

FLOOR 0 HORIZONTAL

8MM

UPWARDS

8MM

WINDOWS AND DOORS

8MM

TRANSITION FLOOR 0 TO EXTERIOR CLIMBING LADDER

8MM

TRANSITION FLOOR 1 TO EXTERIOR TRANSITION

8MM

162

FLOOR 1 HORIZONTAL

8MM

UPWARDS

8MM

TRANSITION FLOOR 1 TO FLOOR 2 CLIMBING STAIRS DIVERGENT LEFT

8MM

CLIMBING STAIRS DIVERGENT RIGHT

8MM

FLOOR 2 HORIZONTAL

8MM

UPWARDS

8MM

TRANSITION FLOOR 2 TO 3 CLIMBING STAIRS DIVERGENT LEFT

8MM

163

CLIMBING STAIRS DIVERGENT RIGHT

8MM

TRANSITION FLOOR 3 TO ROOFTOP CLIMBING UP WOODEN LADDER

8MM

FLOOR 3 / ROOFTOP LADDER

8MM

STONE

8MM

BATTLEMENTS DIVERGENT RIGHT

8MM

BATTLEMENTS DIVERGENT LEFT

8MM

BATTLEMENTS TOP

8MM

164

BATTLEMENTS TOP EXTERIOR

8MM

BATTLEMENTS ARROW SLITS

8MM

165

Table 33: Table with detailed information concerning the first photogrammetric processing with Pix4D software. AREA SUB-PROJECTS PROJECTS EXTERIOR “TORRINHA_EXTERIOR” “EXTERIORES_CAMINHOA_MURETE_8MM” “MERGE_EXTERIORES2_GOOD”*3 “EXTERIORES_A3” “EXTERIORES” BETWEEN DOORS “PONTE1” “PONTE2” “MERGE_PORTAS_PONTES_2” *1 “PONTE2_2” “ENTREPORTAS” “_ENTREPORTAS_ATRAS” “_PORTS_TUDO”*4 “MERGE_PORTAS_PONTES_2” *1 MALL/PROMENADE “JARDINS” “MERGE_JAR_CAV”*2 & “CAVALEIROS” KNIGHTS “ESPIRAL_BOM” “_MERGE_JAR_CAV_ESP”*5 “MERGE_JAR_CAV” *2 CITADEL “ADARVE1” “ALCACOVA_CASAS2” “MERGE_ALCACOVA”*6 “ALCACOVA_CASAS3” “_MENAGEM_INTERIOR” “MERGE_EXTALC_TOR_INTMEN”*7 “TORRINHA_EXTERIOR” FINAL “__REENTRANCIA1” “__REENTRANCIA2” “__REENTRANCIA3” “__TORRINHA_INTERIOR” “__TOPOFORA2” “TUDO” “MERGE_EXTERIORES2_GOOD”*3 “_PORTS_TUDO”*4 “_MERGE_JAR_CAV_ESP”*5 “MERGE_ALCACOVA”*6 “MERGE_EXTALC_TOR_INTMEN”*7 TOTAL IMAGES 6769 ENABLED OUT OF 6930 AS INPUT Note: Photogrammetric processing executed without inserting GPS coordinates in Pix4D software. Sub-projects created and merged in a hierarchized order.

166

Table 34: Table with detailed information concerning the exports from the first photogrammetric processing with Pix4D software.

Nº   POINT CLOUD NAME                                Nº OF POINTS   SIZE (MB IN BINARY)
1    “TUDO_ADARVE_DENSIFIED_POINT_CLOUD_PART_1”      14,073,051     207
2    “TUDO_ADARVE_DENSIFIED_POINT_CLOUD_PART_2”      27,731,835     407
3    “TUDO_ALC_DENSIFIED_POINT_CLOUD”                5,401,689      80
4    “TUDO_CASAS_DENSIFIED_POINT_CLOUD”              55,018,208     806
5    “TUDO_CAV_DENSIFIED_POINT_CLOUD”                42,900,422     629
6    “TUDO_ENTREPORTAS_DENSIFIED_POINT_CLOUD”        63,943,884     937
7    “TUDO_EXTA_DENSIFIED_POINT_CLOUD”               42,495,613     623
8    “TUDO_EXTA2_DENSIFIED_POINT_CLOUD”              16,759,323     246
9    “TUDO_JARC_DENSIFIED_POINT_CLOUD”               7,932,067      117
10   “TUDO_JARC2_DENSIFIED_POINT_CLOUD”              13,943,875     205
11   “TUDO_MENEXT_DENSIFIED_POINT_CLOUD”             11,153,948     164
12   “TUDO_MENFACHADA_DENSIFIED_POINT_CLOUD”         11,407,518     168
13   “TUDO_MENINFERIOR_DENSIFIED_POINT_CLOUD”        22,213,294     326
14   “TUDO_MENSUPERIOR_DENSIFIED_POINT_CLOUD”        39,063,370     573
15   “TUDO_PONTE1_DENSIFIED_POINT_CLOUD”             3,643,993      54
16   “TUDO_TORREZECA_DENSIFIED_POINT_CLOUD”          8,427,123      124
TOTAL                                                386,109,213    5,666

Note: Pix4D software automatically determines the number of point clouds to export depending on the total number of groups of images. In the first photogrammetric processing, only one group of images was created and maximized outputs were selected.

167

Figure 30: Pix4D report from the first photogrammetric processing (spans 9 pages).

168


169

170

171

172

173

174

175

176

Table 35: Table with detailed information concerning the second photogrammetric processing with Pix4D software. AREA SUB-PROJECTS PROJECTS EXTERIOR “BEST_EXTNA1NA2NDOORS” BETWEEN DOORS “MER_BEST_EXT_50MM”” 1 MALL/PROMENADE “MER_TEST_BATTLSNEXT”* “BEST_PARKLOT” KNIGHTS MER BATTLES BATTLEMENTS “ _ 2” “MER_BATTLES2” “MER_TEST_BATTLSNCAVJAR”*2 “BEST_JARNCAVNPONTE” “MER_TEST_BATTLSNEXT”*1 “MER2_EXTNJARNCAVNBATTLS”*4 “MER_TEST_BATTLSNCAVJAR”*2 CITADEL “BEST_HOUSING” “BEST_INDOOR1” “MER_BEST_HOUS_INTS”*3 “BEST_INDOOR3” “MER_BEST_HOUS_INTS” *3 “MER2_BEST_CITNHOUSEINTS” *5 “MER_BEST_CITADEL” FINAL “MER3_EVERYTHING_GPSTOPROCESS” 4 “MER2_EXTNJARNCAVNBATTLS”* “MER3_EVERYTHING_GPS” “MER3_EVERYTHING” 5 “MER2_BEST_CITNHOUSEINTS”* “MER3_EVERYTHING_DENSE” (FINAL) TOTAL IMAGES 7176 ENABLED OUT OF 7576 AS INPUT Note: GPS coordinates were inserted in Pix4D software. Sub-projects were created and merged into a single and global project.

177

Table 36: Table with detailed information concerning the exports from the second photogrammetric processing with Pix4D software. Nº OF POINT CLOUD NAME POINTS SIZE (MB IN BINARY) 1 “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 194 1 SIFIED_POINT_CLOUD_PART_1” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 71,655,915 1050 -/- SIFIED_POINT_CLOUD_PART_2” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 14,331,276 210 2 SIFIED_POINT_CLOUD_PART_2_1” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 6,890,152 101 3 SIFIED_POINT_CLOUD_PART_2_2” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 19,484,579 286 4 SIFIED_POINT_CLOUD_PART_2_3” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 12,254,501 180 5 SIFIED_POINT_CLOUD_PART_2_4” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 8,751,820 129 6 SIFIED_POINT_CLOUD_PART_2_5” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 2,159,838 32 7 SIFIED_POINT_CLOUD_PART_2_CAV” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 2,796,925 41 8 SIFIED_POINT_CLOUD_PART_2_DOOR” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 2,500,480 37 9 SIFIED_POINT_CLOUD_PART_2_EXT” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 2,486,344 37 10 SIFIED_POINT_CLOUD_PART_2_MALL” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 2 1 11 SIFIED_POINT_CLOUD_PART_3” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 1 1 12 SIFIED_POINT_CLOUD_PART_4” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 255 1 13 SIFIED_POINT_CLOUD_PART_5” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 2 1 14 SIFIED_POINT_CLOUD_PART_6” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 1 1 15 SIFIED_POINT_CLOUD_PART_7” “MER3_EVERYTHING_DENSE_ALCACOVA_ADARVE_DEN 1 1 16 SIFIED_POINT_CLOUD_PART_8” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 87,612,071 1284 -/- OINT_CLOUD_PART_1” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 15.279,356 224 17 OINT_CLOUD_PART_1_1” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 20,133,755 295 18 OINT_CLOUD_PART_1_2” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 7,371,305 108 19 OINT_CLOUD_PART_1_3” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 19,867,738 
292 20 OINT_CLOUD_PART_1_4” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 24,959,917 366 21 OINT_CLOUD_PART_1_5” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 1,152 19 22 OINT_CLOUD_PART_2” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 5,318 80 23 OINT_CLOUD_PART_3” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 127 3 24 OINT_CLOUD_PART_4” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 1 1 25 OINT_CLOUD_PART_5” “MER3_EVERYTHING_DENSE_ALCACOVA_DENSIFIED_P 1 1 26 OINT_CLOUD_PART_6”

178

“MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 130,322 2 27 POINT_CLOUD_PART_1” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 79,996,019 1172 -/- POINT_CLOUD_PART_2” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 18,101,083 266 28 POINT_CLOUD_PART_2_1” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 12,359,310 182 29 POINT_CLOUD_PART_2_2” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 11,210,004 165 30 POINT_CLOUD_PART_2_3” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 19,847,668 291 31 POINT_CLOUD_PART_2_4” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 18,477,954 271 32 POINT_CLOUD_PART_2_STAIRS” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 8 1 33 POINT_CLOUD_PART_3” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 220 1 34 POINT_CLOUD_PART_4” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 31 1 35 POINT_CLOUD_PART_5” “MER3_EVERYTHING_DENSE_CAVALEIROS_DENSIFIED_ 271 1 36 POINT_CLOUD_PART_6” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 1 1 37 _POINT_CLOUD_PART_1” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 1 1 38 _POINT_CLOUD_PART_2” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 36,652,220 537 -/- _POINT_CLOUD_PART_3” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 3,619,885 54 39 _POINT_CLOUD_PART_3_1” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 8,542,404 126 40 _POINT_CLOUD_PART_3_2” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 24,489,931 359 41 _POINT_CLOUD_PART_3_3” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 3 1 42 _POINT_CLOUD_PART_4” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 83,450,719 1223 -/- _POINT_CLOUD_PART_5” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 8,772,390 129 43 _POINT_CLOUD_PART_5_1” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 20,652,990 303 44 _POINT_CLOUD_PART_5_2” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 17,846,570 262 45 _POINT_CLOUD_PART_5_3” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 33,844,885 496 46 _POINT_CLOUD_PART_5_4” “MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 2,333,884 35 47 _POINT_CLOUD_PART_5_5” 
“MER3_EVERYTHING_DENSE_ENTREPORTAS_DENSIFIED 1 1 48 _POINT_CLOUD_PART_6” “MER3_EVERYTHING_DENSE_EXTERIOR_A1_DENSIFIE 66,082,874 969 -/- D_POINT_CLOUD_PART_1” “MER3_EVERYTHING_DENSE_EXTERIOR_A1_DENSIFIE 17,631,237 259 49 D_POINT_CLOUD_PART_1_1” “MER3_EVERYTHING_DENSE_EXTERIOR_A1_DENSIFIE 12,925,990 190 50 D_POINT_CLOUD_PART_1_2” “MER3_EVERYTHING_DENSE_EXTERIOR_A1_DENSIFIE 13,139,224 193 51 D_POINT_CLOUD_PART_1_3” “MER3_EVERYTHING_DENSE_EXTERIOR_A1_DENSIFIE 22,386,423 328 52 D_POINT_CLOUD_PART_1_4”

179

(table continued)

No.  Point cloud name                                                  Points        Size (MB)
53   MER3_EVERYTHING_DENSE_EXTERIOR_A1_DENSIFIED_POINT_CLOUD_PART_2    115,573       2
54   MER3_EVERYTHING_DENSE_EXTERIOR_A1_DENSIFIED_POINT_CLOUD_PART_3    11            1
55   MER3_EVERYTHING_DENSE_EXTERIOR_A1_DENSIFIED_POINT_CLOUD_PART_4    46            1
56   MER3_EVERYTHING_DENSE_EXTERIOR_A1_DENSIFIED_POINT_CLOUD_PART_5    1             1
57   MER3_EVERYTHING_DENSE_EXTERIOR_A2_DENSIFIED_POINT_CLOUD_PART_1    15,255        1
58   MER3_EVERYTHING_DENSE_EXTERIOR_A2_DENSIFIED_POINT_CLOUD_PART_2    24,057,642    353
59   MER3_EVERYTHING_DENSE_EXTERIOR_A2_DENSIFIED_POINT_CLOUD_PART_3    5             1
60   MER3_EVERYTHING_DENSE_EXTERIOR_A2_DENSIFIED_POINT_CLOUD_PART_4    141           1
61   MER3_EVERYTHING_DENSE_EXTERIOR_A2_DENSIFIED_POINT_CLOUD_PART_5    18            1
62   MER3_EVERYTHING_DENSE_EXTERIOR_A3_DENSIFIED_POINT_CLOUD           1,182,396     18
63   MER3_EVERYTHING_DENSE_GROUP1_DENSIFIED_POINT_CLOUD                1,413,205     21
64   MER3_EVERYTHING_DENSE_JARDIM_C1_DENSIFIED_POINT_CLOUD_PART_1      18,949,692    278
65   MER3_EVERYTHING_DENSE_JARDIM_C1_DENSIFIED_POINT_CLOUD_PART_2      18            1
66   MER3_EVERYTHING_DENSE_JARDIM_C1_DENSIFIED_POINT_CLOUD_PART_3      3             1
67   MER3_EVERYTHING_DENSE_JARDIM_C1_DENSIFIED_POINT_CLOUD_PART_4      40            1
68   MER3_EVERYTHING_DENSE_JARDIM_C2_DENSIFIED_POINT_CLOUD             11,342,935    167
69   MER3_EVERYTHING_DENSE_MENAGEM_INF_DENSIFIED_POINT_CLOUD_PART_1    137,148       2
70   MER3_EVERYTHING_DENSE_MENAGEM_INF_DENSIFIED_POINT_CLOUD_PART_2    1             1
71   MER3_EVERYTHING_DENSE_MENAGEM_INF_DENSIFIED_POINT_CLOUD_PART_3    11,093        1
-/-  MER3_EVERYTHING_DENSE_MENAGEM_INF_DENSIFIED_POINT_CLOUD_PART_4    36,338,160    533
74   MER3_EVERYTHING_DENSE_MENAGEM_INF_DENSIFIED_POINT_CLOUD_PART_4_1  5,074,608     75
75   MER3_EVERYTHING_DENSE_MENAGEM_INF_DENSIFIED_POINT_CLOUD_PART_4_2  13,841,692    203
76   MER3_EVERYTHING_DENSE_MENAGEM_INF_DENSIFIED_POINT_CLOUD_PART_4_3  17,421,860    256
77   MER3_EVERYTHING_DENSE_MENAGEM_INF_DENSIFIED_POINT_CLOUD_PART_5    6,144         1
78   MER3_EVERYTHING_DENSE_MENAGEM_INF_DENSIFIED_POINT_CLOUD_PART_6    1             1
-/-  MER3_EVERYTHING_DENSE_MENAGEM_SUP_DENSIFIED_POINT_CLOUD_PART_1    61,796,660    906
79   MER3_EVERYTHING_DENSE_MENAGEM_SUP_DENSIFIED_POINT_CLOUD_PART_1_1  3,911,383     58
80   MER3_EVERYTHING_DENSE_MENAGEM_SUP_DENSIFIED_POINT_CLOUD_PART_1_2  32,284,164    473
81   MER3_EVERYTHING_DENSE_MENAGEM_SUP_DENSIFIED_POINT_CLOUD_PART_1_3  25,601,113    376
82   MER3_EVERYTHING_DENSE_MENAGEM_SUP_DENSIFIED_POINT_CLOUD_PART_2    1             1
83   MER3_EVERYTHING_DENSE_MENAGEM_SUP_DENSIFIED_POINT_CLOUD_PART_3    1             1
84   MER3_EVERYTHING_DENSE_MENAGEM_SUP_DENSIFIED_POINT_CLOUD_PART_4    1             1
85   MER3_EVERYTHING_DENSE_PONTE2_DENSIFIED_POINT_CLOUD                15,225,815    224
86   MER3_EVERYTHING_DENSE_REENT_1_DENSIFIED_POINT_CLOUD               11,467,918    168
87   MER3_EVERYTHING_DENSE_REENT_2_DENSIFIED_POINT_CLOUD               7,037,801     104
88   MER3_EVERYTHING_DENSE_REENT_3_DENSIFIED_POINT_CLOUD               10,480,160    154
89   MER3_EVERYTHING_DENSE_TORRE_INT_DENSIFIED_POINT_CLOUD             15,506,717    228
     TOTAL                                                             669,093,872   9,716 (binary) / 21,100 (ASCII)

The values in the TOTAL row do not take into account the point clouds identified with "-/-". The true total number of point clouds is 17, indicated with a *.

Note: The Pix4D software automatically determines the number of point clouds to export depending on the number of groups of images. In the second photogrammetric processing, several groups of images were created and maximized outputs were activated. Approximately 700 million points were computed, more than the roughly 400 million points obtained in the previous computation.


Figure 31: Pix4D report from the first photogrammetric processing (spans 14 pages)


2.13. PYTHON SCRIPTS - SKYREMOVER BUILDINGS

###import libs
import sys, os

###run program
if __name__ == '__main__':
    # Original_file = "dado.ply"
    Accept_file = "accepted_points.ply"
    Reject_file = "rejected_points.ply"

    Original_file = sys.argv[1]
    Accept_file = Original_file.split(".")[0] + "_accepted.ply"
    Reject_file = Original_file.split(".")[0] + "_rejected.ply"

    ###point clouds
    RawData = open(Original_file, "r")
    RawDataLines = RawData.readlines()

    ###what is header and what is not
    HeaderData = RawDataLines[:12]
    CloudData = RawDataLines[12:]

    ###file with accepted and rejected points
    Accepted = open(Accept_file, "w")
    Rejected = open(Reject_file, "w")

    ###count lines
    accepted_points = 0
    rejected_points = 0
    accepted_data = ""
    rejected_data = ""

    ###for each line then do
    for point in CloudData:
        Columns = point.split(" ")
        # print Columns
        red = int(Columns[3])
        green = int(Columns[4])
        blue = int(Columns[5])

        ###conditions
        if red <= green < blue and (green - red >= 25 or blue - green >= 25):
            rejected_data += point
        elif 160 <= red <= green < blue and (green - red >= 10 or blue - green >= 10):
            rejected_data += point
        elif red <= green <= blue and red in range(200, 256) and green in range(200, 256) and blue in range(200, 256):
            rejected_data += point
        elif red in range(235, 256) and green in range(235, 256) and blue in range(235, 256):
            rejected_data += point
        elif red in range(185, 256) and green in range(185, 256) and blue in range(185, 256) and (red - green <= 10 or red - blue <= 10 or green - red <= 10 or green - blue <= 10 or blue - red <= 10 or blue - green <= 10):
            rejected_data += point
        else:
            accepted_points += 1
            accepted_data += point

    ### fetch the number of points in the original point cloud.
    original_n_points = int(HeaderData[4].split(" ")[2])
    rejected_points = original_n_points - accepted_points

    ### create a header for the rejected point cloud.
    rejected_header = HeaderData
    rejected_header[4] = rejected_header[4].split("vertex")[0] + "vertex " + str(rejected_points) + "\n"
    rejected_header = ''.join(rejected_header)

    ### print out the rejected data as a point cloud.
    Rejected.write(rejected_header)
    Rejected.write(rejected_data)
    Rejected.close()

    ### replace the correct number of points in the accepted point cloud
    HeaderData[4] = HeaderData[4].split("vertex")[0] + "vertex " + str(accepted_points) + "\n"
    HeaderData = ''.join(HeaderData)

    ### print out the accepted data, including the header and points in the file
    Accepted.write(HeaderData)
    Accepted.write(accepted_data)
    Accepted.close()
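The first rejection rule in the script encodes the idea that sky pixels form a blue-dominant gradient (red ≤ green < blue with a gap of at least 25 levels between channels). A minimal Python 3 sketch, with a hypothetical helper `is_sky_blue` that is not part of the original script, lets the threshold be sanity-checked on individual RGB samples:

```python
def is_sky_blue(red, green, blue):
    # A point is treated as sky when the channels rise from red to blue
    # and at least one channel gap reaches 25 levels (blue-dominant gradient).
    return red <= green < blue and (green - red >= 25 or blue - green >= 25)

# A saturated sky blue is rejected; a sandstone-like wall tone is kept.
print(is_sky_blue(100, 150, 220))  # True  -> rejected as sky
print(is_sky_blue(180, 160, 140))  # False -> kept as building
```

Because the rule is purely per-point, it can be tuned on a handful of sampled colours before re-running the filter over a full cloud.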


2.14. PYTHON SCRIPTS - SKYREMOVER VEGETATION

###import libs
import sys, os

###run program
if __name__ == '__main__':
    # Original_file = "dado.ply"
    Accept_file = "accepted_points.ply"
    Reject_file = "rejected_points.ply"

    Original_file = sys.argv[1]
    Accept_file = Original_file.split(".")[0] + "_accepted.ply"
    Reject_file = Original_file.split(".")[0] + "_rejected.ply"

    ###point clouds
    RawData = open(Original_file, "r")
    RawDataLines = RawData.readlines()

    ###what is header and what is not
    HeaderData = RawDataLines[:12]
    CloudData = RawDataLines[12:]

    ###file with accepted and rejected points
    Accepted = open(Accept_file, "w")
    Rejected = open(Reject_file, "w")

    ###count lines
    accepted_points = 0
    rejected_points = 0
    accepted_data = ""
    rejected_data = ""

    ###for each line then do
    for point in CloudData:
        Columns = point.split(" ")
        # print Columns
        red = int(Columns[3])
        green = int(Columns[4])
        blue = int(Columns[5])

        ###conditions
        if 100 < red <= green < blue:
            rejected_data += point
        elif red <= green <= blue and red in range(200, 256) and green in range(200, 256) and blue in range(200, 256):
            rejected_data += point
        elif red in range(235, 256) and green in range(235, 256) and blue in range(235, 256):
            rejected_data += point
        elif green >= blue >= red > 140 and (green - blue < 15 or green - red < 15):
            rejected_data += point
        elif green >= red >= blue > 140 and (green - blue < 15 or green - red < 15):
            rejected_data += point
        elif red >= green >= blue > 140 and (red - green < 15 or red - blue < 15):
            rejected_data += point
        elif red >= blue >= green > 140 and (red - green < 15 or red - blue < 15):
            rejected_data += point
        else:
            accepted_points += 1
            accepted_data += point

    ### fetch the number of points in the original point cloud.
    original_n_points = int(HeaderData[4].split(" ")[2])
    rejected_points = original_n_points - accepted_points

    ### create a header for the rejected point cloud.
    rejected_header = HeaderData
    rejected_header[4] = rejected_header[4].split("vertex")[0] + "vertex " + str(rejected_points) + "\n"
    rejected_header = ''.join(rejected_header)

    ### print out the rejected data as a point cloud.
    Rejected.write(rejected_header)
    Rejected.write(rejected_data)
    Rejected.close()

    ### replace the correct number of points in the accepted point cloud
    HeaderData[4] = HeaderData[4].split("vertex")[0] + "vertex " + str(accepted_points) + "\n"
    HeaderData = ''.join(HeaderData)

    ### print out the accepted data, including the header and points in the file
    Accepted.write(HeaderData)
    Accepted.write(accepted_data)
    Accepted.close()
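Several of the rules above use the Python 2 idiom `red in range(200, 256)`, which is simply an integer-interval membership test (equivalent to `200 <= red <= 255`). A short Python 3 sketch, with a hypothetical `near_white` helper that is not in the original script, shows how the near-white rejection rule behaves:

```python
def near_white(red, green, blue, lo=200):
    # Membership in a range object is an O(1) interval test in Python 3;
    # in the original Python 2 script it builds a list, but behaves the same.
    return red in range(lo, 256) and green in range(lo, 256) and blue in range(lo, 256)

print(near_white(230, 240, 250))  # True  -> bright haze/cloud point, rejected
print(near_white(120, 200, 210))  # False -> kept
```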


2.15. PYTHON SCRIPTS – CONTOUR LINES

from __future__ import division
import math as mt
import os
from time import time

###define paths
PathToFolder_Main = str(raw_input("Path to folder containing all <.ply> files.\n >>"))
PathToFolder_Clouds = PathToFolder_Main + "\\ContourLines"
###input minimum/maximum height for cut
low = str(raw_input("Lowest altitude value.\n >>"))
high = str(raw_input("Highest altitude value.\n >>"))

lowf = float(low)
highf = float(high)
### Color for Meter
value = str(255)
### Color for HalfMeter
value0 = str(0)

if not os.path.exists(PathToFolder_Clouds):
    os.mkdir(PathToFolder_Main + "\\" + "ContourLines")
program_start = time()
for doc in os.listdir(PathToFolder_Main):
    start = time()
    if doc.endswith(".ply"):
        #print(doc)
        RawData_Original = doc
        RawData_File = open(PathToFolder_Main + "\\" + RawData_Original, "r")

        SectionCut_Create = PathToFolder_Clouds + "\\SectionCut.ply"
        SectionCut_File = open(SectionCut_Create, "a")

        Header = True

        for Point in RawData_File:
            if Header == True and Point == "end_header\n":
                Header = False
                continue
            elif Header == True and Point != "end_header\n":
                continue
            PointColumns = Point.split(" ")
            #print PointColumns
            Z = float(PointColumns[2])
            R = float(PointColumns[3])
            G = float(PointColumns[4])
            B = float(PointColumns[5])

            Zf = float(Z)
            Zi = int(Z)
            Zvalue = float(Zf - Zi)

            ###this way also works
            #for i in range (low, high):
            #    ai = float(i+0.03)
            #    i2 = float(i)
            #    if i2 < Z < ai:
            #        write

            m5 = Zi % 5

            if 0 < Zvalue < 0.03 and lowf < Z < highf and m5 == 0:
                PointColumns[3] = value
                PointColumns[4] = value0
                PointColumns[5] = value0
                FinalPoint = " ".join(PointColumns)
                SectionCut_File.write(str(FinalPoint))

            elif 0 < Zvalue < 0.03 and lowf < Z < highf:
                PointColumns[3] = value0
                PointColumns[4] = value
                PointColumns[5] = value0
                FinalPoint = " ".join(PointColumns)
                SectionCut_File.write(str(FinalPoint))

            elif 0.5 < Zvalue < 0.53 and lowf < Z < highf:
                PointColumns[3] = value0
                PointColumns[4] = value0
                PointColumns[5] = value
                FinalPoint = " ".join(PointColumns)
                SectionCut_File.write(str(FinalPoint))

        end = time()
        print "Finished %s. Took %f seconds" % (str(doc), end - start)
        SectionCut_File.close()


###Reopen SectionCut and create new SectionCut with Header

SectionCut_File = open(PathToFolder_Clouds + "\\SectionCut.ply", "r")

SectionCut_File_ReadLines = SectionCut_File.readlines()
NumberOfLines = len(SectionCut_File_ReadLines)

Header = "ply\nformat ascii 1.0\ncomment Author: CloudCompare (TELECOM PARISTECH/EDF R&D)\nobj_info Generated by CloudCompare!\nelement vertex %d\nproperty float x\nproperty float y\nproperty float z\nproperty uchar red\nproperty uchar green\nproperty uchar blue\nend_header\n" % (NumberOfLines)

SectionCut_Final = open(PathToFolder_Clouds + "\\SectionCut_Final.ply", "w")
SectionCut_Final.write(Header)

for line in SectionCut_File_ReadLines:
    SectionCut_Final.write(line)

SectionCut_File.close()
os.remove(PathToFolder_Clouds + "\\SectionCut.ply")
SectionCut_Final.close()
program_end = time()
print "Program took %f" % (program_end - program_start)
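The contour test above keys on the fractional part of each point's height: points within 3 cm above a whole metre are coloured as contours, with every fifth metre flagged as a major contour and half-metre bands coloured separately. A Python 3 sketch of that classification, using a hypothetical `contour_band` helper not present in the original script (the altitude-window check `lowf < Z < highf` is omitted for brevity):

```python
def contour_band(z):
    # Fractional part of the (non-negative) height, as in the script's Zvalue.
    frac = z - int(z)
    if 0 < frac < 0.03:
        # Major (red) contour every 5 m, minor (green) contour every 1 m.
        return "major" if int(z) % 5 == 0 else "minor"
    if 0.5 < frac < 0.53:
        # Blue half-metre contour.
        return "half-metre"
    return None

print(contour_band(15.02))  # major (15 is a multiple of 5)
print(contour_band(14.01))  # minor
print(contour_band(7.51))   # half-metre
```

Widening or narrowing the 0.03 m band directly controls the thickness of the drawn contour lines relative to the point density of the cloud.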


2.16. PYTHON SCRIPTS – PROCESS PLAN

###import libs
from __future__ import division
import math as mt
import os
from time import time
import numpy as np

###define path
PathToFolder_Main = str(raw_input("Path to folder containing all <.ply> files.\n >>"))

###open file with coordinates
InputCoords_Original = PathToFolder_Main + "\\Coordinates.txt"
InputCoords_File = open(InputCoords_Original, "r")
InputCoords_Lines = InputCoords_File.readlines()

###timer for overall processing
program_start = time()

for Coordinates in InputCoords_Lines:
    ###step01
    print "------STEP--01------\nGenerating Temp file with accepted points.\n"
    ###timer for each plan
    step01_start = time()
    ###read coordinates
    CoordinatesColumns = Coordinates.split(" ")
    Name = str(CoordinatesColumns[0])
    Az = float(CoordinatesColumns[1])

    ###create more paths
    PathToFolder_Clouds = str(PathToFolder_Main + "\\" + Name)
    PathToFolder_Cuts = str(PathToFolder_Clouds + "\\Cuts")

    ###create sub-folders
    if not os.path.exists(PathToFolder_Clouds):
        os.mkdir(PathToFolder_Clouds)
    if not os.path.exists(PathToFolder_Cuts):
        os.mkdir(PathToFolder_Cuts)

    ###search for cloud files in main folder
    for doc in os.listdir(PathToFolder_Main):
        ###timer for each processed cloud
        individualcloud_start = time()
        if doc.endswith(".ply"):
            print(doc)
            RawData_Original = str(doc)
            RawData_File = open(PathToFolder_Main + "\\" + RawData_Original, "r")

            ###create Temp file
            SectionCut_Temp_Create = PathToFolder_Cuts + "\\TEMP.ply"
            SectionCut_Temp = open(SectionCut_Temp_Create, "a")

            ###write acceptable points in Temp file
            Header = True
            for Point in RawData_File:
                if Header == True and Point == "end_header\n":
                    Header = False
                    continue
                elif Header == True and Point != "end_header\n":
                    continue
                PointColumns = Point.split(" ")
                Dz = float(PointColumns[2])

                ###keep points below the plan-cut altitude###
                if Az > Dz:
                    SectionCut_Temp.write(str(Point))

            ###close files
            RawData_File.close()
            SectionCut_Temp.close()

            ###timer
            individualcloud_end = time()
            print "%f seconds" % (individualcloud_end - individualcloud_start)

    ###timer
    step01_end = time()
    print "\nFinished STEP 01. Took %f seconds\n------\n" % (step01_end - step01_start)

    ###timer for step02
    step02_start = time()

    ###step02
    print "------STEP--02------\nReading Temp to generate files each with 10m interval.\n"

    ###read tempfile to generate files for each 10m interval
    TempFile = open(PathToFolder_Cuts + "\\TEMP.ply", "r")

    filelist = []
    for temps in TempFile:
        tempscolumns = temps.split(" ")
        Z = float(tempscolumns[2])

        Z10 = float(Z / 10)
        Z10int = int(Z10)
        Z10intmult = int(Z10int * 10)
        Lowint = int(Z10intmult)

        LowValueStr = str(Lowint)
        LowValueStr2 = str(LowValueStr + ".ply")
        SectionsIntervals = PathToFolder_Clouds + "\\%s" % (LowValueStr2)
        if SectionsIntervals not in filelist:
            filelist.append(SectionsIntervals)

    #print filelist

    openfiles = {}
    for item in filelist:
        filename = item.split("\\")[-1].split(".")[0]
        openfilename = open(item, "w")
        openfiles[filename] = openfilename
        print "Created %s" % (filename)

    ###close Temp file
    TempFile.close()

    ###timer for step02
    step02_end = time()
    print "\nFinished STEP 02. Took %f seconds\n------\n" % (step02_end - step02_start)

    ###timer for step03
    step03_start = time()

    ###STEP 03
    print "------STEP--03------\nSorting points into the different intervals\n"

    ###read tempfile points to write into the many 10m intervals
    TempFile = open(PathToFolder_Cuts + "\\TEMP.ply", "r")

    ###create and open plan-section cut
    SectionCut_CreateOrig = PathToFolder_Clouds + "\\SectionCutOriginal.ply"
    SectionCut_Original = open(SectionCut_CreateOrig, "a")

    for temps in TempFile:
        tempscolumns = temps.split(" ")
        Z = float(tempscolumns[2])

        Z10 = float(Z / 10)
        Z10int = int(Z10)
        Z10intmult = int(Z10int * 10)
        Lowint = int(Z10intmult)

        Azplus = float(Az - 0.03)

        if Az > Z and Z > Azplus:
            SectionCut_Original.write(str(temps))

        if Az > Z:
            LowValueStr = str(Lowint)
            openfiles[LowValueStr].write(temps)

    SectionCut_Original.close()
    TempFile.close()
    os.remove(PathToFolder_Cuts + "\\TEMP.ply")

    for item in openfiles:
        openfiles[item].close()

    ###timer for step 03
    step03_end = time()
    print "Finished STEP 03. Took %f seconds\n------\n" % (step03_end - step03_start)

    ###timer for step 04
    step04_start = time()

    ###Step04
    print "------STEP--04------\nWriting header of files. Erasing Temp files.\n"

    for ply in os.listdir(PathToFolder_Clouds):
        if ply.endswith(".ply"):
            PlyFile = ply
            PlyFile_File = open(PathToFolder_Clouds + "\\" + PlyFile, "r")

            count = 0
            for number in PlyFile_File:
                count += 1

            Header = "ply\nformat ascii 1.0\ncomment Author: CloudCompare (TELECOM PARISTECH/EDF R&D)\nobj_info Generated by CloudCompare!\nelement vertex %d\nproperty float x\nproperty float y\nproperty float z\nproperty uchar red\nproperty uchar green\nproperty uchar blue\nend_header\n" % (count)

            PlyFinalName = PathToFolder_Clouds + "\\" + PlyFile + "Final.ply"
            PlyFile_Final = open(PlyFinalName, "w")
            PlyFile_Final.write(Header)

            PlyFile_File = open(PathToFolder_Clouds + "\\" + PlyFile, "r")
            for line in PlyFile_File:
                PlyFile_Final.write(line)

            PlyFile_File.close()
            os.remove(PathToFolder_Clouds + "\\" + PlyFile)
            PlyFile_Final.close()

    ###timer for step04
    Step04_end = time()
    print "Finished STEP 04. Took %f seconds\n------\n" % (Step04_end - step04_start)

InputCoords_File.close()

###TIME
program_end = time()
print "TOTAL TIME\nProgram took %f seconds" % (program_end - program_start)
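Steps 02 and 03 name the interval files by truncating each height to its 10 m band (`int(Z/10)*10`). A Python 3 sketch of that binning, using a hypothetical `interval_floor` helper not present in the original script; note that `int()` truncates toward zero, so for negative heights the result differs from a true floor (`math.floor`) binning:

```python
def interval_floor(z):
    # Truncate the height to the lower bound of its 10 m interval,
    # mirroring the script's int(Z/10)*10 file-naming rule.
    return int(z / 10) * 10

print(interval_floor(37.4))  # 30 -> point goes to "30.ply"
print(interval_floor(-3.2))  # 0 with truncation (math.floor would give -10)
```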



2.17. PYTHON SCRIPTS – PROCESS SECTION CUTS

from __future__ import division
import math as mt
import os
from time import time
import numpy as np

###define paths
PathToFolder_Main = str(raw_input("Path to folder containing all <.ply> files.\n >>"))

#track time
program_start = time()

#create log
Log_Create = PathToFolder_Main + "\\" + "__log_Plans.txt"
Log = open(Log_Create, "a")

#locate and open file with coordinates
InputCoords_Original = PathToFolder_Main + "\\CoordinatesSection.txt"
InputCoords_File = open(InputCoords_Original, "r")

for Coordinates in InputCoords_File:
    start = time()
    CoordinatesColumns = Coordinates.split(" ")
    #print CoordinatesColumns
    Name = str(CoordinatesColumns[0])
    Ax = float(CoordinatesColumns[1])
    Ay = float(CoordinatesColumns[2])
    Bx = float(CoordinatesColumns[3])
    By = float(CoordinatesColumns[4])
    Cx = float(CoordinatesColumns[5])
    Cy = float(CoordinatesColumns[6])
    try:
        E = float(CoordinatesColumns[7])
        #print(E)
    except IndexError:
        E = 0
        pass

    ### DETERMINE MIDDLE POINT "M" ###
    Mxo = float((Ax + Bx) / 2)
    Myo = float((Ay + By) / 2)
    ### ROTATE relative to ORIGIN ###
    Base = abs((float(mt.sqrt(((Bx - Ax)**2) + ((By - Ay)**2)))))

    #for positive slope sectioncuts, rotation counter clockwise
    if (Ax < Bx and Ay < By) or (Bx < Ax and By < Ay):
        cos = abs(float((Bx - Ax) / Base))
        sin = abs(float((By - Ay) / Base))

        ### where the SECTION CUT is LOOKING at ###
        ### translation ###
        Cxo = float(Cx - Mxo)
        Cyo = float(Cy - Myo)
        ### rotation ###
        Cxor2 = float(cos*Cxo + sin*Cyo)
        Cyor2 = float(-sin*Cxo + cos*Cyo)

        if Cyor2 > 0:
            #rotate Mx and translate to new position
            Mx2temp = float(cos*Mxo + sin*Myo)
            My2temp = float(-sin*Mxo + cos*Myo)
            My2tempdist = float(My2temp + E)
            #de-rotate Mx to give the final Mx and My values
            Mx = float(cos*Mx2temp - sin*My2tempdist)
            My = float(sin*Mx2temp + cos*My2tempdist)

            Cxo = float(Cx - Mx)
            Cyo = float(Cy - My)

            Cxor = float(cos*Cxo + sin*Cyo)
            Cyor = float(-sin*Cxo + cos*Cyo)

        elif Cyor2 < 0:
            #rotate Mx and translate to new position
            Mx2temp = float(-cos*Mxo - sin*Myo)
            My2temp = float(sin*Mxo - cos*Myo)
            My2tempdist = float(My2temp + E)
            #de-rotate Mx to give the final Mx and My values
            Mx = float(-cos*Mx2temp + sin*My2tempdist)
            My = float(-sin*Mx2temp - cos*My2tempdist)

            Cxo = float(Cx - Mx)
            Cyo = float(Cy - My)

            Cxor = float(-cos*Cxo - sin*Cyo)
            Cyor = float(sin*Cxo - cos*Cyo)

    #for negative slope sectioncuts, rotation clockwise
    elif (Ax < Bx and Ay > By) or (Bx < Ax and By > Ay):
        cos = abs(float((Bx - Ax) / Base))
        sin = abs(float((By - Ay) / Base))

        ### where the SECTION CUT is LOOKING at ###
        ### translation ###
        Cxo = float(Cx - Mxo)
        Cyo = float(Cy - Myo)
        ### rotation ###
        Cxor2 = float(cos*Cxo - sin*Cyo)
        Cyor2 = float(sin*Cxo + cos*Cyo)

        if Cyor2 > 0:
            #rotate Mx and translate to new position
            Mx2temp = float(cos*Mxo - sin*Myo)
            My2temp = float(sin*Mxo + cos*Myo)
            My2tempdist = float(My2temp + E)
            #de-rotate Mx to give the final Mx and My values
            Mx = float(cos*Mx2temp + sin*My2tempdist)
            My = float(-sin*Mx2temp + cos*My2tempdist)

            Cxo = float(Cx - Mx)
            Cyo = float(Cy - My)

            Cxor = float(cos*Cxo - sin*Cyo)
            Cyor = float(sin*Cxo + cos*Cyo)

        elif Cyor2 < 0:
            #rotate Mx and translate to new position
            Mx2temp = float(-cos*Mxo + sin*Myo)
            My2temp = float(-sin*Mxo - cos*Myo)
            My2tempdist = float(My2temp + E)
            #de-rotate Mx to give the final Mx and My values
            Mx = float(-cos*Mx2temp - sin*My2tempdist)
            My = float(sin*Mx2temp - cos*My2tempdist)

            Cxo = float(Cx - Mx)
            Cyo = float(Cy - My)

            Cxor = float(-cos*Cxo + sin*Cyo)
            Cyor = float(-sin*Cxo - cos*Cyo)

    PathToFolder_Clouds = str(PathToFolder_Main + "\\" + Name)
    #PathToFolder_Cuts = str(PathToFolder_Clouds + "\\Cuts")

    if not os.path.exists(PathToFolder_Clouds):
        os.mkdir(PathToFolder_Clouds)

    SectionCut_Temp_Create = PathToFolder_Clouds + "\\AllAcceptedPoints_Temporary.ply"
    SectionCut_Temp = open(SectionCut_Temp_Create, "a")

    SectionCut_CreateOrig = PathToFolder_Clouds + "\\SectionCutOriginal_Temporary.ply"
    SectionCut_Original = open(SectionCut_CreateOrig, "a")

    SectionCut_Create = PathToFolder_Clouds + "\\SectionCut_Temporary.ply"
    SectionCut_File = open(SectionCut_Create, "a")

    for doc in os.listdir(PathToFolder_Main):
        if doc.endswith(".ply"):
            #print(doc)
            RawData_Original = str(doc)
            RawData_File = open(PathToFolder_Main + "\\" + RawData_Original, "r")

            ###read specific columns for each and every line
            Header = True
            for Point in RawData_File:
                if Header == True and Point == "end_header\n":
                    Header = False
                    continue
                elif Header == True and Point != "end_header\n":
                    continue

                PointColumns = Point.split(" ")
                #print PointColumns
                Dx = float(PointColumns[0])
                Dy = float(PointColumns[1])
                ###Translation###
                Dxo = float(Dx - Mx)
                Dyo = float(Dy - My)
                ### rotation ###
                if (Ax < Bx and Ay < By) or (Bx < Ax and By < Ay):
                    if Cyor2 > 0:
                        #Cxor=Cxor2
                        #Cyor=Cyor2
                        Dxor = float(cos*Dxo + sin*Dyo)
                        Dyor = float(-sin*Dxo + cos*Dyo)
                        if Cyor > 0 and 0.030000 > Dyor >= 0.000000 and Dyor > 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Original.write(Point)
                            #SectionCut_Temp.write(str(FinalPoint))
                            SectionCut_File.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor < 0 and -0.030000 < Dyor <= 0.000000 and Dyor < 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Original.write(Point)
                            #SectionCut_Temp.write(str(FinalPoint))
                            SectionCut_File.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor > 0 and Dyor > 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Temp.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor < 0 and Dyor < 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Temp.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)
                    elif Cyor2 < 0:
                        #Cxor=Cxor2
                        #Cyor=Cyor2
                        ### rotation ###
                        Dxor = float(-cos*Dxo - sin*Dyo)
                        Dyor = float(sin*Dxo - cos*Dyo)
                        if Cyor > 0 and 0.030000 > Dyor >= 0.000000 and Dyor > 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Original.write(Point)
                            #SectionCut_Temp.write(str(FinalPoint))
                            SectionCut_File.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor < 0 and -0.030000 < Dyor <= 0.000000 and Dyor < 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Original.write(Point)
                            #SectionCut_Temp.write(str(FinalPoint))
                            SectionCut_File.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor > 0 and Dyor > 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Temp.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor < 0 and Dyor < 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Temp.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                elif (Ax < Bx and Ay > By) or (Bx < Ax and By > Ay):
                    if Cyor2 > 0:
                        #Cxor=Cxor2
                        #Cyor=Cyor2
                        Dxor = float(cos*Dxo - sin*Dyo)
                        Dyor = float(sin*Dxo + cos*Dyo)
                        if Cyor > 0 and 0.030000 > Dyor >= 0.000000 and Dyor > 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Original.write(Point)
                            #SectionCut_Temp.write(str(FinalPoint))
                            SectionCut_File.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor < 0 and -0.030000 < Dyor <= 0.000000 and Dyor < 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Original.write(Point)
                            #SectionCut_Temp.write(str(FinalPoint))
                            SectionCut_File.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor > 0 and Dyor > 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Temp.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor < 0 and Dyor < 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Temp.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)
                    elif Cyor2 < 0:
                        #Cxor=Cxor2
                        #Cyor=Cyor2
                        ### rotation ###
                        Dxor = float(-cos*Dxo + sin*Dyo)
                        Dyor = float(-sin*Dxo - cos*Dyo)
                        if Cyor > 0 and 0.030000 > Dyor >= 0.000000 and Dyor > 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Original.write(Point)
                            #SectionCut_Temp.write(str(FinalPoint))
                            SectionCut_File.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor < 0 and -0.030000 < Dyor <= 0.000000 and Dyor < 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Original.write(Point)
                            #SectionCut_Temp.write(str(FinalPoint))
                            SectionCut_File.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor > 0 and Dyor > 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Temp.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)

                        elif Cyor < 0 and Dyor < 0:
                            PointColumns[0] = str(Dxor)
                            PointColumns[1] = str(Dyor)
                            FinalPoint = " ".join(PointColumns)

                            SectionCut_Temp.write(str(FinalPoint))
                            #print "IF%s" % str(FinalPoint)
            RawData_File.close()
    SectionCut_Temp.close()
    SectionCut_Original.close()
    SectionCut_File.close()

    #open many 10m interval files
    SectionCut_Temp = open(SectionCut_Temp_Create, "r")

    PathToFolder_CloudsTemporary = PathToFolder_Clouds + "\\" + "Temporary"
    if not os.path.exists(PathToFolder_Clouds + "\\" + "Temporary"):
        os.mkdir(PathToFolder_Clouds + "\\" + "Temporary")
    TemporaryCreate = PathToFolder_CloudsTemporary + "\\" + "temporary.txt"
    TemporaryOpen = open(PathToFolder_CloudsTemporary + "\\" + "temporary.txt", "w")
    TemporaryOpen.close()
    #filelist = []
    for temps in SectionCut_Temp:
        tempscolumns = temps.split(" ")
        Y = float(tempscolumns[1])

        Y10 = float(Y / 10)
        Y10int = int(Y10)
        Y10intmult = int(Y10int * 10)
        Lowint = int(Y10intmult)

        LowValueStr = str(Lowint)
        LowValueStr2 = str(LowValueStr + "_Temporary.ply")
        SectionsIntervals = PathToFolder_Clouds + "\\%s\n" % (LowValueStr2)
        TemporaryOpen = open(PathToFolder_CloudsTemporary + "\\" + "temporary.txt", "r+")
        if SectionsIntervals not in TemporaryOpen:
            TemporaryOpen.write(SectionsIntervals)
        TemporaryOpen.close()
    #print filelist

    # openfiles = {}
    # for item in filelist:
    #     filename = item.split("\\")[-1].split(".")[0]
    #     openfilename = open(item, "w")
    #     openfiles[filename] = openfilename
    SectionCut_Temp.close()
    #TemporaryOpen.close()

    Path_To_TemporaryFolderClouds = PathToFolder_CloudsTemporary + "\\" + "TemporaryClouds"
    #for ply in txt open and sort points
    TemporaryOpen = open(PathToFolder_CloudsTemporary + "\\" + "temporary.txt", "r")
    for item in TemporaryOpen:
        itemname = item.split("\n")[0]
        itemopen = open(itemname, "a")
        #print(itemopen)
        if not os.path.exists(Path_To_TemporaryFolderClouds):
            os.mkdir(PathToFolder_CloudsTemporary + "\\" + "TemporaryClouds")
            SectionCut_Temp = open(SectionCut_Temp_Create, "r")
            Temporary_Create = Path_To_TemporaryFolderClouds + "\\" + "temporarypoints.ply"
            Temporary_Open = open(Path_To_TemporaryFolderClouds + "\\" + "temporarypoints.ply", "a")
            # SectionCut_Create = PathToFolder_Clouds + "\\SectionCut_Temporary.ply"
            # SectionCut_File = open(SectionCut_Create,"a")
            for Point in SectionCut_Temp:
                PointColumns = Point.split(" ")
                Dyor = float(PointColumns[1])

                #name of the file where to save
                Dyor10 = Dyor / 10
                Dyi10int = int(Dyor10)
                Dyi10intmult = (Dyi10int * 10)
                Lowint = int(Dyi10intmult)

                #to know if to write
                itemmathsplit = item.split("\\")[-1].split("_")[0]
                itemmath = int(itemmathsplit)
                #itemmath10 = itemmath*10
                #itemmath10int = int(itemmath10)

                # if Cyor > 0 and 0.030000 > Dyor >= 0.000000 and Dyor > 0:
                #     SectionCut_File.write(str(Point))

                # elif Cyor < 0 and -0.030000 < Dyor <= 0.000000 and Dyor < 0:
                #     SectionCut_File.write(str(Point))

                if Cyor > 0 and Dyor > 0 and Lowint == itemmath:
                    itemopen.write(Point)

                elif Cyor < 0 and Dyor < 0 and Lowint == itemmath:
                    itemopen.write(Point)

                else:
                    Temporary_Open.write(Point)
            SectionCut_File.close()
            itemopen.close()
            SectionCut_Temp.close()
            Temporary_Open.close()

        elif os.path.exists(Path_To_TemporaryFolderClouds):
            Temporary_Open = open(Path_To_TemporaryFolderClouds + "\\" + "temporarypoints.ply", "r")
            Temporary_Secondary = open(Path_To_TemporaryFolderClouds + "\\" + "temporarysecondary.ply", "a")
            for Point in Temporary_Open:
                PointColumns = Point.split(" ")
                Dyor = float(PointColumns[1])

                #name of the file where to save
                Dyor10 = Dyor / 10
                Dyi10int = int(Dyor10)
                Dyi10intmult = (Dyi10int * 10)
                Lowint = int(Dyi10intmult)

                #to know if to write
                itemmathsplit = item.split("\\")[-1].split("_")[0]
                itemmath = int(itemmathsplit)
                #itemmath10 = itemmath*10
                #itemmath10int = int(itemmath10)

                if Cyor > 0 and Dyor > 0 and Lowint == itemmath:
                    itemopen.write(Point)

                elif Cyor < 0 and Dyor < 0 and Lowint == itemmath:
                    itemopen.write(Point)

                else:
                    Temporary_Secondary.write(Point)
            itemopen.close()
            Temporary_Open.close()
            Temporary_Secondary.close()

            #write temporary
            os.remove(Path_To_TemporaryFolderClouds + "\\" + "temporarypoints.ply")
            Temporary_Open = open(Path_To_TemporaryFolderClouds + "\\" + "temporarypoints.ply", "a")
            Temporary_Secondary = open(Path_To_TemporaryFolderClouds + "\\" + "temporarysecondary.ply", "r")
            for temporaries in Temporary_Secondary:
                Temporary_Open.write(temporaries)
            Temporary_Open.close()
            Temporary_Secondary.close()
            os.remove(Path_To_TemporaryFolderClouds + "\\" + "temporarysecondary.ply")

    for plys in os.listdir(PathToFolder_Clouds):
        if plys.endswith(".ply"):
            Accepted_Temporary = str(plys)
            #print(Accepted_Temporary)
            Accepted_Temporary_Open = open(PathToFolder_Clouds + "\\" + Accepted_Temporary, "r")
            count_n_points = 0
            for counts in Accepted_Temporary_Open:
                count_n_points += 1
            Accepted_Temporary_Open.close()
            count_n_points_string = str(count_n_points)

            Final_Create = PathToFolder_Clouds + "\\" + Accepted_Temporary.split("_")[0] + ".ply"
            Final_Open = open(Final_Create, "a")

            for documents in os.listdir(PathToFolder_Main):
                if documents.endswith(".ply"):
                    HeaderToCopy = str(documents)
                    HeaderToCopy_Open = open(PathToFolder_Main + "\\" + HeaderToCopy, "r")
                    Header = True
                    for headers in HeaderToCopy_Open:
                        if Header == True and headers.__contains__("element vertex"):
                            elementvertex = str("element vertex %s\n") % (count_n_points_string)
                            Final_Open.write(str(elementvertex))

                        elif Header == True and headers == "end_header\n":
                            Final_Open.write(str(headers))
                            break

                        elif Header == True and headers != "end_header\n":
                            Final_Open.write(str(headers))
                    HeaderToCopy_Open.close()
                    break

            Accepted_Temporary_Open = open(PathToFolder_Clouds + "\\" + Accepted_Temporary, "r")
            for accepts in Accepted_Temporary_Open:
                Final_Open.write(accepts)
            Final_Open.close()
            Accepted_Temporary_Open.close()
            os.remove(PathToFolder_Clouds + "\\" + plys)
    end = time()
    minilog = "%s Plan finished. Took %f seconds\n" % (str(Name), end - start)
    Log.write(minilog)

program_end = time()
biglog = "\n**FINISHED MAKING CUTS**. %f seconds" % (program_end - program_start)
Log.write(biglog)
Log.close()
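The core geometric operation of the script above is a 2D translation of each point to the section's middle point followed by a rotation that aligns the cutting line with the x-axis; a point belongs to the section cut when its rotated y-coordinate falls within the 3 cm band in front of the cutting plane. An illustrative Python 3 sketch of that transform, using a hypothetical `to_section_frame` helper (counter-clockwise rotation only, one of the four sign cases handled by the script):

```python
import math

def to_section_frame(px, py, mx, my, cos_a, sin_a):
    # Translate to the section midpoint, then rotate so the cutting
    # line becomes the x-axis (counter-clockwise rotation matrix).
    dx, dy = px - mx, py - my
    return cos_a * dx + sin_a * dy, -sin_a * dx + cos_a * dy

# Section line at 45 degrees through the origin: a point 2.8 cm off the
# plane lands inside the 3 cm acceptance band.
c = s = math.sqrt(2) / 2
x_rot, y_rot = to_section_frame(1.0, 1.04, 0.0, 0.0, c, s)
print(abs(y_rot) < 0.03)  # True: within the 3 cm section band
```

Factoring the transform into one helper like this also shows why the original script repeats the same four write-branches: only the signs of the sine and cosine terms change between the positive-slope and negative-slope cases.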
