Experiments on Calibrating Tilt-Shift Lenses for Close-Range Photogrammetry

16 pages, PDF, 1020 KB

The International Archives of the Photogrammetry, Remote Sensing and Spatial Information Sciences, Volume XLI-B5, 2016
XXIII ISPRS Congress, 12–19 July 2016, Prague, Czech Republic

E. Nocerino a,*, F. Menna a, F. Remondino a, J.-A. Beraldin b, L. Cournoyer b, G. Reain b

a 3D Optical Metrology (3DOM) unit, Bruno Kessler Foundation (FBK), Trento, Italy – Email: (nocerino, fmenna, remondino)@fbk.eu, Web: http://3dom.fbk.eu
b National Research Council, Ottawa, K1A 0R6, Canada – Email: (Angelo.Beraldin, Luc.Cournoyer, Greg.Reain)@nrc-cnrc.gc.ca

Commission V, WG 1

KEY WORDS: Tilt-shift lens, Scheimpflug principle, Close range photogrammetry, Brown model, Pinhole camera, Calibration, Accuracy, Precision, Relative accuracy

ABSTRACT:

One of the strongest limiting factors in close-range photogrammetry (CRP) is the depth of field (DOF), especially at very small object distances. When using standard digital cameras and lenses, the only way to control the extent of the zone of sharp focus in object space for a specific camera-lens combination is to reduce the aperture of the lens. However, this strategy is often not sufficient; moreover, in many cases it is not fully advisable. In fact, when the aperture is closed down, images lose sharpness because of diffraction. Furthermore, the exposure time must be lengthened (increasing susceptibility to vibrations) and the ISO increased (electronic noise may increase). In order to adapt the shape of the DOF to the subject of interest, the Scheimpflug rule is to be applied, which requires that the optical axis no longer be perpendicular to the image plane. Nowadays, specific lenses exist that allow inclining the optical axis to modify the DOF: they are called tilt-shift lenses.
In this paper, an investigation of the applicability of the classic photogrammetric model (the pinhole camera coupled with Brown's distortion model) to these lenses is presented. Tests were carried out in an environmentally controlled metrology laboratory at the National Research Council (NRC) Canada and the results are hereafter described in detail.

1. INTRODUCTION

Close-range photogrammetry (CRP) is a powerful technology because of its flexibility for speedy reconfiguration and the possibility to accomplish very demanding tasks while keeping the costs reasonable. It is an established technology worldwide for 3D recording in industrial/engineering as well as in cultural heritage applications. Nevertheless, when the imaging distance is small (less than 2 m), the depth of field (DOF), i.e. the axial (longitudinal) resolving power of the employed optical system, becomes a strong limiting factor. DOF extends in front of the camera and can be visualized as a prism with two faces parallel to the sensor plane. The portion of the object that falls into this solid will be rendered adequately sharp, or in other words with enough detail in the image.

A way of controlling and adapting the DOF to the scene of interest is to use the so-called camera movements (Ray, 2002), i.e. translational and rotational movements of both the lens and the image plane. Camera movements are possible when using view or technical cameras, which are large-format cameras where the lens, usually mounted on a folding baseboard or a monorail (Figure 1a), projects the image on a ground glass to allow for focusing. Once the focus is set, the ground glass is replaced by the film holder. View cameras are mainly used by professional photographers in the studio or for architectural photography. Even if adapters are now available for digital single-lens reflex (DSLR) cameras (Figure 1b), these systems are not very suitable for photogrammetry due to their low flexibility and portability. A more practicable solution is represented by tilt-shift lenses (Figure 1c), also called perspective control (PC) lenses, which are available for smaller camera formats (e.g., 35 mm or full frame) and allow for some of the view camera movements. In particular, by rotating the tilt-shift lens by a proper angle, an inclined subject may be fitted within the DOF of the optical system, according to the Scheimpflug rule.

Figure 1. a) Linhof Kardan E view camera (http://www.fotografie-boerse.de/angebote_30_4/). b) Cambo Ultima 35: adapter for DSLR cameras (https://cambouk.wordpress.com/2013/09/26/cambo-special-offer-ultima-35-camera-bundles/). c) DSLR camera (Nikon D750) with a tilt-shift lens (PC-E Micro NIKKOR 45mm f/2.8D ED).

* Corresponding author

This contribution has been peer-reviewed. doi:10.5194/isprsarchives-XLI-B5-99-2016

This paper presents the first results of a wide experimental study conducted in an environmentally controlled metrology laboratory at the National Research Council (NRC) Canada to investigate the metric capability and benefits of tilt-shift lenses. Several experiments evaluated the overall performance of a DSLR camera equipped with a tilt-shift lens, using quality parameters for measurement uncertainties, i.e. accuracy and precision. To rigorously model the image formation when the Scheimpflug principle is employed, the actual angle between the optical axis and the image plane should be included in the mathematical formulation. Neglecting this angle would certainly introduce systematic errors in the photogrammetric measurement process. Nevertheless, if this angle is bounded to a range of ±4 degrees, as in most practical uses, the error introduced might be limited and compensated by the calibration parameters. Such an evaluation has not been done until now; the present investigation therefore aims at evaluating whether the most popular photogrammetric model, i.e. the pinhole camera coupled with Brown's formulation for lens distortion, can still be considered valid for tilt-shift lenses, and to what extent.

The rest of the paper is structured as follows. Section 2 reports the motivations behind the study, along with a discussion of DOF, the Scheimpflug condition and tilt-shift lenses. Section 3 presents the experiments, with particular emphasis on the photographic equipment, the calibration fixture specifically designed for the tests, and the acquisition protocol adopted. Finally (Section 4), the results of the experimental investigation are presented and discussed.

2. MOTIVATION AND BACKGROUND

The aim of the collaborative research between FBK and NRC is to explore the use of the Scheimpflug rule, a well-known principle commonly exploited in "artistic" and professional photography (Merklinger, 1996; Evens, 2008) as well as in active triangulation scanning techniques (Dremel et al., 1986; Rioux et al., 1987; Albers et al., 2015) and particle image velocimetry, PIV (Fournel et al., 2004; Louhichi et al., 2007; Astarita, 2012; Hamrouni et al., 2012), but not yet adequately investigated in CRP.

2.1 DOF and Scheimpflug rule

DOF is the distance between the nearest and farthest object planes still in focus, i.e. the distance between the nearest and farthest objects in a scene that appear acceptably sharp in an image. Sharpness can be defined as the quality by which adequately resolved details and, hence, information content are provided by the image. In the absence of other degrading factors (aberrations, camera shake, etc.), sharpness depends only on correctly focusing the subject of interest. From a physical point of view, a lens can precisely focus at only one distance: there exists only one specific distance at which a point in object space is imaged on the image plane as a sharp point. At any other distance, an object point produces a circular blur spot, which means that the point is defocused. This blur spot is the circle of confusion (COC) (Figure 2). The concept of the COC goes back to analogue photography and to the printing and enlargement (or magnification) of film. The value of the COC, or blur circle, is originally defined according to the finest detail visible to the human eye at a comfortable viewing distance on a printed and magnified photograph. It depends on the camera format and lens used. Typical COC values for different analogue camera formats are provided in Ray (2002). For digital cameras, the COC is usually taken as around 1-3 pixels (Menna et al., 2012; Luhmann et al., 2014).

Figure 2. COC and DOF for a lens focused at finite distance.

According to Ray (2002, page 218), DOF can be computed according to equation (1):

DOF = FarPlane − NearPlane = (2 · f² · D² · F_number · COC) / (f⁴ − F_number² · COC² · D²)    (1)

where:
- FarPlane and NearPlane are the far and near limits of the DOF, respectively;
- f is the lens focal length;
- D is the focusing distance;
- F_number is the lens relative aperture.

For large object distances, equation (1) can be simplified to (Ray, 2002; Luhmann et al., 2014):

DOF ≈ (2 · D² · F_number · COC) / f²    (2)

In this case, it can easily be seen that the DOF increases with the COC, the f-number and the square of the focusing distance, while it decreases as the focal length grows. When working in close-up photogrammetry, which typically corresponds to a magnification range of 0.1-1 and an object distance range of 1000 mm down to 10 mm (Ray, 2002), the DOF is very limited.

In an ordinary camera setup, the points in space that are in focus ideally lie in a plane parallel to the image plane and, once the focal length is decided, the desired DOF is generally achieved by changing the f-stop value. A smaller aperture yields a greater DOF but may lead to diffraction, which in turn causes a loss of image detail (Luhmann et al., 2014). In CRP, the objects to be surveyed are usually three-dimensional and of varying depth, but a perfectly sharp image is obtained only for the part of the object that is within the DOF. According to the Scheimpflug rule, by tilting the lens so that its axis is no longer perpendicular to the plane of the sensitive medium, film or digital sensor (Figure 3), the
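Equations (1) and (2) can be checked numerically. The sketch below is illustrative only: the 45 mm / f/8 / 1 m / 0.012 mm inputs are assumed example values, not figures from the paper.

```python
# Numeric check of DOF equations (1) and (2); symbols follow the paper's.
# All lengths are in millimetres.

def dof_exact(f, d, f_number, coc):
    """Equation (1): DOF = 2*f^2*D^2*F*COC / (f^4 - F^2*COC^2*D^2)."""
    return (2 * f**2 * d**2 * f_number * coc) / (f**4 - f_number**2 * coc**2 * d**2)

def dof_approx(f, d, f_number, coc):
    """Equation (2), the large-distance simplification: 2*D^2*F*COC / f^2."""
    return 2 * d**2 * f_number * coc / f**2

# Example: 45 mm lens at f/8 focused at 1 m, COC of 0.012 mm (~2 pixels of
# ~6 um each); at this distance the two equations already agree closely.
print(dof_exact(45.0, 1000.0, 8.0, 0.012))
print(dof_approx(45.0, 1000.0, 8.0, 0.012))
```

At closer distances and wider apertures the denominator of equation (1) matters more and the two results diverge.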
Recommended publications
  • “Digital Single Lens Reflex”
    PHOTOGRAPHY GENERIC ELECTIVE SEM-II

    DSLR stands for "Digital Single Lens Reflex". In simple language, a DSLR is a digital camera that uses a mirror mechanism either to reflect light from the camera lens to an optical viewfinder (an eyepiece on the back of the camera that one looks through to see what one is photographing) or to let light pass fully onto the image sensor (which captures the image) by moving the mirror out of the way. Although single-lens reflex cameras have been available in various shapes and forms since the 19th century with film as the recording medium, the first commercial digital SLR with an image sensor appeared in 1991. Compared to point-and-shoot and phone cameras, DSLR cameras typically use interchangeable lenses. Take a look at the following image of an SLR cross section (image courtesy of Wikipedia): When you look through a DSLR viewfinder / eyepiece on the back of the camera, whatever you see is passed through the lens attached to the camera, which means that you could be looking at exactly what you are going to capture. Light from the scene you are attempting to capture passes through the lens onto a reflex mirror (#2) that sits at a 45-degree angle inside the camera chamber, which then forwards the light vertically to an optical element called a "pentaprism" (#7). The pentaprism then converts the vertical light to horizontal by redirecting it through two separate mirrors, right into the viewfinder (#8). When you take a picture, the reflex mirror (#2) swings upwards, blocking the vertical pathway and letting the light pass directly through to the image sensor.
    [Show full text]
  • Depth of Field
    Lenses form images of objects a predictable distance away from the lens. The distance from the image to the lens is the image distance. Image distance depends on the object distance (distance from object to the lens) and the focal length of the lens. Figure 1 shows how the image distance depends on object distance for lenses with focal lengths of 35 mm and 200 mm.

    Figure 1: Dependence of Image Distance Upon Object Distance

    Cameras use lenses to focus the images of objects upon the film or exposure medium. Objects within a photographic scene are usually a varying distance from the lens. Because a lens is capable of precisely focusing objects at only a single distance, some objects will be precisely focused while others will be out of focus and even blurred. Skilled photographers strive to maximize the depth of field within their photographs. Depth of field refers to the distance between the nearest and the farthest objects within a photographic scene that are acceptably focused. Figure 2 is an example of a photograph with a shallow depth of field. One variable that affects depth of field is the f-number. The f-number is the ratio of the focal length to the diameter of the aperture. The aperture is the circular opening through which light travels before reaching the lens. Table 1 shows the dependence of the depth of field (DOF) upon the f-number of a digital camera.

    Table 1: Dependence of Depth of Field Upon f-Number and Camera Lens

                 35-mm Camera Lens            200-mm Camera Lens
    f-Number     DN (m)  DF (m)    DOF (m)    DN (m)  DF (m)  DOF (m)
    2.8          4.11    6.39      2.29       4.97    5.03    0.06
    4.0          3.82    7.23      3.39       4.95    5.05    0.10
    5.6          3.48    8.86      5.38       4.94    5.07    0.13
    8.0          3.09    13.02     9.93       4.91    5.09    0.18
    22.0         1.82    Infinity  Infinite   4.775   5.27    0.52

    The DN value represents the nearest object distance that is acceptably focused.
    [Show full text]
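The near and far limits in Table 1 follow from the standard hyperfocal-distance formulas. The sketch below reproduces the same kind of numbers; the circle-of-confusion value (0.02 mm) is an assumption for illustration, so the results land near, but not exactly on, the table's rows.

```python
# Thin-lens near/far limits of acceptable sharpness via the hyperfocal
# distance H = f^2/(N*c) + f. All lengths in millimetres.

def dof_limits(f_mm, f_number, coc_mm, subject_mm):
    """Return (near, far) limits; far is infinite beyond the hyperfocal point."""
    h = f_mm**2 / (f_number * coc_mm) + f_mm   # hyperfocal distance
    d = subject_mm - f_mm
    near = h * subject_mm / (h + d)
    far = h * subject_mm / (h - d) if h > d else float("inf")
    return near, far

# 35 mm lens at f/2.8 focused at 5 m: same ballpark as Table 1's first row
# (4.11 m / 6.39 m); the exact numbers depend on the COC assumed.
near, far = dof_limits(35.0, 2.8, 0.02, 5000.0)
print(round(near), round(far))
```

At f/22 the subject distance exceeds the hyperfocal distance and the far limit goes to infinity, matching the table's last row.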
  • What's a Megapixel Lens and Why Would You Need One?
    Theia Technologies white paper

    What's a Megapixel Lens and Why Would You Need One? (Also translated in Russian: Что такое мегапиксельный объектив и для чего он нужен)

    It's an exciting time in the security department. You've finally received approval to migrate from your installed analog cameras to new megapixel models, and expectations are high. As you get together with your integrator, you start selecting the cameras you plan to install, looking forward to getting higher quality, higher resolution images and, in many cases, covering the same amount of ground with one camera that would otherwise have taken several analog models. You're covering the basics of those megapixel cameras, including the housing and mounting hardware. What about the lens? You've just opened up Pandora's box.

    A Lens Is Not a Lens Is Not a Lens. The lens needed for an IP/megapixel camera is much different from the lens needed for a traditional analog camera. These higher resolution cameras demand higher quality lenses. For instance, in a megapixel camera, the focal plane spot size of the lens must be comparable to or smaller than the pixel size on the sensor (Figures 1 and 2). To achieve this, more elements, and higher precision elements, are required for megapixel camera lenses, which can make them more costly than their analog counterparts.

    Figure 1: Spot size of a megapixel lens is much smaller, as required for good focus on megapixel sensors. Figure 2: Spot size of a standard lens won't allow sharp focus on a megapixel sensor.
    [Show full text]
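The spot-size requirement above reduces to simple arithmetic: the pixel pitch is the sensor width divided by the horizontal pixel count, and the lens spot must not exceed it. The sensor numbers below are hypothetical examples, not taken from the white paper.

```python
# Back-of-the-envelope check of the spot-size argument.

def pixel_pitch_um(sensor_width_mm, horizontal_pixels):
    """Pixel pitch in micrometres: sensor width divided by pixel count."""
    return sensor_width_mm * 1000.0 / horizontal_pixels

def lens_resolves(spot_size_um, sensor_width_mm, horizontal_pixels):
    """True if the lens focal-plane spot is no larger than one pixel."""
    return spot_size_um <= pixel_pitch_um(sensor_width_mm, horizontal_pixels)

# Hypothetical sensor ~7.2 mm wide with 2048 pixels across: ~3.5 um pitch,
# so a lens with a 5 um spot cannot resolve single pixels on it.
print(pixel_pitch_um(7.2, 2048))
print(lens_resolves(5.0, 7.2, 2048))
```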
  • EVERYDAY MAGIC Bokeh
    EVERYDAY MAGIC Bokeh

    "Our goal should be to perceive the extraordinary in the ordinary, and when we get good enough, to live vice versa, in the ordinary extraordinary." ~ Eric Booth

    Welcome to Lesson Two of Everyday Magic. In this week's lesson we are going to dig deep into those magical little orbs of light in a photograph known as bokeh. Pronounced BOH-kə (or BOH-kay), the word "bokeh" is an English translation of the Japanese word boke, which means "blur" or "haze".

    What is Bokeh? Photographically speaking, bokeh is defined as the aesthetic quality of the blur produced by the camera lens in the out-of-focus parts of an image. But what makes this unique visual experience seem magical is the fact that we are not able to 'see' bokeh with our superior human vision and excellent depth of field. Bokeh is totally a function of 'seeing' through the lens. And it is the camera lens, and how it renders the out-of-focus light in the background, that gives bokeh its more or less circular appearance.

    Bokeh and Depth of Field. The key to achieving beautiful bokeh in your images is shooting with a shallow depth of field (DOF), which is the amount of an image that appears acceptably sharp.

    Playing with Focal Distance. In addition to a shallow depth of field, the bokeh in an image is also determined by 1) the distance between the subject and the background and 2) the distance between the lens and the subject. Depending on how you compose your image, the bokeh can be smooth and 'creamy' in appearance or it can be livelier and more energetic.
    [Show full text]
  • Using Depth Mapping to Realize Bokeh Effect with a Single Camera Android Device EE368 Project Report Authors (SCPD Students): Jie Gong, Ran Liu, Pradeep Vukkadala
    Using Depth Mapping to realize Bokeh effect with a single camera Android device
    EE368 Project Report
    Authors (SCPD students): Jie Gong, Ran Liu, Pradeep Vukkadala

    Abstract - In this paper we seek to produce a bokeh effect with a single image taken from an Android device by post-processing. Depth mapping is the core of Bokeh effect production. A depth map is an estimate of depth at each pixel in the photo, which can be used to identify portions of the image that are far away and belong to the background, and therefore apply a digital blur to the background. We present algorithms to determine the defocus map from a single input image. We obtain a sparse defocus map by calculating the ratio of gradients from the original image and a reblurred image. Then, the full defocus map is obtained by propagating values from edges to the entire image using a nearest-neighbor method and matting Laplacian.

    The Bokeh effect is usually achieved in high-end SLR cameras using portrait lenses that are relatively large in size and have a shallow depth of field. It is extremely difficult to achieve the same effect (physically) in smartphones, which have miniaturized camera lenses and sensors. However, the latest iPhone 7 has a portrait mode which can produce the Bokeh effect thanks to its dual-camera configuration. To compete with the iPhone 7, Google recently announced that the latest Google Pixel phone can also take photos with the Bokeh effect, achieved by taking two photos at different depths to the camera and combining them via software. There is a gap in that neither of the two biggest players can achieve the Bokeh effect using only a single camera.
    [Show full text]
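The gradient-ratio step described in the abstract above can be sketched with NumPy. This is an illustrative reconstruction, not the authors' code: under a Gaussian step-edge model, re-blurring with a known sigma0 scales edge gradients by R = sqrt(sigma² + sigma0²)/sigma, so the unknown blur is sigma = sigma0/sqrt(R² − 1).

```python
import numpy as np

def gaussian_blur(img, sigma):
    """Separable Gaussian blur with a kernel truncated at 3*sigma."""
    r = max(1, int(3 * sigma))
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2.0 * sigma**2))
    k /= k.sum()
    pad = np.pad(img, r, mode="reflect")
    tmp = np.apply_along_axis(np.convolve, 1, pad, k, mode="valid")
    return np.apply_along_axis(np.convolve, 0, tmp, k, mode="valid")

def sparse_defocus_map(img, sigma0=1.0, grad_thresh=0.01):
    """Estimate per-pixel defocus blur at edges from the ratio of gradient
    magnitudes between the input and a copy re-blurred with sigma0.
    Non-edge pixels are left as NaN (to be filled by propagation)."""
    reblur = gaussian_blur(img, sigma0)
    gy, gx = np.gradient(img)
    gyb, gxb = np.gradient(reblur)
    g = np.hypot(gx, gy)
    gb = np.hypot(gxb, gyb)
    sigma = np.full(img.shape, np.nan)
    mask = (g > grad_thresh) & (gb > 0) & (g > gb)
    ratio = g[mask] / gb[mask]
    sigma[mask] = sigma0 / np.sqrt(ratio**2 - 1.0)
    return sigma

# Synthetic check: a step edge blurred with sigma = 2 should be estimated
# as roughly 2 near the edge.
step = np.zeros((64, 64))
step[:, 32:] = 1.0
blurred = gaussian_blur(step, 2.0)
est = float(np.nanmedian(sparse_defocus_map(blurred)[:, 31:33]))
print(round(est, 2))
```

The paper's remaining steps, propagating these sparse edge estimates over the whole image, would then operate on the NaN-filled map.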
  • Aperture Efficiency and Wide Field-Of-View Optical Systems Mark R
    Aperture Efficiency and Wide Field-of-View Optical Systems Mark R. Ackermann, Sandia National Laboratories Rex R. Kiziah, USAF Academy John T. McGraw and Peter C. Zimmer, J.T. McGraw & Associates Abstract Wide field-of-view optical systems are currently finding significant use for applications ranging from exoplanet search to space situational awareness. Systems ranging from small camera lenses to the 8.4-meter Large Synoptic Survey Telescope are designed to image large areas of the sky with increased search rate and scientific utility. An interesting issue with wide-field systems is the known compromises in aperture efficiency. They either use only a fraction of the available aperture or have optical elements with diameters larger than the optical aperture of the system. In either case, the complete aperture of the largest optical component is not fully utilized for any given field point within an image. System costs are driven by optical diameter (not aperture), focal length, optical complexity, and field-of-view. It is important to understand the optical design trade space and how cost, performance, and physical characteristics are influenced by various observing requirements. This paper examines the aperture efficiency of refracting and reflecting systems with one, two and three mirrors. Copyright © 2018 Advanced Maui Optical and Space Surveillance Technologies Conference (AMOS) – www.amostech.com Introduction Optical systems exhibit an apparent falloff of image intensity from the center to edge of the image field as seen in Figure 1. This phenomenon, known as vignetting, results when the entrance pupil is viewed from an off-axis point in either object or image space.
    [Show full text]
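One classical component of the center-to-edge falloff mentioned above is the cos⁴ law for natural vignetting; a minimal sketch follows (mechanical and optical vignetting, which the paper also discusses, add further design-dependent losses on top of this).

```python
import math

def natural_falloff(field_angle_deg):
    """Relative illumination at a given field angle under the cos^4 law."""
    return math.cos(math.radians(field_angle_deg)) ** 4

# Illumination drops quickly with field angle even before any mechanical
# vignetting: ~94% at 10 degrees, ~56% at 30 degrees, 25% at 45 degrees.
for angle in (0, 10, 20, 30, 45):
    print(angle, round(natural_falloff(angle), 3))
```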
  • 12 Considerations for Thermal Infrared Camera Lens Selection Overview
    12 CONSIDERATIONS FOR THERMAL INFRARED CAMERA LENS SELECTION

    OVERVIEW

    When developing a solution that requires a thermal imager, or infrared (IR) camera, engineers, procurement agents, and program managers must consider many factors. Application, waveband, minimum resolution, pixel size, protective housing, and ability to scale production are just a few. One element that will impact many of these decisions is the IR camera lens. USE THIS GUIDE AS A REFRESHER ON HOW TO GO ABOUT SELECTING THE OPTIMAL LENS FOR YOUR THERMAL IMAGING SOLUTION.

    1. Waveband determines lens materials.
    2. The image size must have a diameter equal to, or larger than, the diagonal of the array.
    3. The lens should be mounted to account for the back working distance and to create an image at the focal plane array location.
    4. As the Effective Focal Length or EFL increases, the field of view (FOV) narrows.
    5. The lower the f-number of a lens, the larger the optics will be, which means more energy is transferred to the array.
    8. The transmission value of an IR camera lens is the level of energy that passes through the lens over the designed waveband.
    9. Passive athermalization is most desirable for small systems where size and weight are a factor, while active athermalization makes more sense for larger systems where a motor will weigh and cost less than adding the optical elements for passive athermalization.
    10. Proper mounting is critical to ensure that the lens is in position for optimal performance.
    11. There are three primary phases of production (engineering, manufacturing, and testing) that can
    [Show full text]
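Consideration 4 (EFL vs. FOV) in numbers: the full field of view along one array dimension is 2·atan(d / (2·EFL)). The 640 × 512, 17 µm array below is a hypothetical example, not one named in the guide.

```python
import math

def fov_deg(array_dim_mm, efl_mm):
    """Full angular field of view for one array dimension, in degrees."""
    return 2.0 * math.degrees(math.atan(array_dim_mm / (2.0 * efl_mm)))

# Hypothetical 640 x 512 array at 17 um pitch: 10.88 mm horizontal size.
h_mm = 640 * 0.017
for efl in (13.0, 25.0, 50.0):
    print(efl, round(fov_deg(h_mm, efl), 1))  # FOV narrows as EFL grows
```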
  • AG-AF100 28Mm Wide Lens
    Contents

    1. What changes when you use a different imager size camera?
       1. What happens?
       2. Focal Length
       3. Iris (F Stop)
       4. Flange Back Adjustment
    2. Why does Bokeh occur?
       1. F Stop
       2. Circle of confusion diameter limit
       3. Airy Disc
       4. Bokeh by Diffraction
       5. 1/3" lens Response (Example)
       6. What does In/Out of Focus mean?
       7. Depth of Field
       8. How to use Bokeh to shoot impressive pictures
       9. Note for AF100 shooting
    3. Crop Factor
       1. How to use Crop Factor
       2. Focal Length and Depth of Field by Imager Size
       3. What is the benefit of a large sensor?
    4. Appendix
       1. Size of Imagers
       2. Color Separation Filter
       3. Sensitivity Comparison
       4. ASA Sensitivity
       5. Depth of Field Comparison by Imager Size
       6. F Stop to get the same Depth of Field
       7. Back Focus and Flange Back (Flange Focal Distance)
       8. Distance Error by Flange Back Error
       9. View Angle Formula
       10. Conceptual Schema: Relationship between Iris and Resolution
       11. What's the difference between a Video Camera Lens and a Still Camera Lens
       12. Depth of Field Formula

    1. What changes when you use a different imager size camera?
    1. Focal Length changes: a 28 mm wide lens on the AG-AF100 becomes equivalent to the 58 mm standard lens of a 35 mm full-frame camera (CANON, NIKON, LEICA etc.).
    2. Iris (F Stop) changes, and with it the Depth of Field (example: distance to object 2 m, iris F4).
    [Diagram: depth of field at F2 and F4, subject at 2 m, compared between a 35 mm still camera and a 4/3-inch imager]
    [Show full text]
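The crop-factor arithmetic behind the slides above can be condensed to one helper: matching framing and (approximately) depth of field on full frame means scaling both focal length and f-number by the crop factor. A minimal sketch, assuming a Four Thirds crop factor of about 2:

```python
# Hypothetical helper illustrating crop-factor equivalence; the crop factor
# of ~2 for a 4/3-inch imager is an assumption for this example.

def full_frame_equivalent(focal_mm, f_number, crop_factor):
    """Full-frame focal length and f-number giving the same framing and
    (approximately) the same depth of field."""
    return focal_mm * crop_factor, f_number * crop_factor

# A 28 mm f/2 on a Four Thirds sensor frames and blurs roughly like a
# 56 mm f/4 on 35 mm full frame, near the 58 mm "standard lens" class.
print(full_frame_equivalent(28.0, 2.0, 2.0))
```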
  • Photography Techniques Intermediate Skills
    Photography Techniques Intermediate Skills

    PDF generated using the open source mwlib toolkit. See http://code.pediapress.com/ for more information. PDF generated at: Wed, 21 Aug 2013 16:20:56 UTC

    Contents

    Articles
      Bokeh 1
      Macro photography 5
      Fill flash 12
      Light painting 12
      Panning (camera) 15
      Star trail 17
      Time-lapse photography 19
      Panoramic photography 27
      Cross processing 33
      Tilted plane focus 34
      Harris shutter 37

    References
      Article Sources and Contributors 38
      Image Sources, Licenses and Contributors 39

    Article Licenses
      License 41

    Bokeh

    In photography, bokeh (originally /ˈboʊkɛ/,[1] /ˈboʊkeɪ/ BOH-kay, also sometimes heard as /ˈboʊkə/ BOH-kə; Japanese: [boke]) is the blur,[2][3] or the aesthetic quality of the blur,[4][5] in out-of-focus areas of an image. Bokeh has been defined as "the way the lens renders out-of-focus points of light".[6] However, differences in lens aberrations and aperture shape cause some lens designs to blur the image in a way that is pleasing to the eye, while others produce blurring that is unpleasant or distracting: "good" and "bad" bokeh, respectively.[2] Bokeh occurs for parts of the scene that lie outside the depth of field. Photographers sometimes deliberately use a shallow focus technique to create images with prominent out-of-focus regions.

    Figure: Coarse bokeh on a photo shot with an 85 mm lens and 70 mm entrance pupil diameter, which corresponds to f/1.2.

    Bokeh is often most visible around small background highlights, such as specular reflections and light sources, which is why it is often associated with such areas.[2] However, bokeh is not limited to highlights; blur occurs in all out-of-focus regions of the image.
    [Show full text]
  • Tools for the Paraxial Optical Design of Light Field Imaging Systems Lois Mignard-Debise
    Tools for the paraxial optical design of light field imaging systems
    Loïs Mignard-Debise

    To cite this version: Loïs Mignard-Debise. Tools for the paraxial optical design of light field imaging systems. Other [cs.OH]. Université de Bordeaux, 2018. English. NNT: 2018BORD0009. tel-01764949

    HAL Id: tel-01764949, https://tel.archives-ouvertes.fr/tel-01764949. Submitted on 12 Apr 2018.

    HAL is a multi-disciplinary open access archive for the deposit and dissemination of scientific research documents, whether they are published or not. The documents may come from teaching and research institutions in France or abroad, or from public or private research centers.

    Thesis presented at the Université de Bordeaux, École Doctorale de Mathématiques et d'Informatique, by Loïs Mignard-Debise, for the degree of Doctor, speciality: Computer Science. French title: "Outils de Conception en Optique Paraxiale pour les Systèmes d'Imagerie Plénoptique". Defended on 5 February 2018 before an examination committee composed of: Hendrik Lensch, Professor, University of Tübingen (Reviewer); Céline Loscos, Professor, Université de Reims (President); Ivo Ihrke, Professor, Inria (Advisor); Patrick Reuter, Associate Professor, Université de Bordeaux (Advisor); Xavier Granier, Professor, Institut d'Optique Graduate School (Director).

    Acknowledgments

    I would like to express my very great appreciation to my research supervisor, Ivo Ihrke, for his great qualities as a mentor. He asked unexpected but pertinent questions during fruitful discussions, encouraged and guided me to be more autonomous in my work, and participated in solving the hardships faced while experimenting in the laboratory.
    [Show full text]
  • HALS Photography Guidelines Define the Photographic Products Acceptable for Inclusion in the HABS/HAER/HALS Collections Within the Library of Congress
    Historic American Landscapes Survey
    Guidelines for Photography

    Prepared for U.S. Department of the Interior, National Park Service, Historic American Buildings Survey / Historic American Engineering Record / Historic American Landscapes Survey. Prepared by Tom Lamb, January 2004. HABS/HAER/HALS Staff (editing & revision, July 2005).

    INTRODUCTION
    1.0 ROLE OF THE PHOTOGRAPHER
      1.1 Photographic Procedures
        1.1.1 Composition
        1.1.2 Lighting
        1.1.3 Focus
        1.1.4 Exposure
        1.1.5 Perspective
      1.2 Photographic Copies
      1.3 Photography of Measured Drawings
      1.4 Property Owners and/or Responsible Agencies
      1.5 HALS Photographic Process
      1.6 Methodology of Landscape Photography
      1.7 Field Journal
      1.8 Photographic Key
      1.9 Identification
      1.10 Captions
      1.11 Site
      1.12 Other Types of Photography often included in HALS
        1.12.1 Historic Photography
        1.12.2 Repeat and Matched
        1.12.3 Copy Work
        1.12.4 Measured Drawings
        1.12.5 Aerial Photography
        1.12.6 Historic Aerials
    2.0 PHOTOGRAPHING THE LANDSCAPE
      2.1 Elements of site context that should be considered for photography
        2.1.1 Geographic location
        2.1.2 Setting
        2.1.3 Natural Systems Context
        2.1.4 Cultural/Political Context
      2.2 Physical conditions on a site may be influenced by elements of time
        2.2.1 Era/Period/Date of Landscape
        2.2.2 Design Context or Period Influences
        2.2.3 Parallel Historic Events
        2.2.4 Parallel Current Events
      2.3 The historic continuum/evolution of a site may include:
        2.3.1 Chronology of Physical Layers
        2.3.2 Periods of Landscape Evolution
        2.3.3 Landscape Style
        2.3.4 Periods of Construction
        2.3.5 Land Use/Land
        2.3.6 Settlement
    [Show full text]
  • A Comprehensive and Versatile Camera Model for Cameras with Tilt Lenses
    Int J Comput Vis (2017) 123:121–159 DOI 10.1007/s11263-016-0964-8 A Comprehensive and Versatile Camera Model for Cameras with Tilt Lenses Carsten Steger1 Received: 14 March 2016 / Accepted: 30 September 2016 / Published online: 22 October 2016 © The Author(s) 2016. This article is published with open access at Springerlink.com Abstract We propose camera models for cameras that are Keywords Camera models · Tilt lenses · Scheimpflug equipped with lenses that can be tilted in an arbitrary direc- optics · Camera model degeneracies · Camera calibration · tion (often called Scheimpflug optics). The proposed models Bias removal · Stereo rectification are comprehensive: they can handle all tilt lens types that are in common use for machine vision and consumer cameras and correctly describe the imaging geometry of lenses for 1 Introduction which the ray angles in object and image space differ, which is true for many lenses. Furthermore, they are versatile since One problem that often occurs when working on machine they can also be used to describe the rectification geometry vision applications that require large magnifications is that of a stereo image pair in which one camera is perspective and the depth of field becomes progressively smaller as the mag- the other camera is telecentric. We also examine the degen- nification increases. Since for regular lenses the depth of field eracies of the models and propose methods to handle the is parallel to the image plane, problems frequently occur if degeneracies. Furthermore, we examine the relation of the objects that are not parallel to the image plane must be imaged proposed camera models to different classes of projective in focus.
    [Show full text]