Mathematical Foundation of Photogrammetry (Part of EE5358)


Mathematical Foundation of Photogrammetry (part of EE5358)
Dr. Venkat Devarajan, Ms. Kriti Chauhan

Photogrammetry

photo = "picture", grammetry = "measurement"; therefore photogrammetry = "photo-measurement".

Photogrammetry is the science or art of obtaining reliable measurements by means of photographs.

Formal definition: Photogrammetry is the art, science and technology of obtaining reliable information about physical objects and the environment through processes of recording, measuring, and interpreting photographic images and patterns of recorded radiant electromagnetic energy and other phenomena.
- As given by the American Society for Photogrammetry and Remote Sensing (ASPRS)

Chapter 1 01/14/19 Virtual Environment Lab, UTA 2

Distinct Areas

Metric photogrammetry:
• making precise measurements from photos to determine the relative locations of points
• finding distances, angles, areas, volumes, elevations, and sizes and shapes of objects
• most common applications: (1) preparation of planimetric and topographic maps, (2) production of digital orthophotos, and (3) military intelligence such as targeting

Interpretative photogrammetry: deals in recognizing and identifying objects and judging their significance through careful and systematic analysis. Its two branches are photographic interpretation and remote sensing (which includes the use of multispectral cameras, infrared cameras, thermal scanners, etc.).

Uses of Photogrammetry

Products of photogrammetry:
1. Topographic maps: detailed and accurate graphic representations of cultural and natural features on the ground.
2. Orthophotos: aerial photographs modified so that their scale is uniform throughout.
3. Digital Elevation Models (DEMs): an array of points in an area whose X, Y and Z coordinates have been determined.

Current applications:
1. Land surveying
2. Highway engineering
3. Preparation of tax maps, soil maps, forest maps, geologic maps, and maps for city and regional planning and zoning
4. Traffic management and traffic accident investigation
5. Military: digital mosaics, mission planning, rehearsal, targeting, etc.

Types of Photographs

Photographs are either aerial or terrestrial. Aerial photographs are either vertical or oblique:
• Vertical: truly vertical, or tilted (1° < angle < 3°)
• Oblique: low oblique (does not include the horizon), or high oblique (includes the horizon)

Of all these types of photographs, vertical and low oblique aerial photographs are of most interest to us, as they are the ones most extensively used for mapping purposes.

Aerial Photography

Vertical aerial photographs are taken along parallel passes called flight strips. The overlap of successive photographs along a flight strip is called end lap (usually 60%). The area of common coverage is called the stereoscopic overlap area, and the two overlapping photos are called a stereopair.

The position of the camera at each exposure is called the exposure station, and the altitude of the camera at the time of exposure is called the flying height. The lateral overlap of adjacent flight strips is called side lap (usually 30%). The photographs of two or more sidelapping strips used to cover an area are referred to as a block of photos.

Now, let's examine the acquisition devices for these photographs...

Pass Points for Aerotriangulation

• Selected as 9 points in a format of 3 rows × 3 columns, equally spaced over the photo.
• The points may be images of natural, well-defined objects that appear in the required photo areas.
• If such points are not available, pass points may be artificially marked.
• Digital image matching can be used to select points in the overlap areas of digital images and automatically match them between adjacent images.
• This is the essential step of "automatic aerotriangulation".

Chapter 17 01/14/19 Virtual Environment Lab, UTA 61

Analytical Aerotriangulation

The most elementary approach consists of the following basic steps:
1. relative orientation of each stereomodel,
2. connection of adjacent models to form continuous strips and/or blocks, and
3. simultaneous adjustment of the photos from the strips and/or blocks to field-surveyed ground control.

X and Y coordinates of pass points can be located to an accuracy of 1/15,000 of the flying height, and Z coordinates to an accuracy of 1/10,000 of the flying height. With specialized equipment and procedures, planimetric accuracy of 1/350,000 of the flying height and vertical accuracy of 1/180,000 have been achieved.

Analytical Aerotriangulation Technique

• Several variations exist. Basically, all methods consist of writing equations that express the unknown elements of exterior orientation of each photo in terms of camera constants, measured photo coordinates, and ground coordinates.
• The equations are solved to determine the unknown orientation parameters, and, either simultaneously or subsequently, the coordinates of the pass points are calculated.
• By far the most common condition equations used are the collinearity equations.
• Analytical procedures like bundle adjustment can simultaneously enforce the collinearity condition on hundreds of photographs.

Simultaneous Bundle Adjustment

Adjusting all photogrammetric measurements to ground control values in a single solution is known as a bundle adjustment. The process is so named because of the many light rays that pass through each lens position, constituting a bundle of rays. The bundles from all photos are adjusted simultaneously so that corresponding light rays intersect at the positions of the pass points and control points on the ground.
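The slides invoke the collinearity equations without writing them out. For reference, their standard form from the general photogrammetric literature (not reproduced on the slides) expresses the photo coordinates (x_a, y_a) of the image of object point A in terms of the focal length f, the principal point (x_o, y_o), the exposure station (X_L, Y_L, Z_L), and the elements m_ij of the rotation matrix built from the photo's attitude angles:

```latex
x_a = x_o - f\,\frac{m_{11}(X_A - X_L) + m_{12}(Y_A - Y_L) + m_{13}(Z_A - Z_L)}
                    {m_{31}(X_A - X_L) + m_{32}(Y_A - Y_L) + m_{33}(Z_A - Z_L)}
\qquad
y_a = y_o - f\,\frac{m_{21}(X_A - X_L) + m_{22}(Y_A - Y_L) + m_{23}(Z_A - Z_L)}
                    {m_{31}(X_A - X_L) + m_{32}(Y_A - Y_L) + m_{33}(Z_A - Z_L)}
```

These equations are nonlinear in the six exterior orientation parameters of each photo (the three attitude angles inside m_ij plus X_L, Y_L, Z_L), which is why they are linearized by first-order Taylor expansion and solved iteratively by least squares.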
After the normal equations have been formed, they are solved for the unknown corrections to the initial approximations of the exterior orientation parameters and object space coordinates. The corrections are then added to the approximations, and the procedure is repeated until the estimated standard deviation of unit weight converges.

Quantities in a Bundle Adjustment

The unknown quantities to be obtained in a bundle adjustment consist of:
1. the X, Y and Z object space coordinates of all object points, and
2. the exterior orientation parameters of all photographs.

The observed (measured) quantities associated with a bundle adjustment are:
1. x and y photo coordinates of the images of object points,
2. X, Y and/or Z coordinates of ground control points, and
3. direct observations of the exterior orientation parameters of the photographs.

The first group of observations, photo coordinates, constitutes the fundamental photogrammetric measurements. The next group is the coordinates of control points determined through field survey. The final set can be obtained from an airborne GPS control system as well as from inertial navigation systems (INSs), which are capable of measuring the angular attitude of a photograph.

Bundle Adjustment on a Photo Block

Consider a small block consisting of 2 strips with 4 photos per strip, with 20 pass points and 6 control points, totaling 26 object points; 6 of those points also serve as tie points connecting the two adjacent strips. Photos 1, 4, 5 and 8 have 8 imaged points each (4 × 8 = 32), and photos 2, 3, 6 and 7 have 11 imaged points each (4 × 11 = 44), for a total of 76 point images. In this case:

The number of unknown object coordinates = no. of object points × no. of coordinates per object point = 26 × 3 = 78
The number of unknown exterior orientation parameters = no. of photos × no. of exterior orientation parameters per photo = 8 × 6 = 48
Total number of unknowns = 78 + 48 = 126

The number of photo coordinate observations = no. of imaged points × no. of photo coordinates per point = 76 × 2 = 152
The number of ground control observations = no. of 3D control points × no. of coordinates per point = 6 × 3 = 18
The number of exterior orientation observations = no. of photos × no. of exterior orientation parameters per photo = 8 × 6 = 48

If all 3 types of observations are included, there is a total of 152 + 18 + 48 = 218 observations; if only the first two types are included, there are only 152 + 18 = 170 observations. Thus, regardless of whether the exterior orientation parameters were observed, a least squares solution is possible, since the number of observations in either case (218 or 170) is greater than the number of unknowns (126).

Chapter 17 01/14/19 Virtual Environment Lab, UTA 67

The next question is how these equations are solved. We start with the observation equations, which are the collinearity condition equations that we have already seen, linearize them, and then use a least squares procedure to find the unknowns. We will start by refreshing our memories on the least squares solution of an over-determined equation set.

RPC - Conclusion

• The RPC camera model provides a simple, fast and accurate representation of the IKONOS physical camera model.
• If a priori knowledge of the exposure station position and angles permits a small-angle approximation, then adjustment of the exterior orientation reduces to a simple bias in image space.
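The bookkeeping in the photo-block example is easy to verify mechanically. A short Python sketch (variable names are mine, not from the slides):

```python
# Block from the example above: 2 strips x 4 photos per strip,
# 20 pass points + 6 control points = 26 object points.
n_photos = 8
n_object_points = 26
n_control_points = 6
n_point_images = 4 * 8 + 4 * 11  # photos 1,4,5,8 image 8 points each; 2,3,6,7 image 11 each

unknowns = n_object_points * 3 + n_photos * 6   # object coordinates + exterior orientation

obs_photo = n_point_images * 2                  # x, y per imaged point
obs_control = n_control_points * 3              # X, Y, Z per control point
obs_eo = n_photos * 6                           # only if exterior orientation is observed

print(unknowns)                                 # 126
print(obs_photo + obs_control + obs_eo)         # 218
print(obs_photo + obs_control)                  # 170 (still > 126, so least squares works)
```

In both cases the redundancy (observations minus unknowns) is positive: 92 with exterior orientation observed, 44 without.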
• Due to the high accuracy of IKONOS, block adjustment can be accomplished in image space even without ground control.
• RPC models are equally applicable to a variety of imaging systems, and so could become a standardized representation of their image geometry.
• Simulation and numerical examples show that this method is as accurate as ground-station block adjustment with the physical camera model.

Finally, let's review all the topics that we have covered...

Summary

The mathematical concepts covered today were: 1.
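As a closing illustration of the RPC discussion: an RPC model gives each image coordinate as a ratio of polynomials in normalized ground coordinates, and under the small-angle approximation the exterior orientation adjustment reduces to a constant image-space offset. A toy sketch with a 3-term basis and made-up coefficients (real RPC models use 20-term cubic polynomials in latitude, longitude and height):

```python
def rpc_project(lat, lon, num_coef, den_coef):
    """Toy RPC: one image coordinate as a ratio of polynomials in
    normalized ground coordinates. Basis here is [1, lat, lon];
    real RPCs use a 20-term cubic basis in (lat, lon, h)."""
    basis = (1.0, lat, lon)
    num = sum(c * t for c, t in zip(num_coef, basis))
    den = sum(c * t for c, t in zip(den_coef, basis))
    return num / den

# Made-up coefficients, not from any real sensor.
row = rpc_project(0.1, 0.2, (0.5, 2.0, -1.0), (1.0, 0.01, 0.02))  # about 0.4975

# Small-angle exterior orientation errors collapse to a per-image bias,
# so the block adjustment estimates just a constant offset in image space.
bias = 0.003
row_adjusted = row + bias
```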