Mathematical Foundation of Photogrammetry (Part of EE5358)
Dr. Venkat Devarajan
Ms. Kriti Chauhan

Photogrammetry
photo = "picture", grammetry = "measurement"; therefore photogrammetry = "photo-measurement".
Photogrammetry is the science or art of obtaining reliable measurements by means of photographs.
Formal definition: "Photogrammetry is the art, science and technology of obtaining reliable information about physical objects and the environment, through processes of recording, measuring, and interpreting photographic images and patterns of recorded radiant electromagnetic energy and other phenomena." (As given by the American Society for Photogrammetry and Remote Sensing, ASPRS.)

Distinct Areas
Metric Photogrammetry
• Making precise measurements from photos to determine the relative locations of points.
• Finding distances, angles, areas, volumes, elevations, and sizes and shapes of objects.
• Most common applications:
1. preparation of planimetric and topographic maps
2. production of digital orthophotos
3. military intelligence, such as targeting
Interpretative Photogrammetry
• Deals in recognizing and identifying objects and judging their significance through careful and systematic analysis.
• Comprises photographic interpretation and remote sensing (the latter including the use of multispectral cameras, infrared cameras, thermal scanners, etc.).

Uses of Photogrammetry
Products of photogrammetry:
1. Topographic maps: detailed and accurate graphic representations of cultural and natural features on the ground.
2. Orthophotos: aerial photographs modified so that their scale is uniform throughout.
3. Digital Elevation Models (DEMs): arrays of points in an area whose X, Y and Z coordinates have been determined.
Current applications:
1. Land surveying
2. Highway engineering
3. Preparation of tax maps, soil maps, forest maps, geologic maps, and maps for city and regional planning and zoning
4. Traffic management and traffic accident investigations
5. Military: digital mosaics, mission planning, rehearsal, targeting, etc.

Types of Photographs
• Aerial
  - Vertical
    - Truly vertical
    - Tilted (1° < angle < 3°)
  - Oblique
    - Low oblique (does not include the horizon)
    - High oblique (includes the horizon)
• Terrestrial
Of all these types of photographs, vertical and low oblique aerial photographs are of most interest to us, as they are the ones most extensively used for mapping purposes…

Aerial Photography
• Vertical aerial photographs are taken along parallel passes called flight strips.
• The overlap of successive photographs along a flight strip is called end lap, and is usually about 60%.
• The area of common coverage is called the stereoscopic overlap area, and a pair of overlapping photos is called a stereopair.
• The position of the camera at each exposure is called the exposure station, and the altitude of the camera at the time of exposure is called the flying height.
• The lateral overlap of adjacent flight strips is called side lap (usually 30%).
• The photographs of two or more sidelapping strips used to cover an area are referred to as a block of photos (see the sketch after this list).
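As a quick numerical aside (not part of the original slides), the minimal sketch below shows how the end lap and side lap percentages translate into exposure spacing, assuming a truly vertical photograph over flat terrain; the focal length, flying height and 23 cm square format are illustrative example values, and all names in the code are our own.

```python
# Minimal sketch (not from the slides): how end lap and side lap translate
# into exposure spacing for a truly vertical photo over flat terrain.
# The functions, names and example numbers below are illustrative assumptions.

def ground_coverage(format_size, focal_length, flying_height):
    """Ground distance covered by one photo side: G = d * H' / f,
    where H' is the flying height above the ground."""
    return format_size * flying_height / focal_length

def air_base(coverage, end_lap):
    """Spacing between successive exposure stations along a flight strip."""
    return coverage * (1.0 - end_lap)

def strip_spacing(coverage, side_lap):
    """Spacing between adjacent flight strips."""
    return coverage * (1.0 - side_lap)

if __name__ == "__main__":
    f, H, d = 0.152, 1520.0, 0.23      # focal length, flying height, format (m)
    G = ground_coverage(d, f, H)       # 2300 m of ground per photo side
    print(f"photo scale   1:{H / f:.0f}")                  # 1:10000
    print(f"air base      {air_base(G, 0.60):.0f} m")      # 60% end lap -> 920 m
    print(f"strip spacing {strip_spacing(G, 0.30):.0f} m") # 30% side lap -> 1610 m
```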
Now, let's examine the acquisition devices for these photographs…

Pass Points for Aerotriangulation
• Selected as 9 points per photo, in a format of 3 rows × 3 columns, equally spaced over the photo.
• The points may be images of natural, well-defined objects that appear in the required photo areas.
• If such points are not available, pass points may be artificially marked.
• Digital image matching can be used to select points in the overlap areas of digital images and automatically match them between adjacent images.
• This is an essential step of "automatic aerotriangulation".

Analytical Aerotriangulation
The most elementary approach consists of the following basic steps:
1. relative orientation of each stereomodel,
2. connection of adjacent models to form continuous strips and/or blocks, and
3. simultaneous adjustment of the photos from the strips and/or blocks to field-surveyed ground control.
X and Y coordinates of pass points can be located to an accuracy of 1/15,000 of the flying height, and Z coordinates can be located to an accuracy of 1/10,000 of the flying height. With specialized equipment and procedures, planimetric accuracy of 1/350,000 of the flying height and vertical accuracy of 1/180,000 have been achieved.

Analytical Aerotriangulation Technique
• Several variations exist.
• Basically, all methods consist of writing equations that express the unknown elements of exterior orientation of each photo in terms of camera constants, measured photo coordinates, and ground coordinates.
• The equations are solved to determine the unknown orientation parameters, and either simultaneously or subsequently, the coordinates of the pass points are calculated.
• By far the most common condition equations used are the collinearity equations (written out below this list).
• Analytical procedures like bundle adjustment can simultaneously enforce the collinearity condition on hundreds of photographs.
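For reference, the collinearity equations mentioned in the list above are usually written in the following standard textbook form (the omega-phi-kappa notation below is the conventional one and is not reproduced from these slides):

\[
x_a = x_o - f\,\frac{m_{11}(X_A - X_L) + m_{12}(Y_A - Y_L) + m_{13}(Z_A - Z_L)}{m_{31}(X_A - X_L) + m_{32}(Y_A - Y_L) + m_{33}(Z_A - Z_L)}
\]
\[
y_a = y_o - f\,\frac{m_{21}(X_A - X_L) + m_{22}(Y_A - Y_L) + m_{23}(Z_A - Z_L)}{m_{31}(X_A - X_L) + m_{32}(Y_A - Y_L) + m_{33}(Z_A - Z_L)}
\]

where $(x_a, y_a)$ are the photo coordinates of the image of object point $A$; $(x_o, y_o)$ and $f$ are the principal point coordinates and focal length (the camera constants); $(X_A, Y_A, Z_A)$ and $(X_L, Y_L, Z_L)$ are the object space coordinates of the point and of the exposure station; and $m_{ij}$ are the elements of the rotation matrix formed from the angular attitude $(\omega, \varphi, \kappa)$ of the photo.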
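A minimal runnable sketch of the same projection follows, assuming the common omega-phi-kappa rotation convention; the function names, argument names and example numbers are ours, not taken from the slides.

```python
# Sketch (not from the slides): evaluating the collinearity equations for one
# object point, using the common omega-phi-kappa rotation convention.
# All names and example values are illustrative assumptions.
import numpy as np

def rotation_matrix(omega, phi, kappa):
    """Object-to-photo rotation matrix M = M_kappa @ M_phi @ M_omega."""
    co, so = np.cos(omega), np.sin(omega)
    cp, sp = np.cos(phi),   np.sin(phi)
    ck, sk = np.cos(kappa), np.sin(kappa)
    M_omega = np.array([[1.0, 0.0, 0.0], [0.0,  co,  so], [0.0, -so,  co]])
    M_phi   = np.array([[ cp, 0.0, -sp], [0.0, 1.0, 0.0], [ sp, 0.0,  cp]])
    M_kappa = np.array([[ ck,  sk, 0.0], [-sk,  ck, 0.0], [0.0, 0.0, 1.0]])
    return M_kappa @ M_phi @ M_omega

def collinearity(obj_pt, exp_station, angles, f, principal_point=(0.0, 0.0)):
    """Project an object space point into photo coordinates (x, y)."""
    M = rotation_matrix(*angles)
    d = M @ (np.asarray(obj_pt, float) - np.asarray(exp_station, float))
    x0, y0 = principal_point
    return x0 - f * d[0] / d[2], y0 - f * d[1] / d[2]

if __name__ == "__main__":
    # Truly vertical photo, 152 mm focal length, exposure station 1520 m above
    # the ground; the object point lies 100 m east of the ground nadir point.
    x, y = collinearity(obj_pt=(100.0, 0.0, 0.0),
                        exp_station=(0.0, 0.0, 1520.0),
                        angles=(0.0, 0.0, 0.0), f=0.152)
    print(f"x = {x * 1000:.2f} mm, y = {y * 1000:.2f} mm")   # ~ +10 mm, 0 mm
```

In a bundle adjustment these equations are linearized about initial approximations and the corrections are found by least squares, which is exactly the iteration described on the slides that follow.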
Simultaneous Bundle Adjustment
Adjusting all photogrammetric measurements to ground control values in a single solution is known as a bundle adjustment. The process is so named because of the many light rays that pass through each lens position, constituting a bundle of rays. The bundles from all photos are adjusted simultaneously so that corresponding light rays intersect at the positions of the pass points and control points on the ground. After the normal equations have been formed, they are solved for the unknown corrections to the initial approximations for the exterior orientation parameters and object space coordinates. The corrections are then added to the approximations, and the procedure is repeated until the estimated standard deviation of unit weight converges.

Quantities in Bundle Adjustment
The unknown quantities to be obtained in a bundle adjustment consist of:
1. the X, Y and Z object space coordinates of all object points, and
2. the exterior orientation parameters of all photographs.
The observed (measured) quantities associated with a bundle adjustment are:
1. x and y photo coordinates of images of object points,
2. X, Y and/or Z coordinates of ground control points, and
3. direct observations of the exterior orientation parameters of the photographs.
The first group of observations, photo coordinates, are the fundamental photogrammetric measurements. The next group of observations are the coordinates of control points determined through field survey. The final set of observations can be estimated using an airborne GPS control system as well as inertial navigation systems (INSs), which have the capability of measuring the angular attitude of a photograph.

Bundle Adjustment on a Photo Block
Consider a small block consisting of 2 strips with 4 photos per strip, with 20 pass points and 6 control points, totaling 26 object points; 6 of those also serve as tie points connecting the two adjacent strips. Photos 1, 4, 5 and 8 have 8 imaged points each, and photos 2, 3, 6 and 7 have 11 imaged points each, so the number of imaged points is 4 × 8 + 4 × 11 = 76 point images in total.
In this case:
• The number of unknown object coordinates = no. of object points × no. of coordinates per object point = 26 × 3 = 78.
• The number of unknown exterior orientation parameters = no. of photos × no. of exterior orientation parameters per photo = 8 × 6 = 48.
• Total number of unknowns = 78 + 48 = 126.
• The number of photo coordinate observations = no. of imaged points × no. of photo coordinates per point = 76 × 2 = 152.
• The number of ground control observations = no. of 3D control points × no. of coordinates per point = 6 × 3 = 18.
• The number of exterior orientation parameter observations = no. of photos × no. of exterior orientation parameters per photo = 8 × 6 = 48.
If all three types of observations are included, there will be a total of 152 + 18 + 48 = 218 observations; but if only the first two types are included, there will be only 152 + 18 = 170 observations. Thus, regardless of whether the exterior orientation parameters were observed, a least squares solution is possible, since the number of observations in either case (218 or 170) is greater than the number of unknowns (126).

The next question is: how are these equations solved? We start with the observation equations, which are the collinearity condition equations that we have already seen, linearize them, and then use a least squares procedure to find the unknowns. We will start by refreshing our memories on the least squares solution of an over-determined equation set.

RPC - Conclusion
• The RPC camera model provides a simple, fast and accurate representation of the IKONOS physical camera model.
• If the a-priori knowledge of exposure station position and angles permits a small-angle approximation, then adjustment of the exterior orientation reduces to a simple bias in image space.
• Due to the high accuracy of IKONOS, block adjustment can be accomplished in image space even without ground control.
• RPC models are equally applicable to a variety of imaging systems and so could become a standardized representation of their image geometry.
• From simulation and numerical examples, it is seen that this method is as accurate as ground station block adjustment with the physical camera model.

Finally, let's review all the topics that we have covered…

Summary
The mathematical concepts covered today were:
1.