An Integrated INS/GPS Approach to the Georeferencing of Remotely Sensed Data
PEER-REVIEWED ARTICLE

K. P. Schwarz, M. A. Chapman, M. E. Cannon, and P. Gong
Department of Geomatics Engineering, The University of Calgary, 2500 University Drive NW, Calgary, Alberta T2N 1N4, Canada

Photogrammetric Engineering & Remote Sensing, Vol. 59, No. 11, November 1993, pp. 1667-1674.
0099-1112/93/5911-1667$03.00/0
©1993 American Society for Photogrammetry and Remote Sensing

Abstract
A general model for the georeferencing of remotely sensed data by an onboard positioning and orientation system is presented as a problem of rigid body motion. The determination of the six independent parameters of motion by discrete measurements from inertial and satellite systems is directly related to the problem of exterior orientation. The contribution of each measuring system to the determination of the three translational and three rotational parameters is treated in detail, with emphasis on the contribution of inertial navigation systems (INS) and single- and multi-antenna receivers of the Global Positioning System (GPS). The advantages of an integrated INS/GPS approach are briefly discussed. Positioning and orientation accuracies obtainable from available systems are then highlighted using selected results to emphasize significant points. The implementation of the general georeferencing concept is demonstrated by brief descriptions of a number of projects in which The University of Calgary group is currently involved. They include aerial photography applications, airborne tests with pushbroom imagers, motion compensation for SLAR systems, and the development of a mobile survey system for a road inventory GIS using digital frame cameras. The paper concludes with a brief discussion of some application areas which offer a high potential for future development.

Introduction
Aerial remote sensing has evolved from the exclusive use of film-based optical sensors to fully digital electro-optical and active electronic sensors, with multispectral capabilities in many cases. These sensors can be roughly categorized as frame-based, as in the case of aerial cameras, or as line scanners (or pushbroom imagers), as represented by the MEIS system (Gibson et al., 1983) or the casi system (Babey and Anger, 1989).

Traditionally, frame imagery has relied on the process of visually or digitally correlating ground control points with their corresponding images for achieving georeferencing. This process, which involves the estimation of the position and attitude information for each exposure, has become commonly known as aerotriangulation. It is efficiently accomplished due to the stable geometric nature of frame-based imagery along with the use of ground control points (GCPs) of sufficient number, distribution, and accuracy. Unfortunately, the costs associated with the establishment of these GCPs have represented a significant portion of the aerotriangulation budget. In some cases, these costs can be prohibitive, especially when imagery is to be acquired and georeferenced in remote areas such as are found in many developing countries.

In the case of pushbroom imagery, the georeferencing problem is further complicated by the need for instantaneous position and attitude information at each line of imagery. Ground control alone is not sufficient to resolve all of the three position and three orientation (attitude) parameters associated with each exposure station, of which there may be thousands in a single mission. In order to overcome these problems, systems have been proposed whereby multiple look angles are simultaneously recorded with slight variations in alignment (Hofmann et al., 1988).

Another approach has been the use of auxiliary position and navigation sensors flown simultaneously during imaging missions. Output of these sensors is used to determine the six parameters of exterior orientation, either completely or partially, and thus eliminate the need for dense ground control. This approach will be called onboard exterior orientation in the following. Such sensors have included statoscopes, laser profilers, stellar cameras, inertial navigation systems (INS), and Global Positioning System (GPS) receivers. Each of these systems has the potential to resolve all or a subset of the six parameters required for individual exposure stations. Only recently have two of these systems, namely INS and GPS, been fully integrated such that all six orientation parameters are determinable with sufficient accuracy at any instant of time.

Further problems are encountered when using multi-look side-looking radar (SLAR), for which aircraft velocity information is required in order to compensate for the finite time interval over which the image is generated. In this case, real-time motion compensation is needed which, from an operational point of view, puts a number of constraints on the model implementation. Velocity information is directly available from an integrated INS/GPS system which is proposed to be flown simultaneously onboard the remote sensing aircraft.

This paper discusses the underlying principles of onboard exterior orientation through the use of a fully integrated INS/GPS positioning and orientation system, highlights some of the results achieved, and outlines some future plans. The proposed method has demonstrated initial success and is being tested in an ongoing manner with various imaging sensor arrangements, including frame cameras, pushbroom scanners, and SLAR.

Remote Sensing with Onboard Exterior Orientation
Conceptually, the problem of exterior orientation can be reduced to defining the transformation between the sensor-generated images or records and the coordinate system in which the results are required. For convenience, the latter will be called the mapping frame and will be denoted by m.

Figure 1. The relation between the mapping frame and the body frame. (Figure not reproduced.)

In the following, vectors are denoted by lower case letters, matrices by upper case letters. Vector superscripts define the coordinate frame in which the vector elements are expressed. Matrix superscripts and subscripts indicate the direction of the rotation, which is from subscript to superscript. Thus, $R_m^b$ defines the rotation from the m-frame to the b-frame. The inverse rotation is simply given by an interchange of subscript and superscript; thus, $R_b^m = (R_m^b)^T$ is the inverse of $R_m^b$. In terms of the rotation angles $\omega$, $\varphi$, and $\kappa$, the analytical form of the rotation matrix $R_m^b$ is

$$
R_m^b = \begin{bmatrix}
\cos\kappa\cos\varphi - \sin\kappa\sin\omega\sin\varphi & -\sin\kappa\cos\omega & \cos\kappa\sin\varphi + \sin\kappa\sin\omega\cos\varphi \\
\sin\kappa\cos\varphi + \cos\kappa\sin\omega\sin\varphi & \cos\kappa\cos\omega & \sin\kappa\sin\varphi - \cos\kappa\sin\omega\cos\varphi \\
-\cos\omega\sin\varphi & \sin\omega & \cos\omega\cos\varphi
\end{bmatrix} \tag{2}
$$

Note that these rotations correspond to the ones often used in photogrammetry if the frame conventions for b and m are the same. They express rotations from the system of ground coordinates to the system of photo coordinates at a point where the z-axis of the ground coordinate system coincides with the ellipsoidal normal. The main difference between the control coordinate system and the local geodetic system is that one is a planar, and the other an ellipsoidal, approximation of the Earth's surface.
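As a numerical illustration (not part of the original paper), Equation 2 can be evaluated directly; the function name and the NumPy dependency are assumptions of this sketch:

```python
import numpy as np

def rotation_m_to_b(omega, phi, kappa):
    """Rotation matrix R_m^b of Equation 2 (mapping frame -> body frame),
    built from the rotation angles omega, phi, kappa in radians."""
    so, co = np.sin(omega), np.cos(omega)
    sp, cp = np.sin(phi), np.cos(phi)
    sk, ck = np.sin(kappa), np.cos(kappa)
    return np.array([
        [ck * cp - sk * so * sp, -sk * co, ck * sp + sk * so * cp],
        [sk * cp + ck * so * sp,  ck * co, sk * sp - ck * so * cp],
        [-co * sp,                so,      co * cp],
    ])

# The inverse rotation R_b^m follows by interchanging subscript and
# superscript, i.e., by transposition, since the matrix is orthonormal.
R = rotation_m_to_b(0.01, -0.02, 1.2)
assert np.allclose(R.T @ R, np.eye(3))       # orthonormal
assert np.isclose(np.linalg.det(R), 1.0)     # proper rotation
```

Because the inverse is just the transpose, no matrix inversion is needed when switching between the b-frame and the m-frame, which is convenient when the transformation has to be applied at high data rates.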
The mapping frame can be a system of curvilinear geodetic coordinates (latitude, longitude, height), a system of UTM or 3TM coordinates, or any other conveniently chosen Earth-fixed reference system. In order to transform the sensor output to the mapping frame, three essential steps are necessary. First, the motion of the sensor frame with respect to the Earth-fixed mapping frame has to be determined. Second, the image coordinates have to be corrected for both the rotational and the translational parts of this motion. Third, the corrected image coordinates have to be transformed into the mapping frame. The total procedure is usually called georeferencing.

The concept of georeferencing is shown in Figure 1. The airborne sensor has a coordinate system assigned to it which changes position and orientation with respect to the mapping frame (m) as a function of time. The sensor frame is usually called the body frame (b) and corresponds to the photo coordinate frame in photogrammetry. If the sensor is rigidly mounted in the aircraft, the changes of the b-frame with respect to the m-frame will correspond to those of the aircraft.

Georeferencing is possible if, at any instant of time (t), the position of the perspective center of the camera or the scanner is given in coordinates of the m-frame, i.e., $r^m$, and the rotation matrix $R_b^m$ has been determined (see Figure 1). The georeferencing equation can then be written as

$$
r_i^m = r^m + s_i\,R_b^m\,p^b \tag{3}
$$

where $r_i^m$ is the position vector of the imaged point in the chosen mapping system, $r^m$ is the position of the center of the remote sensing device, $s_i$ is a scale factor derived from the height of the sensor above ground, and $p^b$ is the vector between the projective center and an arbitrary point (x, y) in the image plane, given in the b-frame. The vector $p^b$ may be rotated into the m-frame as $p^m$ and is then given by

$$
p^m = R_b^m\,p^b = R_b^m \begin{bmatrix} x \\ y \\ -f \end{bmatrix} = \begin{bmatrix} x' \\ y' \\ z' \end{bmatrix} \tag{4}
$$
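To make Equations 3 and 4 concrete, here is a minimal numerical sketch; the function name and the sample numbers are illustrative assumptions, not taken from the paper. In a real system, $r^m$ and the rotation matrix come from the INS/GPS solution at time t.

```python
import numpy as np

def georeference(r_m, R_b_m, xy, f, s_i):
    """Equation 3: r_i^m = r^m + s_i * R_b^m * p^b.

    r_m   -- perspective-center position in the mapping frame (3-vector)
    R_b_m -- rotation matrix from the body frame to the mapping frame
    xy    -- image coordinates (x, y) in the b-frame
    f     -- sensor focal length (same units as xy)
    s_i   -- scale factor derived from the sensor height above ground
    """
    p_b = np.array([xy[0], xy[1], -f])   # image vector of Equation 4
    return r_m + s_i * (R_b_m @ p_b)

# Vertical photo over flat ground: camera 1000 m up, level attitude.
# The principal point (0, 0) then maps straight down to the ground.
r_m = np.array([500.0, 800.0, 1000.0])
ground = georeference(r_m, np.eye(3), (0.0, 0.0), f=0.15, s_i=1000.0 / 0.15)
# ground == [500., 800., 0.]
```

The same call applies to any image point once its scale factor is known; over non-flat terrain, $s_i$ differs from point to point.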
In Equation 4, (x, y) are the image coordinates in the b-frame and f is the sensor focal length, while (x′, y′, z′) are the coordinates of the image vector transformed into the mapping frame. For a line scanner, y would be the pixel displacement along the scan line and x would be zero. A more detailed discussion of line scanner georeferencing is given in Cosandier et al. (1992).

Equation 3 is the simplest form of the georeferencing equation. It relates points in the image plane to points on the ground, using the defined mapping frame. In general, however, small differential movements between the aircraft frame and the sensor frame can occur, and these have to be measured. For simplicity, it will be assumed that the b-frame is defined such that forward motion defines the $y_b$-axis; sideward motion, to the right of the forward motion, the $x_b$-axis; and upward motion, completing the three-dimensional Cartesian frame, the $z_b$-axis. Assuming that the mapping frame is a curvilinear geodetic coordinate system (latitude, longitude,
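For the line-scanner case noted above, Equation 3 has to be evaluated with the position and attitude valid for each individual scan line. A hedged sketch follows; the function name, the flat-terrain scale factor, and the sample values are assumptions for illustration, not the paper's implementation:

```python
import numpy as np

def georeference_line(r_m_t, R_b_m_t, y_pixels, f, s_i):
    """Apply Equation 3 to one scan line of a pushbroom imager.

    For a line scanner, x = 0 and y is the pixel displacement along the
    scan line; r_m_t and R_b_m_t are the INS/GPS-derived position and
    attitude valid at the exposure time t of this particular line.
    A single s_i per line assumes flat terrain, for simplicity.
    """
    out = []
    for y in y_pixels:
        p_b = np.array([0.0, y, -f])              # Equation 4 image vector
        out.append(r_m_t + s_i * (R_b_m_t @ p_b))
    return np.array(out)

# One level scan line from 1000 m altitude, three pixels across track.
line = georeference_line(np.array([0.0, 0.0, 1000.0]), np.eye(3),
                         y_pixels=[-0.03, 0.0, 0.03], f=0.15,
                         s_i=1000.0 / 0.15)
```

Looping over all scan lines, each with its own interpolated $r^m(t)$ and $R_b^m(t)$, yields the georeferenced strip; this is exactly why the per-line position and attitude from the integrated INS/GPS solution are indispensable for pushbroom imagery.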