PositionIT Jan/Feb 2006, p. 39 – Visualisation technical
High resolution satellite imaging systems – an overview
by Dr.-Ing. Karsten Jacobsen, Hannover University, Germany

More and more high and very high resolution optical space sensors are becoming available, and Synthetic Aperture Radar (SAR) sensors with a ground sampling distance (GSD) of up to 1 m have been announced for the near future. The various systems are not always well known, and this paper discusses the systems available for topographic mapping.

With the higher resolution of images taken by satellites, a competition between aerial images and space data exists, starting at a map scale of 1:5000.

Based on experience, optical images should have a ground sampling distance (GSD) of approximately 0,05 mm up to 0,1 mm in the map scale, corresponding to a map scale of 1:20 000 up to 1:10 000 for a GSD of 1 m. The GSD is the distance between the centres of neighbouring pixels projected onto the ground. Because of over- or under-sampling, the GSD is not identical to the projected size of a pixel, but for the user the GSD appears as the pixel size on the ground. Over- or under-sampling only influences the image contrast, which may also be affected by the atmosphere.

Mapping today involves data acquisition for geo-information systems (GIS). In a GIS, positions are available in their national coordinates, so in theory a GIS is independent of the map scale, but the information content corresponds to a publishing scale. In no case is the full information available in a GIS: for a large presentation scale the generalisation starts with the size of the building extensions which are included, while for small scales the full effect of generalisation is required. So for large presentation scales more details have to be identified in the images, while for smaller scales a larger GSD may be sufficient. If the GSD exceeds 5 m, not all details usually shown at the corresponding publishing scale can be identified.

Not only optical images have to be taken into account, because in the near future high resolution synthetic aperture radar (SAR) images will be available. Radar has the advantage of penetrating clouds, so mapping is also possible in rain-forest areas.

Within a few years there will also be an alternative to both satellite images and aerial images, coming from high altitude long endurance (HALE) unmanned aerial vehicles (UAVs) with an operating altitude in the range of 20 km.

Images are, however, not accessible from all systems: while they may not be classified, sometimes no distribution channels exist and it is difficult to order the images.

Details of imaging sensors

View direction
The first imaging satellites had a fixed view direction in relation to the orbit. Only with panoramic cameras, scanning from one side to the other, was the swath width enlarged. For stereoscopic coverage, a combination of cameras with different longitudinal view directions was used, as in the CORONA 4 series and later in the MOMS, ASTER, SPOT 5 HRS and Cartosat-1 systems. With SPOT, the change of the view direction across the orbit came through a steerable mirror. IRS-1C and -1D have the possibility of rotating the whole panchromatic camera in relation to the satellite; this requires fuel, and so it has not been used very often. IKONOS, launched in 1999, was the first civilian reconnaissance satellite with a flexible view direction. Such satellites are equipped with high-torque reaction wheels for all axes. If these reaction wheels are slowed down or accelerated, a moment is transferred to the satellite and it rotates. No fuel is required for this, only electric energy from the solar panels.

TDI-sensors
The optical space sensors fly at an altitude corresponding to a ground speed of approximately 7 km/s at the nadir point, so for a GSD of 1 m only about 0,14 ms of exposure time is available. This is not a sufficient integration time for the generation of an acceptable image quality, and for this reason some of the very high resolution space sensors are equipped with time delay and integration (TDI) sensors. The TDI-sensors used in space are CCD-arrays with small dimensions in the flight direction. The charge generated by the energy reflected from the ground is shifted at the speed of the image motion to the next CCD-element, where more charge is added to the charge collected by the first CCD-element, so a larger charge is summed up over several CCD-elements. There are some limits due to inclined view directions and vibrations, so in most cases the energy is summed over 13 CCD-elements. IKONOS, QuickBird and OrbView-3 are equipped with TDI-sensors, while EROS-A and the Indian TES do not have these and have to increase the integration time by a permanent rotation of the satellite during imaging (see Fig. 1). QuickBird also uses this technique, because the sensor was originally planned for the same flying altitude as IKONOS; with the allowance of a smaller GSD the flying height was reduced, resulting in a smaller pixel size. The sampling rate of 6900 lines/s could not be increased, and this has to be compensated by a change of the view direction during imaging, but with a significantly smaller factor compared to EROS-A and TES.

Fig. 1: Increase of integration time by factor b/a through continuous change of view direction.

CCD-configuration
Most of the sensors do not have just one CCD-line but a combination of shorter CCD-lines or small CCD-arrays. The CCD-lines are shifted with respect to each other (see Fig. 2). The merging of the sub-images produced by the panchromatic CCD-lines belongs to the inner orientation, and the user does not notice it. Usually the matching accuracy of the corresponding sub-images is in the lower sub-pixel range, so the geometry of the mosaicked image does not show any influence. This may be different for the larger offset of the colour CCD-lines. Stationary objects are fused without any problems during the pan-sharpening process; in theory, only in extremely mountainous areas can minor effects be seen. This is different for moving objects: the time delay of the colour against the panchromatic image causes different locations for the intensity and the colour, with the different colour bands following the intensity. This effect is unimportant for mapping, because only stationary objects are used.

Fig. 2: Arrangement of CCD-lines in the focal plane. Above: panchromatic; below: multi-spectral.

Staggered CCD-lines
The ground resolution can be improved by staggered CCD-lines (Fig. 3). Two CCD-lines are used, shifted by half a pixel with respect to each other, so more detail can be seen in the generated images. With SPOT 5 the physical pixel size projected to the ground is 5 m, and based on the staggered CCD-lines the supermode has a GSD of 2,5 m. In theory this corresponds to the information content of an image with 3 m GSD.

Fig. 3: Staggered CCD-lines.

Multi-spectral information
Sensors usable for topographic mapping are sensitive in the visible and near infrared (NIR) spectral range. The blue range, with a wavelength of 420 – 520 nm, is not used by all sensors because of the higher atmospheric scattering, which reduces the contrast. In most cases the multi-spectral information is collected with a larger GSD than the panchromatic. With so-called pan-sharpening, the lower resolution multi-spectral information can be merged with the higher resolution panchromatic to achieve a higher resolution colour image. Pan-sharpening exploits the characteristic of the human eye of being more sensitive to grey values than to colour. A linear relation of 4 between panchromatic and colour GSD is common.

The panchromatic range does not correspond to the original definition – the visible spectral range. Often the blue range is cut off and the NIR is added, giving a spectral range of approximately 500 to 900 nm.

Imaging problems
Modern CCD-sensors used in space have a radiometric resolution of up to 11 bit, corresponding to 2048 different grey values. Usually there is not a good distribution of the grey values over the whole histogram, so the important part can be optimised for presentation with the 8-bit grey values of a computer screen. The higher radiometric resolution has the advantage of an optimal use of the grey values in extreme cases, such as bright roofs alongside a shadow. But even for 11-bit sensors there are some limits: if sunlight is reflected by a glass roof directly into the sensor, over-saturation will occur, the generated electrons will flow to the neighbouring CCD-elements, and the read-out will be influenced for a short time. The over-saturation (Fig. 4) does not cause problems, but the human operator should know about it to avoid a misinterpretation of the objects.

Direct sensor orientation
The satellites are equipped with a positioning system such as GPS, gyroscopes and star sensors, so the geo-location can be determined without control points. For example, IKONOS can determine the imaged positions with a standard deviation of approximately 4 m. Often further problems exist with national datums that are not well known.

Imaging satellites
Imaging satellites were first used for military reconnaissance: 20 months after the launch of SPUTNIK in October 1957, the US tests with the CORONA system started in 1959. For reconnaissance, the USA used film up to 1963, while the Soviet Union and later Russia made the last satellite photo flight in 2000. The historical images were declassified by the USA in 1995.
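The rule of thumb relating GSD and map scale (a GSD of 0,05 – 0,1 mm in the map scale) can be turned into a small calculation. This is a minimal sketch with illustrative function names, not any standard tool:

```python
def map_scale_range(gsd_m, rule_mm=(0.05, 0.1)):
    """Map scale numbers supported by a given GSD, using the rule of
    thumb that the GSD should correspond to 0.05-0.1 mm at map scale.
    Returns (largest, smallest) scale number, e.g. (10000, 20000)."""
    gsd_mm = gsd_m * 1000.0  # GSD on the ground, in millimetres
    # scale number = ground distance / distance on the map
    return (round(gsd_mm / rule_mm[1]), round(gsd_mm / rule_mm[0]))

# A 1 m GSD supports map scales of about 1:10 000 to 1:20 000,
# matching the figures quoted in the text.
print(map_scale_range(1.0))  # -> (10000, 20000)
```

The same calculation gives roughly 1:5000 as the largest scale for a 0,5 m GSD, which is where the competition with aerial images begins.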
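The integration-time argument for TDI can be checked numerically: the available exposure time is the GSD divided by the speed of the image footprint, and TDI multiplies it by the number of summing stages. A sketch under these assumptions (function names are my own):

```python
def exposure_time_ms(gsd_m, ground_speed_m_s=7000.0):
    """Time in ms during which one pixel footprint stays on one CCD
    element, for an image moving at about 7 km/s at the nadir point."""
    return gsd_m / ground_speed_m_s * 1000.0

def tdi_integration_ms(gsd_m, stages=13, ground_speed_m_s=7000.0):
    """Effective integration time when the charge is summed over
    several TDI stages (13 is the limit quoted in the text)."""
    return stages * exposure_time_ms(gsd_m, ground_speed_m_s)

print(round(exposure_time_ms(1.0), 3))    # ~0.143 ms for 1 m GSD
print(round(tdi_integration_ms(1.0), 2))  # ~1.86 ms over 13 stages
```

This also shows why sensors without TDI (EROS-A, TES) must slow the image motion by rotating the satellite: it is the only remaining way to lengthen the exposure.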
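The staggered CCD-line idea can be illustrated in one dimension: two lines shifted by half a pixel interleave into a single line with half the sampling distance, as in the SPOT 5 supermode. A toy sketch, not flight software:

```python
def interleave_staggered(line_a, line_b):
    """Combine samples from two CCD-lines shifted by half a pixel:
    the result samples the ground at half the original pixel
    spacing (e.g. two 5 m lines -> a 2.5 m sampling grid)."""
    out = []
    for a, b in zip(line_a, line_b):
        out.extend([a, b])
    return out

print(interleave_staggered([10, 30], [20, 40]))  # -> [10, 20, 30, 40]
```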
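The histogram optimisation mentioned for 11-bit data is, in its simplest form, a linear contrast stretch of the occupied part of the grey-value range onto the 8-bit screen range. A stdlib-only sketch with illustrative names, clipping the extreme tails:

```python
def stretch_to_8bit(pixels, low_pct=1.0, high_pct=99.0):
    """Linearly map the central part of an 11-bit histogram
    (values 0..2047) onto the 8-bit range 0..255, clipping the
    tails so the important grey values use the full range."""
    ordered = sorted(pixels)
    n = len(ordered)
    lo = ordered[min(n - 1, int(n * low_pct / 100.0))]
    hi = ordered[min(n - 1, int(n * high_pct / 100.0))]
    span = max(hi - lo, 1)
    return [max(0, min(255, round((p - lo) * 255.0 / span))) for p in pixels]

# An 11-bit image occupying only the values 200..1200 is spread
# over the full 8-bit range instead of a dark, low-contrast band.
sample = list(range(200, 1200, 10))
out = stretch_to_8bit(sample)
print(min(out), max(out))  # -> 0 255
```

The clipping percentiles are the user's choice; the bright-roof-beside-shadow case from the text is exactly where a wider retained range pays off.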
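Pan-sharpening as described (lower resolution colour merged with higher resolution panchromatic) can be illustrated with a Brovey-style ratio merge. This is one common scheme, not necessarily the one any particular provider uses, and all names here are illustrative:

```python
def brovey_pansharpen(r, g, b, pan):
    """Brovey-style merge: each colour band (already resampled to the
    pan grid) is rescaled so the pixel intensity matches the
    high-resolution panchromatic value; the band ratios, and hence
    the colour, stay unchanged while the spatial detail comes
    from the panchromatic channel."""
    out = []
    for ri, gi, bi, pi in zip(r, g, b, pan):
        intensity = (ri + gi + bi) / 3.0 or 1.0  # avoid division by zero
        scale = pi / intensity
        out.append((ri * scale, gi * scale, bi * scale))
    return out

# Two pixels whose colour bands are identical at the coarse GSD;
# only the panchromatic channel resolves the structure between them.
r, g, b = [90.0, 90.0], [60.0, 60.0], [30.0, 30.0]
pan = [60.0, 120.0]
print(brovey_pansharpen(r, g, b, pan))
```

Because the merge only rescales intensity, it behaves exactly as the text describes: stationary objects fuse cleanly, while a moving object imaged at slightly different times in pan and colour would leave the colour trailing the intensity.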