FUSION OF OPTICAL AND SAR SATELLITE DATA FOR IMPROVED LAND COVER MAPPING IN AGRICULTURAL AREAS

T. Riedel, C. Thiel, C. Schmullius
Friedrich-Schiller-University Jena, Institute of Geography, Earth Observation, Loebdergraben 32, D-07743 Jena, Germany, Email: [email protected]

ABSTRACT

The focus of this paper is the integration of optical and SAR data for improved land cover mapping in agricultural areas. The test site is located east of Nordhausen, Thuringia, Germany. From April to December 2005 Landsat-5 TM, ASAR APP and ERS-2 data were acquired continuously over the test site, building up a comprehensive time series. Regarding the fusion of optical and SAR data, the following three aspects are addressed in this paper. The value of different methodologies for the synergistic use of both data types is the subject of a first analysis. Multitemporal SAR images provide an important database for land cover and crop type mapping; this is demonstrated in the second part of the paper. Finally, a classification scheme for the generation of basic land cover maps using both optical and SAR data is presented. With respect to operational applications, the proposed procedure should have a high potential for automation.

1. INTRODUCTION

The availability of up-to-date and reliable land cover and crop type information is of great importance for many earth science applications. For operational applications the development of robust, transferable, semi-automated and automated approaches is of special interest. In regions with frequent cloud cover such as Central Europe the number of suitable optical acquisitions is often limited. The all-weather capability is one major advantage of SAR over optical systems. Furthermore, radar sensors provide information complementary to that contained in visible-infrared imagery. In the optical range of the electromagnetic spectrum the information depends on the reflective and emissive characteristics of the Earth surface, whereas the radar backscatter coefficient is primarily determined by structural and dielectric attributes of the surface target. The benefit of combining optical and SAR data for improved land cover mapping has been demonstrated in several studies [1, 2, 3]. Multisensor image data analyses are often performed without altering the digital numbers of the different data types. The term image fusion itself is defined as 'the combination of two or more different images to form a new image by using a certain algorithm' [4]. In general, the data fusion process can be performed on the pixel, feature or decision level. With the availability of multifrequency and high-resolution spaceborne SAR data, such as that provided by the TerraSAR-X and ALOS PALSAR missions, an increased interest in tools exploiting the full information content of both data types will arise.

The objective of this paper is a comparison of different image fusion techniques for optical and SAR data in order to improve the classification accuracy in agriculturally used areas. Furthermore, the potential of multitemporal SAR data for land cover and crop type mapping is demonstrated. In this context, optimal image parameters for the derivation of basic land cover classes are defined. Based on these findings (amongst other things), a processing chain for the automated generation of basic land cover products is introduced.

2. STUDY AREA AND EXPERIMENTAL DATA

The study area "Goldene Aue" is located east of Nordhausen at the southern border of the Harz mountain range, North Thuringia, Germany, and is characterized by intensive agricultural land use. The main crop types are winter wheat, rape, corn and winter barley (Fig. 1).

Figure 1. Location of the test site and crop type map from 2005

From April to December 2005 optical and SAR data were acquired continuously over the test site, building up a comprehensive time series. During the main growing season (April to late August) 2 Landsat-5 TM, 9 Envisat ASAR APP and 6 ERS-2 scenes were recorded (Fig. 2). On July 10 optical and SAR data were acquired nearly simultaneously, providing an excellent database for image fusion analysis. Parallel to each satellite overpass, extensive field data were collected, including crop type, growth stage and vegetation height.

Figure 2. EO database

3. METHODOLOGY

All EO data were pre-processed using widely established standard techniques. As parts of the test site are characterized by significant topography, the normalization procedure introduced by Stussi et al. [5] was applied to all SAR data. The pre-processing of the multispectral optical images included atmospheric correction and orthorectification using the freely accessible C-band SRTM DEM. In the context of multisource image data fusion, a critical pre-processing step is an accurate co-registration of all EO scenes used.

After pre-processing, different image fusion approaches were applied. In the literature, the following methods for the integration of optical and SAR data are commonly applied: combination of both data sets without an alteration of the original input channels, and image fusion on the pixel and decision level. In the framework of this study the potential of the first and the second data integration approach for land cover and crop type mapping is assessed. The pixel-based fusion techniques applied include IHS transformations, principal component analysis (PCA), a multiplicative approach and wavelet transformations. The benefit of these procedures was estimated by separability analyses and by comparing the classification accuracies achieved with a simple pixel-based maximum likelihood classification (MLK). The Jeffries-Matusita distance (JM), which is widely used in remote sensing to determine the statistical distance between two multivariate, Gaussian distributed signatures, was calculated. The JM distance varies between 0 and 1414, where 0 signifies no separability and 1414 a very high separability.
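For two classes with Gaussian signatures the JM distance can be computed from the class means and covariance matrices via the Bhattacharyya distance. The following minimal sketch assumes that the 0-1414 range quoted above corresponds to scaling the square-root form of the JM distance by 1000; the function name and the synthetic example data are illustrative and not taken from the study:

import numpy as np

def jeffries_matusita(mean1, cov1, mean2, cov2):
    """Jeffries-Matusita distance between two multivariate Gaussian
    class signatures, scaled to the 0-1414 range."""
    m = np.asarray(mean1) - np.asarray(mean2)
    c = 0.5 * (np.asarray(cov1) + np.asarray(cov2))
    # Bhattacharyya distance for Gaussian distributions
    b = (0.125 * m @ np.linalg.inv(c) @ m
         + 0.5 * np.log(np.linalg.det(c)
                        / np.sqrt(np.linalg.det(cov1) * np.linalg.det(cov2))))
    # JM lies in [0, sqrt(2)]; multiplying by 1000 gives the 0-1414 scale
    return 1000.0 * np.sqrt(2.0 * (1.0 - np.exp(-b)))

# Example with synthetic training samples for two crop classes
# (features could be stacked optical channels and/or SAR backscatter)
rng = np.random.default_rng(0)
wheat = rng.normal([0.12, 0.35, 0.20], 0.02, size=(500, 3))
rape = rng.normal([0.15, 0.45, 0.22], 0.02, size=(500, 3))
jm = jeffries_matusita(wheat.mean(0), np.cov(wheat.T),
                       rape.mean(0), np.cov(rape.T))
print(f"JM distance: {jm:.0f}")  # values close to 1414 indicate high separability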
The main objective of this study was to set up an automatic and transferable classification scheme for the derivation of basic land cover categories. The proposed processing chain, shown in Figure 3, is composed of three main stages. The first step comprises the segmentation of the optical EO data using the multiresolution segmentation approach [6] implemented in the eCognition software. Next, for each land cover class potential training sites are selected automatically on the basis of a decision tree. As the application of fixed thresholds sometimes fails, the threshold values for reflectance, backscattering coefficient, ratios and texture information specified in the nodes of the decision tree are adapted to each EO scene separately. To achieve this, an optimal set of characteristic image parameters was defined for each land cover type by systematically analyzing the available time series. Additionally, information reported in the literature and in libraries (e.g. the European Radar-Optical Research Assemblage library, ERA-ORA) was considered. By combining this expert knowledge about typical target characteristics (e.g. the low reflectance of water bodies in the near infrared) with histogram analyses, it is possible to derive scene-specific threshold values. In the third stage of the proposed classification scheme, the identified training sites are used as input for a supervised classification. In the framework of this study three classification techniques – nearest neighbour, fuzzy logic and a combined pixel-/object-based approach – were compared. In the latter case a pixel-based maximum likelihood classification was performed; the final land cover category assigned to each image object corresponds to the most frequent class per image segment. Postclassification procedures involve simple GIS analyses such as the recoding of island segments within residential areas.
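The combined pixel-/object-based step can be illustrated with a short sketch in which each segment receives the majority label of the per-pixel maximum likelihood result falling inside it. The array names and the toy data below are hypothetical; in the study the segments stem from the eCognition segmentation described above.

import numpy as np

def majority_label_per_segment(pixel_labels, segment_ids):
    """Assign to every segment the most frequent pixel-based class label.

    pixel_labels: 2-D array of class codes from the per-pixel classification
    segment_ids:  2-D array of the same shape with one integer ID per segment
    Returns a 2-D array in which all pixels of a segment share one class.
    """
    object_labels = np.empty_like(pixel_labels)
    for seg_id in np.unique(segment_ids):
        mask = segment_ids == seg_id
        classes, counts = np.unique(pixel_labels[mask], return_counts=True)
        object_labels[mask] = classes[np.argmax(counts)]  # majority vote
    return object_labels

# Toy example: a 4x4 scene with two segments and a noisy pixel classification
pixel_labels = np.array([[1, 1, 2, 2],
                         [1, 3, 2, 2],
                         [1, 1, 2, 1],
                         [1, 1, 2, 2]])
segment_ids = np.array([[0, 0, 1, 1],
                        [0, 0, 1, 1],
                        [0, 0, 1, 1],
                        [0, 0, 1, 1]])
print(majority_label_per_segment(pixel_labels, segment_ids))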
The thematic map accuracy of the final land cover products was assessed by calculating the confusion matrix and the kappa coefficient for fifty randomly distributed reference points per land cover category. The class membership of each reference target was specified on the basis of official land information GIS layers and field data.

Figure 3. Proposed processing chain

4. RESULTS

4.1 Pixel-based image fusion versus combination of optical and SAR data – a comparison

In the literature two methods for the integration of optical and SAR data are commonly applied. Image fusion products were generated from Landsat-5 TM (channels 3, 4 and 5) and HV-polarized ASAR APP data acquired nearly simultaneously on July 10, 2005. The results of the multiplicative approach and the wavelet transformations (tested filter functions: Haar, Daubechies, Coiflet, symmetric) are visually very similar to the original optical data. For other approaches, such as the PCA fusion product and the IHS transformation, the SAR information is more pronounced. To assess the benefit of these approaches for land cover and crop type mapping, separability analyses were performed. By combining the optical and SAR data without an alteration of the pixel values the separability rises significantly […] suitable tool to combine medium resolution optical and SAR data. Perhaps it will be appropriate for fusing images with different spatial resolutions, such as Envisat ASAR WSM and MERIS data.

4.2 Potential of multitemporal SAR data for land cover and crop type mapping

A further objective of this paper was to demonstrate the power of multitemporal SAR data for crop type mapping, as in Central Europe the number of available optical acquisitions is often limited due to frequent cloud cover. For example, over the Nordhausen test site only two cloudless Landsat-5 TM scenes were acquired between April and mid-August, i.e. during the main growing season. The first image was recorded on April 22 and the second on July 10. Both acquisition dates are not well suited for crop type mapping. In consequence, the achieved classification accuracies for the monotemporal optical data are not sufficient (Tab.