
REGIONAL CENTER FOR NORTHERN AFRICA, MIDDLE EAST AND EUROPE OF THE WMO SDS-WAS PROGRAMME

Evaluation of Dust Optical Depth Forecasts using AERONET observations

Authors: Enric Terradellas (AEMET) and Sara Basart (BSC-CNS)

Barcelona, November 2012

Contents

1 INTRODUCTION
2 MODELS
3 AERONET VALUES
4 EVALUATION METRICS
5 REFERENCES
APPENDIX A

1 Introduction

The Sand and Dust Storm Warning Advisory and Assessment System (SDS-WAS) is a programme of the World Meteorological Organization (WMO) with the mission to enhance the ability of countries to deliver timely and quality sand and dust storm forecasts, observations, information and knowledge to end users. The Regional Center for Northern Africa, Middle East and Europe (NAMEE), hosted by the Spanish State Meteorological Agency (AEMET) and the Barcelona Supercomputing Center (BSC-CNS), supports a network of research and operational partners implementing the objectives of the SDS-WAS programme in the region.

A system to routinely exchange forecast model products has been established as the basis for a model comparison that includes common near real-time (NRT) evaluation and generation of multi-model products.

In particular, a system to evaluate the performance of the different models has been set up. On a monthly, seasonal and annual basis, it yields evaluation scores computed from the comparison of the simulated dust optical depth (DOD) with direct-sun AOD observations from the AErosol RObotic NETwork (AERONET).

The geographical domain for the evaluation covers the main source areas of Northern Africa, Europe and the Middle East, as well as the main transport routes and deposition zones.

2 Models

At present, seven modelling systems provide daily dust forecast products (surface concentration and dust optical depth at 550 nm) for the reference area at 3-hourly intervals up to a lead time of 72 hours: BSC-DREAM8b (Pérez et al., 2006a; Basart et al., 2012), MACC-ECMWF (Morcrette et al., 2009; Benedetti et al., 2009), DREAM8-MACC (Nickovic et al., 2001), NMMB/BSC-Dust (Pérez et al., 2011), UM MetOffice (Woodward, 2001, 2011), NCEP/NGAC (Lu et al., 2010) and NASA/GEOS-5 (Colarco et al., 2010).

Model               Institution      Run time (UTC)   Contact
BSC-DREAM8b         BSC-CNS          12               J. M. Baldasano
CHIMERE             LMD              00               L. Menut
LMDzT-INCA          LSCE             00               M. Schulz
MACC-ECMWF          ECMWF            00               J.-J. Morcrette, A. Benedetti
DREAM8-NMME-MACC    SEEVCCC          12               G. Pejanovic
NMMB/BSC-Dust       BSC-CNS          12               J. M. Baldasano
UM                  U.K. MetOffice   00               D. Walters
NGAC                NCEP             00               S. Lu
GEOS-5              NASA             00               A. da Silva

The evaluation system is applied to instantaneous forecast values of DOD ranging from the initial day (D) at 15:00 UTC to the following day (D+1) at 12:00 UTC. This means that the lead times of the evaluated forecasts range from 15 to 36 hours for model runs starting at 00 UTC, but from 3 to 24 hours for model runs starting at 12 UTC.
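As a worked illustration of this lead-time arithmetic, the following sketch lists the evaluated lead times for both run times (the function name is an assumption made for the example, not part of the operational system):

```python
def evaluated_lead_times(run_hour_utc):
    """Lead times (hours) falling in the D 15:00 UTC to D+1 12:00 UTC window,
    in 3-hourly steps, for a model run starting at run_hour_utc on day D."""
    window_start = 15   # D 15:00 UTC, counted from 00 UTC of day D
    window_end = 36     # D+1 12:00 UTC, counted from 00 UTC of day D
    return [h - run_hour_utc for h in range(window_start, window_end + 1, 3)]

print(evaluated_lead_times(0))   # [15, 18, 21, 24, 27, 30, 33, 36]
print(evaluated_lead_times(12))  # [3, 6, 9, 12, 15, 18, 21, 24]
```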

Furthermore, a median multi-model is generated from the values of the different forecasts. To produce it, the model outputs are first bilinearly interpolated to a common 0.5° x 0.5° grid mesh. Median values ranging from the initial day at 15:00 UTC to the following day at 12:00 UTC are also evaluated.
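A minimal sketch of how such a median multi-model could be assembled, assuming each model field arrives as a regular latitude-longitude array; the domain bounds, variable names and helper functions below are illustrative assumptions, not the operational implementation:

```python
import numpy as np
from scipy.interpolate import RegularGridInterpolator

# Common 0.5 deg x 0.5 deg mesh (the domain bounds here are illustrative).
lat_c = np.arange(0.0, 65.5, 0.5)
lon_c = np.arange(-25.0, 60.5, 0.5)
lon2d, lat2d = np.meshgrid(lon_c, lat_c)
target = np.column_stack([lat2d.ravel(), lon2d.ravel()])

def to_common_grid(dod, lat, lon):
    """Bilinearly interpolate one model's DOD field onto the common mesh."""
    f = RegularGridInterpolator((lat, lon), dod, method="linear",
                                bounds_error=False, fill_value=np.nan)
    return f(target).reshape(lat2d.shape)

def median_multimodel(fields):
    """Cell-wise median over interpolated model fields, given as (dod, lat, lon) tuples."""
    stack = np.stack([to_common_grid(*f) for f in fields])
    return np.nanmedian(stack, axis=0)
```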

The evaluation system is built to easily incorporate new dust forecast models or multi-model products.

3 AERONET values

The forecasts of dust optical depth (DOD) are compared with the total AOD provided by the AERONET network for 42 selected dust-prone stations located around the Mediterranean Basin, the Iberian Peninsula, Northern Africa and the Middle East (see the figure below). They are described in detail in Appendix A.

Version 2, Level 1.5 AERONET products are used for the present NRT evaluation. Level 1.5 AERONET data are automatically cloud-screened but may not have final calibration applied; thus, these data are not quality-assured.

Since AERONET sun photometers do not yield AOD at 550 nm (AOD550), this variable is calculated from the AOD at 440, 675 and 870 nm (AOD440, AOD675, AOD870) and the 440-870 Ångström exponent (AE440_870) using the Ångström law:

$$\mathrm{AOD}_{550} = \frac{1}{3}\left[\mathrm{AOD}_{440}\left(\frac{440}{550}\right)^{AE_{440-870}} + \mathrm{AOD}_{675}\left(\frac{675}{550}\right)^{AE_{440-870}} + \mathrm{AOD}_{870}\left(\frac{870}{550}\right)^{AE_{440-870}}\right]$$

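A short sketch of this conversion, assuming the AODs and the Ångström exponent are available as plain numbers or NumPy arrays (the function name and example values are illustrative):

```python
import numpy as np

def aod550_from_aeronet(aod440, aod675, aod870, ae440_870):
    """Estimate AOD at 550 nm as the mean of three Angstrom-law extrapolations."""
    wavelengths = np.array([440.0, 675.0, 870.0])
    aods = np.array([aod440, aod675, aod870])
    return np.mean(aods * (wavelengths / 550.0) ** ae440_870)

# Illustrative values for a moderately dusty observation (not real data).
print(aod550_from_aeronet(0.45, 0.40, 0.38, 0.25))
```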
To minimize the sources of error, the comparison is restricted to situations in which mineral dust is the dominant aerosol type. This discrimination is made by discarding observations with a 440-870 Ångström exponent higher than 0.6 (Pérez et al., 2006b).

Rather than being interpolated in time, AERONET observations are assigned to the nearest multiple-of-3 hour. If more than one observation is assigned to the same hour, only the one closest in time is considered.
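The dust discrimination and the time assignment could be sketched as follows with pandas, assuming the observations arrive as a DataFrame with columns 'time', 'aod550' and 'ae440_870' (the column and function names are assumptions for this example):

```python
import pandas as pd

def select_dust_observations(obs):
    """Filter AERONET observations to dust cases and assign them to 3-hourly slots."""
    # Keep only dust-dominated cases (440-870 Angstrom exponent <= 0.6).
    dust = obs[obs["ae440_870"] <= 0.6].copy()

    # Assign each observation to the nearest multiple-of-3 hour.
    dust["slot"] = dust["time"].dt.round("3h")

    # If several observations fall in the same slot, keep the closest in time.
    dust["dist"] = (dust["time"] - dust["slot"]).abs()
    return (dust.sort_values("dist")
                .drop_duplicates(subset="slot", keep="first")
                .drop(columns="dist"))
```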

4 Evaluation metrics

The common metrics used to quantify the mean departure between modelled and observed quantities are the mean bias error (BE), the root mean square error (RMSE), the correlation coefficient (r) and the fractional gross error (FGE). Their definitions are given below, where ci and oi are the modelled and observed values at time and location i, respectively, and n is the number of data.

Mean Bias Error (BE): $BE = \frac{1}{n}\sum_{i=1}^{n}(c_i - o_i)$; range $-\infty$ to $+\infty$; perfect score 0.

Root Mean Square Error (RMSE): $RMSE = \sqrt{\frac{1}{n}\sum_{i=1}^{n}(c_i - o_i)^2}$; range 0 to $+\infty$; perfect score 0.

Correlation coefficient (r): $r = \frac{\sum_{i=1}^{n}(c_i - \bar{c})(o_i - \bar{o})}{\sqrt{\sum_{i=1}^{n}(c_i - \bar{c})^2}\,\sqrt{\sum_{i=1}^{n}(o_i - \bar{o})^2}}$; range -1 to 1; perfect score 1.

Fractional Gross Error (FGE): $FGE = \frac{2}{n}\sum_{i=1}^{n}\frac{|c_i - o_i|}{c_i + o_i}$; range 0 to 2; perfect score 0.

− The mean bias error (BE) captures the average deviation between the two datasets. It has the units of the variable. Values near 0 are best; negative values indicate underestimation and positive values indicate overestimation. Besides dust, other aerosol types (anthropogenic sources, biomass burning, etc.) may be present in the observations, so negative BE values can be expected in our results.

− The root mean square error (RMSE) combines the spread of individual errors. It is strongly dominated by the largest values, due to the squaring operation. Especially in cases where prominent outliers occur, the usefulness of RMSE is questionable and its interpretation becomes more difficult.
− The correlation coefficient (r) indicates the extent to which patterns in the model match those in the observations.
− The fractional gross error (FGE) is a measure of model error that ranges between 0 and 2 and behaves symmetrically with respect to under- and overestimation, without overemphasizing outliers.

These scores can be computed directly from the paired modelled and observed series, as sketched below.
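A minimal sketch of such a computation with NumPy (the function name and array handling are assumptions, not the operational code):

```python
import numpy as np

def evaluation_scores(c, o):
    """BE, RMSE, r and FGE for modelled (c) and observed (o) value pairs."""
    c = np.asarray(c, dtype=float)
    o = np.asarray(o, dtype=float)
    be = np.mean(c - o)                               # mean bias error
    rmse = np.sqrt(np.mean((c - o) ** 2))             # root mean square error
    r = np.corrcoef(c, o)[0, 1]                       # correlation coefficient
    fge = 2.0 * np.mean(np.abs(c - o) / (c + o))      # fractional gross error
    return {"BE": be, "RMSE": rmse, "r": r, "FGE": fge}
```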

The scores of each model and of the median multi-model are computed on a monthly, seasonal and annual basis for each AERONET site, for 3 selected regions (Sahel/Sahara, Middle East and Mediterranean; see Appendix A) and globally, considering all sites. It should be noted that scores for individual sites may have limited significance, since they are calculated from a small number of data.
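One possible way to aggregate the scores by model, region and period with pandas, reusing the evaluation_scores helper sketched above; the input layout and the column names ('model', 'region', 'month', 'dod', 'aod550') are assumptions for illustration:

```python
import pandas as pd

def aggregate_scores(pairs):
    """Scores per model, region and month, from one row per matched forecast/observation pair."""
    def _scores(group):
        s = evaluation_scores(group["dod"].values, group["aod550"].values)
        s["n"] = len(group)          # number of pairs behind each score
        return pd.Series(s)
    return pairs.groupby(["model", "region", "month"]).apply(_scores)
```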

5 References

Basart, S., C. Pérez, S. Nickovic, E. Cuevas, and J. M. Baldasano (2012), Development and evaluation of the BSC-DREAM8b dust regional model over Northern Africa, the Mediterranean and the Middle East, Tellus B, 64, 18539.

Benedetti, A., J.-J. Morcrette, O. Boucher, A. Dethof, R. J. Engelen, M. Fisher, H. Flentjes, N. Huneeus, L. Jones, J. W. Kaiser, S. Kinne, A. Mangold, M. Razinger, A. J. Simmons, M. Suttie, and the GEMS-AER team (2009), Aerosol analysis and forecast in the ECMWF Integrated Forecast System. Part II: Data assimilation, J. Geophys. Res., 114, D13205, doi:10.1029/2008JD011115.

Colarco, P. R., A. da Silva, M. Chin and T. Diehl (2010), Online simulations of global aerosol distributions in the NASA GEOS-4 model and comparisons to satellite and ground-based aerosol optical depth, J. Geophys. Res., doi:10.1029/2009JD012820.

Lu et al. (2010), Development of NCEP Global Aerosol Forecasting System: An overview and its application for improving weather and air quality forecasts, NATO Science for Peace and Security Series: Air Pollution Modeling and Its Application XX, 451-454, doi:10.1007/978-90-481-3812-8.

Morcrette, J.-J., O. Boucher, et al. (2009), Aerosol analysis and forecast in the ECMWF Integrated Forecast System. Part I: Forward modelling, J. Geophys. Res., 114, D06206.

Nickovic, S., G. Kallos, A. Papadopoulos and O. Kakaliagou (2001), A model for prediction of desert dust cycle in the atmosphere, J. Geophys. Res., 106, 18113-18129.

Pérez, C., S. Nickovic, G. Pejanovic, J. M. Baldasano and E. Ozsoy (2006a), Interactive dust-radiation modeling: A step to improve weather forecast, J. Geophys. Res., 111, D16206, doi:10.1029/2005JD006717.

Pérez, C., S. Nickovic, J. M. Baldasano, M. Sicard, F. Rocadenbosch, and V. E. Cachorro (2006b), A long Saharan dust event over the western Mediterranean: Lidar, sun photometer observations, and regional dust modeling, J. Geophys. Res., 111, D15214.

Pérez, C., K. Haustein, et al. (2011), Atmospheric dust modeling from meso to global scales with the online NMMB/BSC-Dust model, Part 1: Model description, annual simulations and evaluation, Atmos. Chem. Phys., 11, 13001-13027.

Woodward, S. (2001), Modeling the atmospheric life cycle and radiative impact of mineral dust in the Hadley Centre climate model, J. Geophys. Res., 106, D16, doi:10.1029/2000JD900795.

Woodward, S. (2011), Mineral dust in HadGEM2, Tech. Note 87, Hadley Cent., Met Office, Exeter, UK.

Appendix A. AERONET stations used in the evaluation

Site name             Longitude   Latitude   Country        Region
Avignon               4.88°E      43.93°N    France         Mediterranean
Banizoumbou           2.66°E      13.54°N    Niger          Sahel/Sahara
Barcelona             2.12°E      41.39°N    Spain          Mediterranean
Cabo_da_Roca          9.50°W      38.78°N    Portugal       Mediterranean
Caceres               6.34°W      39.48°N    Spain          Mediterranean
Cairo_EMA_2           31.29°E     30.08°N    Egypt          Mediterranean
Capo_Verde            22.93°W     16.73°N    Cape Verde     Sahel/Sahara
CUT-TEPAK             33.04°E     34.67°N    Cyprus         Mediterranean
Dakar                 16.96°W     14.39°N    Sénégal        Sahel/Sahara
Eilat                 34.92°E     29.50°N    Israel         Mediterranean
Ersa                  9.36°E      43.00°N    France         Mediterranean
ETNA                  15.02°E     37.61°N    Italy          Mediterranean
Evora                 7.91°W      38.57°N    Portugal       Mediterranean
FORTH_CRETE           25.28°E     35.33°N    Greece         Mediterranean
Granada               3.60°W      37.16°N    Spain          Mediterranean
IASBS                 48.51°E     36.70°N    Iran           Middle East
IER_Cinzana           5.93°W      13.28°N    Mali           Sahel/Sahara
Ilorin                4.34°E      8.32°N     Nigeria        Sahel/Sahara
IMAA_Potenza          15.72°E     40.60°N    Italy          Mediterranean
IMS-METU-ERDEMLI      34.26°E     36.56°N    Turkey         Mediterranean
KAUST_Campus          39.10°E     22.30°N    Saudi Arabia   Middle East
Kuwait University     47.97°E     29.33°N    Kuwait         Middle East
Lampedusa             12.63°E     35.52°N    Italy          Mediterranean
Lecce_University      18.11°E     40.33°N    Italy          Mediterranean
Oujda                 1.90°W      34.65°N    Morocco        Mediterranean
Ouarzazate            6.91°W      30.93°N    Morocco        Sahel/Sahara
Palma_de_Mallorca     2.62°E      39.55°N    Spain          Mediterranean
Porquerolles          6.16°E      43.00°N    France         Mediterranean
Rome_Tor_Vergata      12.65°E     41.84°N    Italy          Mediterranean
Saada                 8.16°W      31.62°N    Morocco        Sahel/Sahara
Santa_Cruz_Tenerife   16.25°W     28.47°N    Spain          Sahel/Sahara
SEDE_BOKER            34.78°E     30.85°N    Israel         Mediterranean
Seysses               1.26°E      43.50°N    France         Mediterranean
Solar_Village         46.40°E     24.91°N    Saudi Arabia   Middle East
Tabernas_PSA-DLR      2.36°W      37.09°N    Spain          Mediterranean
Tamanrasset_INM       5.53°E      22.79°N    Algeria        Sahel/Sahara
Villefranche          7.33°E      43.68°N    France         Mediterranean
Xanthi                24.92°E     41.15°N    Greece         Mediterranean
Zinder_Airport        8.99°E      13.78°N    Niger          Sahel/Sahara
Zouerate-Fennec       12.48°W     22.75°N    Mauritania     Sahel/Sahara

