Operational Procedures to Process, Screen and Collocate Radiosondes with NOAA Polar Satellite Observations

Reale and Tilley 2006

(Preliminary Draft)

1. INTRODUCTION

NESDIS operational systems for ATOVS sounding products utilize Online sub-systems to process orbital sounding products, and Offline sub-systems which provide support data sets and associated coefficients for product validation and tuning. Collocated radiosonde and satellite observations are a key dataset for the tuning and validation functions.

NESDIS Offline systems which compile collocated radiosonde and satellite observations (daily) include specialized preprocessing and QC of the radiosondes to meet requirements for product validation and tuning prior to collocation. This report summarizes a series of modifications to the radiosonde processing and QC techniques that have been developed and are currently under evaluation. Modifications include improved tests for analyzing the meteorological representation of radiosonde reports, particularly the moisture profiles.

An important aspect of these changes was changing the source of input radiosonde observations from the Office Note 29 (ON29) files to the new “PREPBUFR” files. Unlike ON29, the PREPBUFR files are routinely supported by NOAA/EMC and utilized in operational numerical weather prediction (NWP) assimilation. Additional benefits of transitioning to “PREPBUFR” files include having access to 1) observational QC markers (Appendix-1) determined during NWP assimilation, 2) updated and better documented observational corrections (i.e., due to radiative heating and cooling of the sensor during flight), and 3) ancillary information such as balloon drift and collocated NWP (temperature and moisture) data.

The following report discusses NESDIS approaches for processing and screening radiosonde data prior to collocation with satellite observations, including revised procedures to:
 integrate QC markers from PREPBUFR files,
 analyze moisture profiles,
 extrapolate moisture profiles to 200mb, and
 analyze temperature inversions.

The report also addresses NESDIS strategies to compile collocated radiosonde and satellite observations in support of (ATOVS) operational sounding product systems and concludes with results.

2. RADIOSONDE PROCESSING

2.1 Integration of Radiosonde QC Marks and “Tier-1” Tests

Tier-1 refers to the initial group of tests to determine whether a given radiosonde profile is suitable for collocation with satellite observations.

A strategy to utilize the QC marker values (Appendix-1) stored on the PREPBUFR file was developed based on internal analyses of radiosonde observations and consultations with appropriate NESDIS and EMC/NCEP staff. The respective QC marks for pressure (P), temperature (T) and dewpoint temperature (TD) data as stored for each mandatory and significant level are checked, with the following actions taken:
 accept data if the QC mark is 0, 1 or 2,
 additionally accept TD if above 300mb and the QC mark is 9 or 15,
 additionally accept data if at the first significant level and the QC mark is 8, 9 or 15,
 accept if a ship report and all QC marks for P are 15,
 otherwise reject the respective data.
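
The acceptance rules above can be summarized in a short routine. The following is a minimal sketch, not the operational code, assuming integer QC marks and simple boolean flags for ship reports and first significant levels (the function and argument names are illustrative):

```python
def accept_qc(qc_mark, parameter, pressure_mb, is_ship=False, is_first_sig=False):
    """Sketch of the Tier-1 QC-mark acceptance rules described above.

    qc_mark      -- integer PREPBUFR QC marker for the datum (Appendix-1)
    parameter    -- one of "P", "T", "TD"
    pressure_mb  -- report level pressure in mb
    is_ship      -- True for ship reports (pressure marks of 15 are tolerated)
    is_first_sig -- True if the datum is at the first significant level
    """
    if qc_mark in (0, 1, 2):                      # accept good/neutral marks
        return True
    if parameter == "TD" and pressure_mb < 300.0 and qc_mark in (9, 15):
        return True                               # TD above 300mb: 9 and 15 treated alike
    if is_first_sig and qc_mark in (8, 9, 15):    # first significant level exception
        return True
    if is_ship and parameter == "P" and qc_mark == 15:
        return True                               # per-datum form of the ship rule (all P marks 15)
    return False                                  # otherwise reject the datum
```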

The integration of the NCEP QC markers adds significant value to the NESDIS screening protocol. However, there is some degree of ambiguity concerning QC mark protocols and cases can be found in which accepted data are suspect and rejected data appear reasonable.

The most notable ambiguity concerns the occurrences of QC markers 9 and 15 for TD above the 300mb level. Appendix 2 indicates that all TD values above 300mb are assigned a QC mark of 9 (not used in NWP), but NESDIS internal analyses found that QC marks of both 9 and 15 were common. Analyses also cited numerous occurrences of these marks below 300mb, often in conjunction with relatively large differences from NWP and for which the radiosonde was not clearly suspect. The NESDIS approach of screening such data below 300mb (lower in the atmosphere) and accepting them above 300mb is conservative and designed to integrate with the subsequent TD gap analyses described later.

It has also been noted that selected ship reports contain observations for which the QC markers are predominantly 15, although the report data look reasonable.

A clarification concerning radiation corrected temperature is needed. The PREPBUFR file contains both observed and radiation corrected temperature observations. Radiation corrections are applied either a priori at the report site or by NCEP upon receipt, with the latter cases identified by differences between the observed (uncorrected) and radiation corrected temperature at a given level. There is no difference in the assigned QC marker for each. Unless indicated otherwise, the T profile refers to the radiation corrected temperature.

The remaining screening procedures integrate these results in a manner similar to what is currently done in support of NESDIS operational systems.

Screening can refer to specific data at a given level, the entire level, or the entire report. Screened data are not omitted from the output file; however, they are not used for tuning and validation. Screened profiles are not retained.

Any level (mandatory or significant) with a rejected P or a missing or rejected T value is screened.

Remaining T data are tested against climatology as defined per pressure level, 15-degree latitude belt and month (see Appendix 3). Levels containing T values which fall outside these limits are screened.

Levels containing a TD which exceeds the “non-radiation corrected” T value (if available) or for which the dewpoint depression (TDD) exceeds 50°C are screened.

Any significant level containing T data which deviates by 25°C or more from the standard level T immediately above and below is screened.

Any significant level TD data which deviates by 25°C or more from the standard level TD data immediately above and below is screened.

Radiosonde surface pressure (SP), T and TD parameters are separately defined using the Category Zero surface level report data. These respective parameters are set to missing for screened or otherwise unavailable Category Zero data.

Non-rejected levels and data are used to construct final merged arrays comprised of mandatory and significant level report data for P, T and TD, respectively. Merged profiles must satisfy minimal data completeness criteria for initial vertical extent and gaps.

The initial vertical extent required for T is from 500mb to 300mb with a minimum of three (3) levels reported. Reports not meeting these criteria are screened.

There are no initial vertical extent requirements for TD; however, a minimum of two (2) levels is required. TD is not considered above 200mb. Reports not meeting these criteria are screened.

Remaining merged profiles are tested for gaps. A gap is defined as an interval of missing (e.g., screened) data that “exceeds” a specified vertical increment. The specified vertical increments are the designated ( * ) standard layers indicated in

Appendix 2. Gap tests begin at the first designated standard level above the surface-most report pressure.

Gaps are separately analyzed for T and TD profiles.

For T, if a gap is observed below 300mb the report is screened. T gaps above 300mb result in the screening of all data above the gap, referred to as capping the profile (below the gap).

Remaining reports are checked for TD gaps. If a gap is observed, the TD profile is capped below the gap. The remaining TD profile must at least span the two lowest standard levels of the temperature profile otherwise the report is screened.

The lowest and highest report levels or final vertical extent for the remaining merged profiles of T and TD are stored on the output file.

2.2 Interpolation to ATOVS Levels

Acceptable merged reports are interpolated to the ATOVS pressure levels (Appendix-2). Reported standard and significant level data which exactly correspond to a given ATOVS pressure level PI are assigned as such. Otherwise, the two report levels in the merged array immediately above (I+) and below (I-) a given ATOVS level are interpolated to level (I) using equation (1):

TI = TI- + (dT/dP)I (DELP)     (1)

where: DELP = Ln (PI / PI-)
       (dT/dP)I = (TI+ - TI-) / Ln (PI+ / PI-)
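
A minimal sketch of the log-pressure interpolation of equation (1), assuming the two bracketing report levels have already been identified (names are illustrative):

```python
import math

def interp_to_level(p_target, p_below, t_below, p_above, t_above):
    """Interpolate T (or TD) to an ATOVS pressure level using equation (1).

    p_below/t_below -- report level immediately below the target (higher pressure)
    p_above/t_above -- report level immediately above the target (lower pressure)
    Pressures are in mb; temperature units are preserved.
    """
    dtdp = (t_above - t_below) / math.log(p_above / p_below)   # (dT/dP)I
    delp = math.log(p_target / p_below)                         # DELP
    return t_below + dtdp * delp

# Example: report levels at 700mb (-5.0) and 620mb (-11.0) bracketing the 670mb ATOVS level
t_670 = interp_to_level(670.0, 700.0, -5.0, 620.0, -11.0)
```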

Interpolated values using equation (1) are determined for T and TD (replace T with TD) for all ATOVS levels within the respective vertical extents, and for uncorrected T up to 200mb. The report levels used to construct the interpolated T and TD profiles are identical. The lowest and highest levels (vertical extent) of the interpolated T and TD profile, respectively, are stored on the output file.

The WVMR (g/kg) is also computed for each interpolated level (I) within the vertical extent of the TD profile using equation (2):

WVMRI = 622. (WVPI) / (PI - WVPI)     (2)

WVP is the observed vapor pressure over water and is estimated (Nordquist, 1973) using equation (3):

WVP = 10**(C1 - 1.3816E-7 (10**C2) + 8.1328E-3 (10**C3) - 2949.076/TD)     (3)

where: C1 = 23.832241 - 5.02808 (ALOG10 (TD)),
       C2 = 11.344 - 0.0303998 (TD),
       C3 = 3.49149 - 1302.8844 / (TD), and TD in Kelvin.

Special consideration when computing WVMR is needed for reports in which the uncorrected and radiation corrected T differ at a given level (PI) within the vertical extent of the TD profile. In such cases, the Relative Humidity (RH) based on the TD and uncorrected T is computed using equation (4):

RH = WVPTD / WVPSUT = WVPC / WVPSCT = WVMRC / WVMRSC     (4)

where: WVPTD is the uncorrected vapor pressure (3) using the report TD,
       WVPSUT is the saturation vapor pressure (3) for uncorrected T,
       WVPSCT is the saturation vapor pressure (3) for corrected T,
       WVPC is the corrected vapor pressure (4),
       WVMRSC is the saturation mixing ratio (2) using WVPSCT, and
       WVMRC is the corrected mixing ratio (4, or 2 using WVPC).

The protocol is that in cases when NCEP provides radiation corrections for T (i.e., for VIZ-B2), the reported TD is based on the RH and uncorrected T as measured by the radiosonde. The above procedure adjusts the WVMR (and WVP) to be consistent with RH (held constant) and corrected T.

Finally, TD is recomputed using equation (5):

TD = X * 238.3 / (17.2694 - X)     (5)

where: X = Ln (WVPC / 610.78)
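
The moisture computations of equations (2) through (5) can be sketched as below. This is illustrative only: it assumes the Nordquist (1973) expression returns vapor pressure in hPa (so it can be used directly in equation (2) with pressure in mb), and that the 610.78 constant in equation (5) is in Pa, so a factor of 100 is applied before the inversion; helper names are not part of the operational code:

```python
import math

def wvp_nordquist(td_k):
    """Vapor pressure (hPa) over water from equation (3); td_k in Kelvin."""
    c1 = 23.832241 - 5.02808 * math.log10(td_k)
    c2 = 11.344 - 0.0303998 * td_k
    c3 = 3.49149 - 1302.8844 / td_k
    return 10.0 ** (c1 - 1.3816e-7 * 10.0 ** c2 + 8.1328e-3 * 10.0 ** c3 - 2949.076 / td_k)

def wvmr(p_mb, wvp_hpa):
    """Water vapor mixing ratio (g/kg) from equation (2)."""
    return 622.0 * wvp_hpa / (p_mb - wvp_hpa)

def corrected_moisture(p_mb, td_k, t_uncorr_k, t_corr_k):
    """Adjust WVMR to the radiation corrected T holding RH constant, equations (4) and (5)."""
    rh = wvp_nordquist(td_k) / wvp_nordquist(t_uncorr_k)    # RH from TD and uncorrected T
    wvp_c = rh * wvp_nordquist(t_corr_k)                    # corrected vapor pressure (hPa)
    wvmr_c = wvmr(p_mb, wvp_c)                              # corrected mixing ratio (g/kg)
    x = math.log(wvp_c * 100.0 / 610.78)                    # equation (5); 610.78 assumed to be Pa
    td_c = x * 238.3 / (17.2694 - x)                        # recomputed TD in degrees C
    return wvmr_c, td_c
```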

Additional testing for superadiabatic interpolated layers is done using equation (6):

THETA = TI (PI+1 / PI)**.286     (6)

where: TI is the temperature at a lower interpolated level (I), P is pressure, and THETA is the dry adiabatic expanded temperature for level (I+1).

Testing is done which compares THETA and TI+1 for each interpolated level from the surface up to 50mb. The superadiabatic flag for each report is defined as follows:

 0: .25 > (THETA - TI+1 ) for all interpolated layers,

 1: .25 < (THETA - TI+1 ) < 1.5 for one layer,

 2: 1.5 < (THETA - TI+1 ) or two superadiabatic layers observed.

All accepted and rejected report data are retained on pre-processed radiosonde report files along with associated flags based on each of the Tier-1 tests described above. This file is overwritten after 96 hours.

Only accepted reports (including superadiabatic) contain interpolated T, Tdew and WVMR data. These profiles undergo the additional Tier-2 and Tier-3 tests described below and are candidates for collocation with satellite observations (Section 3).

2.3 Moisture Profiles and the Tier-2 Tests

2.3.1 Overview

Radiosonde moisture profiles can be problematic. Some problems are systematic, for example, on the order of a few percent and correlated with sensor manufacturer design. These problems are essentially out of the scope of automated QC other than blanket screening based on the radiosonde instrument type. Such screening is best done after collocation with satellite observations to facilitate performance validation.

Other problems are not systematic and are more related to the ambient conditions and/or instrument performance during ascent. Examples include the highly irregular measurements that can result from sensor malfunctions, for example, due to abrupt wetting and drying in precipitating, multi-level cloud environments. In general, any report exhibiting vertical features (real or imaginary) that are not resolvable by the satellite (moisture) sensor tends to be problematic with respect to product tuning and validation. The net impact of such profiles is increased noise leading to reduced product and validation integrity.

Another notable problem is the tendency for mid-latitude-polar sondes to report a constant TD depression (TDD) at and above the tropopause. The integrity of these profiles is doubtful, and the significant number of occurrences is enough to introduce a moist bias in upper level moisture products via tuning, and at the same time the perception of a dry bias in these same products via statistical validation.

The objective of the moisture screening tests described below, referred to as Tier-2 tests, is to flag radiosonde moisture profiles with the above characteristics for possible screening with respect to product tuning and/or validation. The overall procedure is sub-divided into three parts:
 Part-1 tests from the surface to 300mb,
 Part-2 tests above 400mb, and
 upward extrapolation to 200mb as needed.

Input data for Tier-2 tests are the interpolated radiosonde report data.

2.3.2 Tier-2 Tests Part 1: Surface to 300mb

The underlying principle of the moisture testing approach is that in general the water vapor mixing ratio (WVMR) monotonically decreases with height. Profiles which feature non-monotonic/abrupt changes in WVMR with height are identified, tested further and flagged as described below. Examples are shown in Section 4.2.

The basic approach is to systematically test the WVMR values at successive interpolated (ATOVS) levels (Appendix-2) from the lowest to highest report level up to 300mb.

If a given WVMR value increases with height:
1. initiate an anomaly-test sequence by defining the preceding (i.e., last decreasing) WVMR value/level as MR0,
2. define a threshold limit based on the MR0 value and layer (see below),
3. test the current (given) WVMR:
   o if WVMR exceeds the threshold and there is no temperature inversion relative to level MR0, then denote an event
   o similarly test the WVMR at successive level(s):
     . re-compute the threshold (see below) and, if exceeded and there is no temperature inversion relative to level MR0, denote an event
     . end the sequence when a current WVMR is less than MR0
4. if a subsequent WVMR increases with height go to 1
5. end at 300mb

An anomaly-test sequence with single or multiple events is denoted as a single event.

A test-sequence consisting of seven or more levels (including inversions) is denoted as an extended event.

The threshold value for denoting an event in a given test-sequence is based on the MR0 value. Thresholds are based on a nominal 10% increase for values ranging from 1 to 10 g/kg over a vertical increment of 920mb to 850mb (about 2/3 km). Therefore, the nominal threshold for an event from 920mb to 850mb, given an MR0 at 920mb within this range, is given by equation (7):

Threshold = MR0 + .10 (MR0)     (7)

Thresholds for MR0 values within this range but for different layers are normalized to Ln (P920 / P850). Table-1 provides Ln (Pi / P) values for successive ATOVS layers from 1000mb to 300mb and the corresponding normalization factors N(i). Normalized thresholds for MR0 values at level (i) that are in the range of 1 to 10 g/kg are computed as shown in equation (8):

Threshold = MR0 + .10 (MR0) (SUM N(i))     (8)

where: SUM N(i) is the sum of N(i) from MR0 to the current WVMR

Pressure (Pi / P)    Ln (Pi / P)    Normalization N(i)

1000 / 950           .051           .64
950 / 920            .032           .41
920 / 850            .079           1.00
850 / 780            .086           1.09
780 / 700            .108           1.36
700 / 670            .044           .56
670 / 620            .078           .98
620 / 570            .084           1.06
570 / 500            .131           1.65
500 / 475            .051           .64
475 / 430            .100           1.27
430 / 400            .072           .91
400 / 350            .134           1.69
350 / 300            .154           1.95

Table 1: Ln (Pi / P) values for successive ATOVS layers from 1000mb to 300mb and respective normalization factors to the 920 / 850 thickness (approximately 2/3 km).

Thresholds for MR0 values outside the range of 1 to 10 g/kg are additionally adjusted to prevent them from becoming excessively large or small using equations (9) through (14):

MR0 greater than 10 g/kg:
Threshold = MR0 + (1.00 + .050 (MR0 - 10.00)) SUM N(i)     (9)

MR0 greater than 20 g/kg:
Threshold = MR0 + (1.50 + .025 (MR0 - 20.00)) SUM N(i)     (10)

MR0 from .99 to .10 g/kg:
Threshold = MR0 + (.10 + .05 (1.00 - MR0)/.90) (MR0) SUM N(i)     (11)

MR0 from .099 to .010 g/kg:
Threshold = MR0 + (.15 + .05 (.100 - MR0)/.090) (MR0) SUM N(i)     (12)

MR0 from .0099 to .0001 g/kg:
Threshold = MR0 + (.20 + .05 (.0100 - MR0)/.0090) (MR0) SUM N(i)     (13)

MR0 from .00099 to .00001 g/kg:
Threshold = MR0 + (.25 + .05 (.0010 - MR0)/.00090) (MR0) SUM N(i)     (14)
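
The normalized thresholds of equations (8) through (14) can be collapsed into a single function of MR0 and the summed Table-1 normalization factors. This is a sketch only; because the ranges quoted above overlap slightly, they are applied here in descending order of MR0 (names are illustrative):

```python
def event_threshold(mr0, sum_n):
    """Normalized WVMR-increase threshold (g/kg), equations (8) through (14).

    mr0   -- the last decreasing mixing ratio value (g/kg)
    sum_n -- the sum of Table-1 normalization factors N(i) from the MR0 level
             to the level being tested
    """
    if mr0 > 20.0:                                    # equation (10)
        return mr0 + (1.50 + 0.025 * (mr0 - 20.0)) * sum_n
    if mr0 > 10.0:                                    # equation (9)
        return mr0 + (1.00 + 0.050 * (mr0 - 10.0)) * sum_n
    if mr0 >= 1.0:                                    # equation (8): 1 to 10 g/kg
        return mr0 + 0.10 * mr0 * sum_n
    if mr0 >= 0.10:                                   # equation (11)
        return mr0 + (0.10 + 0.05 * (1.00 - mr0) / 0.90) * mr0 * sum_n
    if mr0 >= 0.010:                                  # equation (12)
        return mr0 + (0.15 + 0.05 * (0.100 - mr0) / 0.090) * mr0 * sum_n
    if mr0 >= 0.0010:                                 # equation (13)
        return mr0 + (0.20 + 0.05 * (0.0100 - mr0) / 0.0090) * mr0 * sum_n
    return mr0 + (0.25 + 0.05 * (0.0010 - mr0) / 0.00090) * mr0 * sum_n   # equation (14)
```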

An event is also denoted if the WVMR over consecutive interpolated levels abruptly decreases. This amount is defined as 80% over the 920mb to 850mb layer. The normalized threshold for such events is computed using equation (15):

Threshold = WVMR(i) - .80 (WVMR(i)) (N(i))     (15)

where WVMR(i) is from the lower level.

An event is denoted if the WVMR for the upper level (i +1) is less than Threshold.

Thresholds for such events are not adjusted based on WVMR range. Multiple occurrences over three or more consecutive levels (unlikely) are counted as a single event. Multiple occurrences over separated layers are counted as multiple events.

The flagging scheme for Tier-2, Part-1 tests is:
 0; no events,
 1; one event,
 2; an extended event, and
 3; two or more events.
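
Putting the pieces together, the Part-1 scan can be sketched as below. This is a simplified illustration of the logic in Section 2.3.2, not the operational code: it assumes a complete interpolated profile on the ATOVS levels up to 300mb, uses only the basic equation (8) threshold (the range adjustments are sketched above), reduces the inversion test to a simple temperature comparison, and omits some bookkeeping such as the multi-level decrease rule:

```python
# Table-1 normalization factors N(i) for successive ATOVS layers from 1000mb up to 300mb
N = [0.64, 0.41, 1.00, 1.09, 1.36, 0.56, 0.98, 1.06, 1.65, 0.64, 1.27, 0.91, 1.69, 1.95]
LEVELS = [1000, 950, 920, 850, 780, 700, 670, 620, 570, 500, 475, 430, 400, 350, 300]

def part1_flag(wvmr, temp):
    """Simplified Tier-2 Part-1 scan; wvmr/temp are values at LEVELS, surface upward.

    Returns 0 (no events), 1 (one event), 2 (an extended event), 3 (two or more events).
    """
    def threshold(mr0, sum_n):            # equation (8); range adjustments (9)-(14) sketched above
        return mr0 + 0.10 * mr0 * sum_n

    events, extended = 0, False
    i = 1
    while i < len(wvmr):
        if wvmr[i] <= wvmr[i - 1]:
            # abrupt decrease test, equation (15)
            if wvmr[i] < wvmr[i - 1] - 0.80 * wvmr[i - 1] * N[i - 1]:
                events += 1
            i += 1
            continue
        # WVMR increased with height: start an anomaly-test sequence anchored at MR0
        anchor, mr0, hit = i - 1, wvmr[i - 1], False
        j = i
        while j < len(wvmr) and wvmr[j] >= mr0:     # sequence ends when WVMR drops below MR0
            sum_n = sum(N[anchor:j])
            no_inversion = temp[j] <= temp[anchor]  # simplified inversion check vs. the MR0 level
            if wvmr[j] > threshold(mr0, sum_n) and no_inversion:
                hit = True
            j += 1
        if hit:
            events += 1                             # a whole sequence counts as a single event
        if j - anchor >= 7:
            extended = True                         # seven or more levels in the sequence
        i = max(j, i + 1)

    if events == 0 and not extended:
        return 0
    if events >= 2:
        return 3
    return 2 if extended else 1
```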

Reports with flags of 2 and 3 are identified as candidates for screening.

2.3.3 Tier-2 Tests Part 2: Above 300mb

As mentioned in Section 2.3.1, a number of radiosondes from mid-latitude-polar regions tend to report a constant dewpoint depression (TDD) at and above the tropopause, particularly during winter. This results in a significant number of profiles with increasing WVMR with height at these levels. The integrity of these profiles is doubtful, and the significant number of occurrences is enough to introduce a moist bias in upper level moisture products via tuning, and at the same time the perception of a dry bias in these same products via statistical validation.

The three panels of Figure 1 show examples of observed moisture structures in the vicinity of the tropopause for mid-latitude wintertime cases (1084, 1213, 1246).

The upper case shows the tendency for constant TDD and increasing WVMR with height. The middle and lower panels show what would appear to be more realistic cases for which the WVMR decreases with height. It is of interest to note that the instrument type for the upper panel is VIZ-B2 (Carbon element), and for the middle and lower panels is Vaisala RS80 (Thin-film); preliminary investigation suggests some association with instrument type.

The motivation for the Part-2 tests was mainly to identify cases such as shown in the upper panel of Figure 1, and, as feasible, to provide adjusted observations for possible use in product tuning that are more consistent with those shown in the middle and lower panels of Figure 1.

The Part-2 test is skipped if the Top-most interpolated level (PT) did not reach at least 350mb. Otherwise, the interpolated mixing ratio at the topmost level (MRT) is compared to its projected power law mixing ratio (POWMRT) based on the interpolated mixing ratio at 400mb using equation (16):

POWMRT = ((PT)**3 / (400)**3) MR400     (16)

If (POWMRT) is greater than or equal to (MRT), no further testing is done. Otherwise, testing continues to identify a level (PL) above which “all” the interpolated mixing ratio values (MRI) exceed their projected power law value (POWMRI), each computed from the previous level (PI-1) using equation (17):

POWMRI = ((PI)**3 / (PI-1)**3) MRI-1     (17)

where: PI ranges from 350mb to PT

If no such level (PL) can be found, the profile is not adjusted. If a level (PL) is found, then each interpolated mixing ratio (MRI) from level (PL+1) to (PT) is replaced with POWMRLI (based on PL) using equation (18):

POWMRLI = ((PI)**3 / (PL)**3) MRL     (18)

where: subscript I ranges from ATOVS level (L+1) to T.
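
A sketch of the Part-2 test and replacement, equations (16) through (18), assuming an interpolated mixing ratio profile on the ATOVS levels from 400mb up to the topmost level PT, and assuming the lowest qualifying level is taken as PL (names are illustrative):

```python
def part2_replace(pressures_mb, mixr):
    """Apply the above-300mb power-law test; return (adjusted profile, replacement flag).

    pressures_mb -- ATOVS pressures from 400mb up to the topmost level PT, decreasing
    mixr         -- interpolated mixing ratios (g/kg) at those pressures
    """
    pt, mrt = pressures_mb[-1], mixr[-1]
    if pt > 350.0:
        return list(mixr), 0                       # profile does not reach 350mb; test skipped
    # Equation (16): projected power-law value at PT from the 400mb mixing ratio
    if (pt / pressures_mb[0]) ** 3 * mixr[0] >= mrt:
        return list(mixr), 0                       # no further testing needed
    # Equation (17): find a level PL above which every MR exceeds its projected value
    level_l = None
    for l in range(len(pressures_mb) - 1):         # the lowest qualifying level is chosen here
        ok = all(
            mixr[i] > (pressures_mb[i] / pressures_mb[i - 1]) ** 3 * mixr[i - 1]
            for i in range(l + 1, len(pressures_mb))
        )
        if ok:
            level_l = l
            break
    if level_l is None:
        return list(mixr), 0                       # no such level; profile left unadjusted
    # Equation (18): replace MR from level L+1 to PT with the power law projected from PL
    adjusted = list(mixr)
    for i in range(level_l + 1, len(pressures_mb)):
        adjusted[i] = (pressures_mb[i] / pressures_mb[level_l]) ** 3 * mixr[level_l]
    return adjusted, 1
```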

The flagging scheme for Tier-2, Part-2 tests is:
 0; no replacement
 1; replacement occurred

Interpolated levels containing replacement values from Tier-2, Part-2 tests are noted on the output file.

In general, Part-2 tests as described above would result in the upper profile of Figure 1 being replaced with projected power law values (18) beginning at 250mb, with the middle and lower profiles unaltered.

Figure 1: Skew-T Log P plots of observed moisture (dashed) profiles in the vicinity of the tropopause for mid-latitude wintertime cases for which the upper profile would be replaced via Tier-2 part-2 tests, with the middle and lower profiles unaltered. Top: VIZ-B2, Site 72451, 38N, 00W; Middle: Vaisala RS80 57H, Site 72440, 37N, 93W; Bottom: Vaisala RS80 57H, Site 72357, 35N, 97W; all 2/16/05, 0Z.

2.3.4 Borderline Tier-2 Case

Figure 2 provides a borderline case of combined Part-1 and Part-2 results.

Concerning Part-1, the WVMR decreases by 72% from 620mb to 570mb; the normalized threshold for an event (15) is 85%. Had it exceeded the threshold, it would have been a candidate for screening, given the obvious event (12) from 350mb to 300mb.

Figure 2: Skew-T Log P plot of observed moisture (dashed) profile (tropical) not adjusted by Tier-2 part-2 tests even though the projected (400mb) power law WVMR is exceeded by the observed WVMR at 200mb. VIZ Type A (pressure, USA), 48568, 7.2N, 100.6E, 2/15/0. The 620mb to 200mb WVMR values are (1.16, .32, .16, .11, .051, .029, .017, .21, .16, .026).

Concerning Part-2, the topmost level was 200mb and POWMRT (16) was less than (MRT), however, no level L (17) was found therefore no replacement was done. Interestingly, if this profile had a topmost level of 300mb or 250mb, then a level L would have been defined at 350mb, and the observed WVMR values from 350mb to the topmost level would have been replaced (18).

It can be argued whether cases as in Figure 2 should be screened and/or adjusted. In the former it is a matter of setting thresholds. In the latter, this case is less typical, with the bulk of the Part-2 tests mainly adjusting profiles as shown in the upper panel of Figure 1.

This replaces the previous procedure: (If an upper level mixing ratio is found to increase, the preceding level (or layer) is used as the basis for re-computing the upper level value using the lowest mixing ratio from among the following three methods:
 Constant dewpoint depression,
 Extrapolated dewpoint temperature lapse rate, or
 Power law extrapolation.

The extrapolated lapse rate (computed from the preceding layer) is not allowed to exceed adiabatic cooling rate. If not initially selected, the extrapolated lapse rate option is removed from consideration for successive levels. The power law relates mixing ratio to the pressure cubed. Replacement values are only provided at levels containing original report data.

Once a mixing ratio value at a given level is replaced, all remaining interpolated levels above the given level are similarly checked and replaced. However, if at a given level the candidate replacement value exceeds the original (i.e., interpolated) mixing-ratio, all report values modified in the Part-2 test are restored.)

2.3.5 Upward Extrapolation

A number of radiosondes do not report moisture up to 200mb. For example, on a typical day approximately 1600 radiosondes pass the Tier-1 tests; however, the distribution of moisture profile vertical extent yields the following:
 38 do not reach 500mb
 94 do not reach 400mb
 200 do not reach 350mb
 259 do not reach 300mb
 400 do not reach 250mb
 500 do not reach 200mb

As can be seen, almost one-third of the reports are not complete to the 200mb level as required for product derivation, tuning and (most) radiative transfer model calculations. Extrapolation of data (fabrication) to the 200mb level is done to meet these requirements and is not intended for use outside these areas (i.e., validation).

The upward extrapolation of WVMR has two parts. Part-1 attempts to use the highest reported level for moisture, and depends upon the pressure of the “Top-most” report level, typically the highest significant level (PSIGT), the two uppermost interpolated levels (PT-1) and (PT), and the first extrapolated level (PT+1). If no moisture data is available above (PT), Part-1 is skipped; moisture data above 200mb are not considered.

Otherwise, the TD at the first extrapolated level (PT+1) is estimated as a weighted average of its lapse rate over the uppermost significant and interpolated layers using equation (19):

TDT+1 = TDT + (SCALE (dTD/dP)SIGT + (1-SCALE) (dTD/dP)T) (DELP)     (19)

where: SCALE = Ln (PT / PSIGT) / Ln (PT / PT+1)
       DELP = Ln (PT+1 / PT)
       (dTD/dP)SIGT = (TDSIGT - TDT) / Ln (PSIGT / PT)
       (dTD/dP)T = (TDT - TDT-1) / Ln (PT / PT-1)

Ln indicates the natural log. If SCALE is less than the value (.25) this step is skipped.

Part-2 begins at the top-most pressure level as defined in Part-1. Three candidate methods for filling the next level are attempted:
 Constant TDD,
 Extrapolated TD, or
 Power law.

The constant TDD option uses the TDD at the top-most level to compute TD at subsequent extrapolated levels (PT+1).

The extrapolated TD option is similar to Step-1 but without the scaling. The TD at the two “Top-most” interpolated levels (PT-1) and (PT) are used to extrapolate TD at the first extrapolated level (PT+1) using equation (20):

TDT+1 = TDT + (dTD/dP)T (DELP)     (20)

where: DELP = Ln (PT+1 / PT)
       (dTD/dP)T = (TDT - TDT-1) / Ln (PT / PT-1)

A check is also done to ensure that TD (for this option) does not cool at a rate that is greater than dry adiabatic, using equation (21):

K = Ln (TDT+1 / TDT) / Ln (PT+1 / PT)     (21)

If the value K from equation (21) exceeds .286 (the adiabatic limit), then the extrapolated TD is computed using equation (22):

TDT+1 = TDT (PT+1 / PT)**.286     (22)

The Power Law option uses the WVMR at the Top-most level (PT) to project the POWMR value at the first extrapolated level (PT+1) using equation (23):

POWMRT+1 = ((PT+1)**3 / (PT)**3) MRT     (23)

The lowest mixing ratio among the three candidate values is selected. The method continues with successive extrapolated levels serving as the top-most level for subsequent extrapolations upwards to 200mb. If initially selected, the “extrapolated TD” option is only allowed for up to three successive levels, and if not initially selected is removed from further consideration. All pressure levels containing extrapolated TD data are noted on the output file.
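
One step of the upward moisture extrapolation, equations (20) through (23), can be sketched as below: the three candidate dewpoints are converted to mixing ratios and the driest is selected. Equation (3) is reused for the vapor pressure and is assumed to return hPa; the extrapolated temperature at the new level is passed in for the constant-TDD option (names are illustrative):

```python
import math

def wvp_hpa(td_k):
    """Vapor pressure (hPa) over water, equation (3) (Nordquist, 1973); td_k in Kelvin."""
    c1 = 23.832241 - 5.02808 * math.log10(td_k)
    c2 = 11.344 - 0.0303998 * td_k
    c3 = 3.49149 - 1302.8844 / td_k
    return 10.0 ** (c1 - 1.3816e-7 * 10.0 ** c2 + 8.1328e-3 * 10.0 ** c3 - 2949.076 / td_k)

def mixing_ratio(p_mb, td_k):
    """WVMR (g/kg), equation (2)."""
    e = wvp_hpa(td_k)
    return 622.0 * e / (p_mb - e)

def extrapolate_one_level(p_next, p_top, p_below, t_top_k, t_next_k,
                          td_top_k, td_below_k, allow_td_option=True):
    """One upward moisture extrapolation step to p_next (mb); returns (WVMR g/kg, option name).

    p_top, td_top_k, t_top_k -- current top-most level, its dewpoint and temperature (K)
    p_below, td_below_k      -- the interpolated level beneath the top (for equation (20))
    t_next_k                 -- temperature at p_next (needed for the constant-TDD option)
    allow_td_option          -- False once the extrapolated-TD option has been dropped
    """
    candidates = {}

    # Constant TDD option: hold the top-most dewpoint depression
    candidates["constant_tdd"] = mixing_ratio(p_next, t_next_k - (t_top_k - td_top_k))

    # Extrapolated TD option, equation (20), limited to the dry adiabatic rate, (21)-(22)
    if allow_td_option:
        td_next = td_top_k + (td_top_k - td_below_k) / math.log(p_top / p_below) \
                           * math.log(p_next / p_top)
        k = math.log(td_next / td_top_k) / math.log(p_next / p_top)
        if k > 0.286:                                            # steeper than dry adiabatic
            td_next = td_top_k * (p_next / p_top) ** 0.286       # equation (22)
        candidates["extrapolated_td"] = mixing_ratio(p_next, td_next)

    # Power law option, equation (23), projected directly in mixing-ratio space
    candidates["power_law"] = (p_next / p_top) ** 3 * mixing_ratio(p_top, td_top_k)

    option = min(candidates, key=candidates.get)                 # lowest mixing ratio wins
    return candidates[option], option
```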

2.4 Radiosonde Temperature Inversions: Tier-3 Tests

Similar to moisture profiles for which the vertical resolution exceeds the vertical sensitivity of the sounder, temperature profiles with inversions (increasing temperature with height) present meteorological conditions for which sounder measurements become ambiguous.

Therefore, a group of tests was developed to identify situations with temperature inversions for possible screening with respect to product validation and in particular for product tuning.

The technique segregates two types of temperature inversions, those emanating from the surface and those residing aloft. Checks are done for all ATOVS levels from the surface to 300mb.

All profiles with temperature inversions (and isothermal layers) are identified.

Profiles exhibiting a surface inversion which exceeds 9K (10K?), an inversion aloft which exceeds 4K (5K?), any inversion (regardless of magnitude) which spans greater than three (3) successive standard layers, or multiple inversions are specially denoted. The flagging convention for inversions is:
 0; no inversions
 1; single inversion less than threshold
 2; multiple inversions less than threshold
 3; inversion(s) greater than threshold(s)

2.5 Other Extrapolations

Other extrapolations of radiosonde temperature and moisture profiles that are needed to facilitate their use in satellite sounding product applications include:
 downward extrapolations of T and TD to 1030mb
 upward extrapolation of T to .1mb

Again, it must be emphasized that all extrapolations discussed in this report are necessitated by requirements within existing satellite product scientific algorithms. For example, radiative transfer models require moisture and temperature information from the bottom to the top of the atmosphere, and current regression schemes require data at all sounding levels. Such data are not intended for use outside these areas, and are addressed in this document for completeness only.

The downward extrapolation of T is done in two steps. The first step depends upon the pressure of the lowest report level (PSIGB), typically the first significant level, the two “Bottom-most” interpolated levels (PB+1) and (PB), and the first extrapolated level (PB-1). In the rare case that PS exceeds 1030mb, no extrapolation is done.

Otherwise, the T at the first extrapolated level is estimated using equation (24):

TB-1 = TB + (SCALE (dT/dP)SIGB + (1-SCALE) (dT/dP)B) (DELP)     (24)

where: SCALE = Ln (PSIGB / PB) / Ln (PB-1 / PB)
       DELP = Ln (PB-1 / PB)
       (dT/dP)SIGB = (TSIGB - TB) / Ln (PSIGB / PB)
       (dT/dP)B = (TB - TB+1) / Ln (PB / PB+1)

Ln indicates the natural log. If SCALE is less than the value (.25) then SCALE is set to zero, meaning that first significant level data is not used in downward extrapolation. Otherwise, a weighted average (SCALE) of dT/dP as defined in (24) is used.

The second and subsequent extrapolations continue down to 1030mb using the scaled dT/dP from (24).

The downward extrapolation of TD is done after the T extrapolation is completed. The TD extrapolation is based on a constant TDD which is computed using the SCALE from (24) as shown in equation (25):

TDDB-1 = SCALE (TDDSIGB) + (1-SCALE) (TDDB)     (25)

TDDB-1 is used to compute the remaining TD profile based on the extrapolated T profile from (24).
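
A sketch of the downward T and TD extrapolation, equations (24) and (25), assuming the bottom-most interpolated levels and the lowest report level are available (names are illustrative):

```python
import math

def extrapolate_downward(p_sigb, t_sigb, tdd_sigb, p_b, t_b, tdd_b, p_b1, t_b1, p_out):
    """Downward T and TD extrapolation, equations (24) and (25).

    p_sigb, t_sigb, tdd_sigb -- lowest report (significant) level: pressure, T, dewpoint depression
    p_b, t_b, tdd_b          -- bottom-most interpolated level
    p_b1, t_b1               -- next interpolated level above the bottom-most one
    p_out                    -- pressures (mb) below p_b, ending at 1030mb
    Returns parallel lists of extrapolated T and TD at p_out.
    """
    scale = math.log(p_sigb / p_b) / math.log(p_out[0] / p_b)     # weight for the first step
    if scale < 0.25:
        scale = 0.0                                               # significant level not used
    dtdp = scale * (t_sigb - t_b) / math.log(p_sigb / p_b) \
         + (1.0 - scale) * (t_b - t_b1) / math.log(p_b / p_b1)    # scaled slope from (24)
    tdd = scale * tdd_sigb + (1.0 - scale) * tdd_b                # constant TDD from (25)

    t_out, td_out = [], []
    for p in p_out:
        t = t_b + dtdp * math.log(p / p_b)                        # equation (24) applied downward
        t_out.append(t)
        td_out.append(t - tdd)                                    # TD from the constant TDD
    return t_out, td_out
```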

The upward extrapolation of temperature is also done in two steps. The first step depends upon the pressure of the “Top-most” report level (PSIGT), typically the highest significant level, the two uppermost interpolated levels (PT-1) and (PT), and the first extrapolated level (PT+1). If no significant level is available above (PT), step 1 is skipped.

Otherwise, the temperature at the first extrapolated level is estimated using equation (26):

TT+1 = TT + (SCALE (dT/dP)SIGT + (1-SCALE) (dT/dP)T) (DELP)     (26)

where: SCALE = Ln (PSIGT / PT+1) / Ln (PT / PT+1)
       DELP = Ln (PT+1 / PT)
       (dT/dP)SIGT = (TSIGT - TT) / Ln (PSIGT / PT)
       (dT/dP)T = (TT - TT-1) / Ln (PT / PT-1)

Ln indicates the natural log. If SCALE is less than the value (.25) this step is skipped.

Step 2 of the upward extrapolation provides the T profile upwards to .1mb. Since there are rarely radiosonde observations at these levels, upward extrapolation is done using collocated satellite sounder measurements as described in Section 3.2.

2.6 Section Summary

A series of radiosonde processing steps and tests are described to facilitate the use of radiosonde observations in satellite sounding product tuning and validation.

The first steps (Tier-1) entail the integration of available radiosonde quality marks, including those derived in conjunction with NOAA operational NWP assimilation, and subsequent checks for report completeness. Accepted reports are interpolated (and extrapolated) to the set of ATOVS pressure levels (Appendix 2) and undergo the Tier-2 and Tier-3 tests which provide:
 analysis of moisture structure,
 moisture and temperature extrapolations, and
 analysis of temperature inversion(s).

Tier-2 and 3 tests are designed to isolate reports for which the signal to noise may not be suitable for use in satellite data processing; report extrapolations mainly facilitate their use in satellite data processing applications.

Original report data for accepted and rejected reports are retained on pre-processed radiosonde report files along with associated Tier-1 test results. Accepted reports (including superadiabatic) also contain interpolated T, Tdew and WVMR data, and Tier-2 and Tier-3 test results.

Associated flags and annotation on the radiosonde output file include:
 Tier-1 pass/fail flags,
 Tier-2 part-1 flags,
 Tier-2 part-2 flags,
 lowest/highest levels of reported moisture and temperature, respectively,
 lowest/highest levels of interpolated moisture and temperature,
 first level (or missing) of replacement moisture data, and
 Tier-3 test flags.

Reports passing Tier-1 tests are candidates for collocation with satellite observations (Section 3).

3. COLLOCATED RADIOSONDE AND SATELLITE OBSERVATIONS

3.1 Overview

Radiosondes that pass the Tier-1 tests are candidates for collocation with ATOVS satellite observations. Collocations are compiled on a daily basis and initially retained on 5-day rotating files. Subsets of collocations are also retained on longer term datasets, referred to as the Matchup Data Base (MDB), which serve as input for product tuning and validation and are archived at NCDC.

The following sections describe strategies for compiling the individual collocations of radiosonde and satellite observations and subsets of collocations residing on the longer term datasets (MDB), and additional considerations with respect to next generation satellite systems.

3.2 Collocation or Not

A satellite sounding represents a volumetric estimate of vertical temperature (surface to approximately 1mb) and moisture (surface to approximately 200mb) derived from radiometric sensor data at a single point in time. Their vertical resolution and sensitivity vary as a function of sensor, scan position and ambient weather, typically ranging from about 1.5 to three (3) kilometers. A radiosonde report provides a series of point measurements (like a dotted line) with time differences on the order of two to three hours from the surface to the highest report level. The vertical spacing (mandatory and significant levels) between measurements is generally irregular (full density reports are typically not available), varying from tens to hundreds of meters to several kilometers.

ATOVS radiometric sensors are cross-track scanners providing measurements that extend vertically through the atmosphere at nadir scan positions but with an increasing slant path toward the orbit scan edges. Radiosondes undergo vertical drift dependent on ambient wind which can carry them into neighboring satellite fov (and back again) during launch.

The horizontal resolution of ATOVS soundings is sensor and scan position dependent and can vary from a nominal 20km near nadir for clear soundings (which utilize infrared sensor data) to over 150km near the orbit scan limb for cloudy soundings (which do not utilize infrared sensor data). As mentioned, radiosondes are point measurements.

So at best, a collocated radiosonde and satellite observation is ambiguous.

3.3 Processing a Collocation

The basic procedure for compiling collocated radiosonde and satellite sounding observations is to compare soundings from a given day, referred to as the satellite-day, to radiosondes from the satellite-day, the final 6 hours from the previous day and the first 6 hours from the next day. Processing typically occurs about 10 hours after a given satellite-day. The goal is to identify the “single” satellite sounding location that is closest in time and space to a given radiosonde. Once selected, the respective satellite sounding and radiosonde report data records are merged and retained.

The collocation process initially entails identifying candidate satellite soundings for collocation within a specified time and distance window of a given radiosonde. The time and distance windows currently used for ATOVS soundings are latitude and terrain dependent and range from as little as 2 hours and 50km up to 5 hours and 125km. Sample availability is the primary factor for defining the collocation windows, with generally larger windows in remote ocean regions and smaller windows in populated land and polar regions.

Once the candidate satellite soundings for a given radiosonde are identified, the following criteria are considered to select the single sounding for collocation:
 terrain,
 time difference, and
 distance.

Satellite soundings have terrain designations of sea, land, coast, sea-ice, snow cover and, for the non-sea cases, terrain height. Coastal soundings are only considered if no other terrain type (among candidates) is observed. Collocated satellite sounding and radiosonde terrain heights must be within 100 meters.

The observation times “currently” used in collocation processing are the radiosonde launch time and the satellite observation time in hours/minutes. Typically, only one satellite orbit provides candidates within the time and distance window. If candidate satellite data are available from different orbits (polar regions), only candidates from the orbit closest to the radiosonde report time are considered.

Typically, more than one candidate sounding from the orbit closest in time is available, from which the sounding closest in distance (except coasts) is selected.
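
The selection logic of this section can be sketched as below, assuming each candidate sounding carries its terrain type, terrain height, orbit number, absolute time difference and distance from the radiosonde (field and function names are illustrative, not part of the operational code):

```python
def select_collocation(candidates, raob_terrain_height_m):
    """Pick the single satellite sounding to collocate with a radiosonde.

    candidates -- list of dicts with keys: 'terrain' (e.g. 'sea', 'land', 'coast'),
                  'terrain_height_m', 'orbit', 'dt_minutes' (absolute time difference),
                  'dist_km' (distance from the radiosonde)
    Returns the chosen candidate dict, or None.
    """
    # Coastal soundings are used only if no other terrain type is available
    non_coast = [c for c in candidates if c["terrain"] != "coast"]
    pool = non_coast if non_coast else list(candidates)

    # Terrain heights must agree to within 100 meters
    pool = [c for c in pool if abs(c["terrain_height_m"] - raob_terrain_height_m) <= 100.0]
    if not pool:
        return None

    # If more than one orbit contributes (polar regions), keep only the orbit closest in time
    best_orbit = min(pool, key=lambda c: c["dt_minutes"])["orbit"]
    pool = [c for c in pool if c["orbit"] == best_orbit]

    # Among the remaining candidates, take the one closest in distance
    return min(pool, key=lambda c: c["dist_km"])
```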

Once a collocation is selected, the radiosonde temperature is “extrapolated” to .1mb as required for satellite product processing. Extrapolation is done using pre-computed regression coefficients based on historical collocations of upper air measurements (i.e., rocketsondes) and ATOVS measurements. Extrapolation is only done for radiosondes that extend to at least 60mb, and for which a suitable merge point between the radiosonde and regressed profile is found. Reports that cannot be successfully extrapolated are retained on the MDB but not used for satellite processing (tuning). Further discussion of the extrapolation method is provided in Reale, 2001.

3.4 Collocation Sampling

Approximately 300 collocations are compiled daily per satellite given the collocation windows and overall approach presented in the previous section. There are typically about 1900 radiosondes daily, 1600 of which pass the Tier-1 tests and are therefore candidates for collocation.

The sun-synchronous characteristic of polar satellite orbits, combined with the predominantly synoptic launch times (at 0Z and 12Z) of radiosondes results in regional sampling bias among collocations per satellite. Figure 3 illustrates typical global distributions of 5-day collocation samples for NOAA-16 (1330 LST ascending overpass) and NOAA-15 (1930 LST overpass) respectively. Collocations are color-coded with green indicating collocation with satellite observations over land, blue over sea, orange over coast, light blue over sea-ice or snow, and red denoting ship radiosondes. As can be seen, each satellite tends to accumulate collocations in different regions particularly over land (green), mainly a by-product of the two (2) to three (3) hour time windows used. As discussed in Section 3.4, this can introduce systematic differences in the tuning and validation of respective satellites.

Figure 3: Typical 5-day samples of collocated radiosonde and satellite observations compiled for operational ATOVS NOAA-16 (top) and NOAA-15 (bottom) satellites.

Increasing the time window to six hours would eliminate such sampling differences between satellites, but the potential for introducing systematic errors through an implicit regional bias in satellite minus radiosonde time differences would still remain. Nonetheless, an approach to initially use 6-hour time windows for compiling collocations, with sub-sampling later (for product tuning and validation), may be the most versatile. One way or another, the synoptic orientation of global radiosondes complicates their use for satellite product tuning and validation; this is discussed further in Section 3.4.

Finally, as mentioned in Section 3.3, collocations are processed daily using satellite data from a given day and radiosondes from 18Z the previous day to 06Z the next day. This means that collocations from two consecutive days can contain the same radiosonde, referred to as a duplicate collocation. Procedures are run on a daily basis to identify duplicates (from the previous day) and retain the single collocation consistent with the selection criteria (Section 3.3).

3.5 Collocation Sub-sampling

An inspection of Figure 3 indicates localized areas of high collocation density in the radiosonde-dense population zones. When utilizing collocations to tune and validate satellite data and product systems, an overabundance of sample from a small area can lead to misrepresentation with respect to overall global performance.

The basic approach for collocation sub-sampling is to identify potentially redundant collocations, that is, collocations that are relatively close to each other in space and time and for which little or no additional information is being added. This is done by defining a time and distance window for minimal density, referred to as the secondary time and distance windows. Recommended values for ATOVS collocations are 300km and 6 hours.

The procedure entails grouping of collocations in a 3-dimensional space-time coordinate system partitioned by secondary window increments. Collocations contained within a given partition are then analyzed and one is flagged as the primary, with the remainder designated as secondary. A simple rectangular mapping (3 x 3) with some adjustments pole-ward of 60 degrees latitude is sufficient.

The criteria for identifying the primary collocation within a given increment are listed below in priority order:
 radiosonde manufacturer,
 terrain (i.e., no coasts),
 vertical extent for temperature (50mb),
 vertical extent for moisture (400mb),
 Tier-2 test,
 Tier-3 test,
 time difference, and
 distance.

A priority listing (or blacklist) for radiosonde manufacturer is optional. Coastal collocations are the first to be flagged as secondary, followed by collocations for which the radiosondes did not reach at least 50mb for temperature and 400mb for moisture. Afterwards, radiosondes which did not pass the Tier-2 and Tier-3 tests described in Section 2 are denoted as secondary. The final delineators are time and distance from the radiosonde, respectively.
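
The primary/secondary designation within one secondary-window partition can be sketched with the priority order above as a sort key. The field names, flag conventions and tuple ordering are illustrative assumptions:

```python
def flag_primary(partition):
    """Flag one collocation in a partition as primary and the rest as secondary.

    partition -- list of dicts with keys: 'manufacturer_rank' (from an optional priority
                 list; lower is better), 'terrain', 'top_t_mb' (highest T level),
                 'top_td_mb' (highest moisture level), 'tier2_flag', 'tier3_flag',
                 'dt_minutes', 'dist_km'
    """
    def priority(c):
        return (
            c.get("manufacturer_rank", 0),         # optional manufacturer priority/blacklist
            c["terrain"] == "coast",               # coasts are the first flagged secondary
            c["top_t_mb"] > 50.0,                  # temperature did not reach 50mb
            c["top_td_mb"] > 400.0,                # moisture did not reach 400mb
            c["tier2_flag"] >= 2,                  # Tier-2 candidate for screening (assumed flag)
            c["tier3_flag"] >= 3,                  # Tier-3 inversion over threshold (assumed flag)
            c["dt_minutes"],                       # then closest in time
            c["dist_km"],                          # then closest in distance
        )

    primary = min(partition, key=priority)
    for c in partition:
        c["role"] = "primary" if c is primary else "secondary"
    return partition
```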

Each partition is independently analyzed in this manner. A flag denoting primary and secondary collocations is stored on the collocation output file. Secondary collocations “are not” removed from collocation data sets.

3.6 Collocation Data Sets

Running five (5) day samples of collocations as compiled above are retained in conjunction with ATOVS operational processing.

However, longer term collocation data sets, referred to as the Matchup Data Base (MDB), are also compiled and updated on a daily basis and serve as input for product tuning and validation. A separate MDB is retained for each operational satellite, and for clear and cloudy satellite sounding types, respectively. A brief summary of the MDB is provided below; further details can be found in Reale (102 and 107).

The MDB is partitioned by geographical zone, namely latitude belt and terrain. Each partition is subdivided into grid-points. Individual collocations that fall within a given partition are “linked” to the closest grid. Each partition has a target (maximum) sample size allocation which is proportional to the mean radiosonde density of the zone.

Sample size allocations are designed to retain a minimum of 30 days of full density observations within a given zone. Collocations older than 30-days undergo selective data deletion during updating depending upon their age and the density of observations for a given grid. For example, older collocations from a data sparse grid within a given zone are more likely to be retained than from a data rich grid. Similarly, overall periods of record for data rich zones (i.e., in the northern hemisphere) will typically be shorter than for data sparse zones. No collocation more than 104 days old is retained.

The overall objective of this design was to provide a more balanced global distribution of collocations. Modest gains have been achieved; for example, the percentage of southern hemisphere radiosondes is typically less than 15%, but the percentage of southern hemisphere collocations on the MDB at any given time typically approaches 25%.

The MDB for each operational satellite is archived on a monthly basis at the National Climatic Data Center (NCDC) in Asheville, N.C.

3.7 Future Requirements for Collocation Data Sets

The design of the MDB was originally developed to best meet the scientific requirements of the NESDIS ATOVS operational soundings (Reale 102, 107). However, with the pending upgrades of the ATOVS derived product system and ultimate transition to the next generation NPOESS, the underlying requirements for collocation data sets need clarification. Specifically, such data sets must be responsive to both the short and long term monitoring of performance in support of real-time weather and climate applications, respectively. Whether the current MDB approach is optimal is questionable. The following section briefly addresses issues and recommendations.

The retention of separate collocation datasets for each operational satellite would seem to be a solid requirement, but further segregation by sounding type appears unnecessary. Approaches for creating more balanced global distributions also appear unnecessary, although sub-sampling approaches for more evenly spaced global distributions do have value. A more straightforward approach to retain and archive specified periods of global data with no overlap appears best.

Similarly, it would appear better to allow collocation time differences to vary up to six hours, ensuring that each radiosonde would be collocated with a given satellite except in cases of missing data and orbit gaps. The use of the vertical drift information, which tracks the location and time of each observation with height, is certainly an interesting consideration for future systems, including, as needed, the interpolation of these parameters to a specified pressure basis (as in Appendix 2). Distance windows should be standardized to provide collocations in the worst case, that is, at the furthest scan angle for cross-track scanners.

The requirement for allowing only one collocation per radiosonde, per satellite, appears solid; however, selection procedures in cases of multiple candidate collocations, and possible exceptions to this rule, may be warranted.

The satellite data retained for each collocation data record must also be addressed. Currently, the physical parameters stored from the radiosonde are the temperature and moisture profiles (mandatory and significant level), including corrected and uncorrected observations, associated quality marks, vertical drift data and the collocated NWP profiles. The physical parameters stored for the satellite data are the sounding and sensor fields-of-view (fov) used to derive the sounding. This includes selected stages of the radiometric and product processing, ancillary information such as terrain, sea surface temperature and the collocated NWP profiles.

Conspicuously missing from the radiosonde portion of the collocation data record is wind and cloud information. The satellite portion of the record does not include the complete suite of original calibrated observations from each sensor associated with a given sounding. For example, the sensor data (for example AMSU-A) may have been spatially interpolated to a reference fov (for example HIRS).

Alternative approaches for compiling collocated radiosonde and satellite observations could entail contingencies to collocate with respect to the original calibrated sensor data, respectively. Pending requirements to include neighboring sensor data (arrays) in the vicinity of the radiosonde could also be accommodated in this mode.

The real issue is whether collocating satellite and radiosonde observations is enough. It would appear that the optimal utility would be to compile a relational data base through which all critical upper air observations that are spatially and temporally coincident can be readily identified and accessed. However, this would be a tremendous undertaking with an unclear intrinsic value.

An alternative approach for upper air observation systems currently under consideration within ongoing NOAA/GCOS workshops is to establish a Reference network of global radiosonde sites providing observations coincident with operational polar satellite overpass (also Reale … REFERENCE Network). Such a network would eliminate many of the problems associated with the sampling and measurement deficiencies as discussed in this report. The integration of satellite and ground truth collection systems with such a network would be optimal.

The bottom line is that there needs to be a specific definition of requirements for collocated satellite and ground-truth observations. Whether the short-term sensor and product performance monitoring requirements and longer-term climate monitoring requirements can both be satisfied via a single database compilation approach must be addressed. Or do we need revised approaches to store multiple coincident data platforms, and if so, which platforms? Is a reference network ultimately the best approach?

4. RESULTS

The following sections provide examples of radiosonde and collocation processing techniques discussed in this report.

4.1 Radiosonde Tier-1

Tier-1 tests primarily identify reports with insufficient observations. The main process is to identify suspect data using the NCEP QC markers (Appendix-1) and NESDIS analysis as described in Section 2.1. On average, about 20% of incoming radiosondes fail the Tier-1 tests.

Although valuable, the interpretation of QC marks is not always clear and the results are at times ambiguous. The following cases show examples of screened levels and/or reports based on Tier-1 tests.

Tables 1 and 2 show portions of standard and significant level report data that contain a QC mark of either 15 or 9 for TD. The report data include original and corrected (Radcor) observations along with drift parameters, NWP data and the QC marks.

Table 1 shows the occurrences of QC marker 15 that are normally observed for TD above 300mb (see Appendix 2). Isolated occurrences of QC marker 15 at the 433mb and 409mb significant levels are also seen; these data were screened and not used in interpolations (equation 1). None of the TD data above 300mb were screened.

Table 1: Standard (upper) and significant (lower) level radiosonde data which contains a QC mark of 15 for TD (RH Flag) that was screened and not screened, respectively.

Table 2: Standard (upper) and significant (lower) level radiosonde data which contains a QC mark of 15 for TD (RH Flag) that was screened and not screened, respectively.

Table 2 shows a sequence where the QC marker of 15 for moisture began at 412mb, resulting in the screening of all moisture data from 412 to 300mb. Since the resulting TD gap exceeds the designated standard layer from 400 to 300mb, the moisture profile was capped below the gap at 500mb. The assignment of QC marker 15 beginning at 400mb (instead of above 300mb) is questionable as the data appear to be reasonable and consistent with NWP.

Table 3 shows a case of combined QC markers 9 and 15 for TD beginning at 302mb, with Figure 4 showing a plot (Skew-T / Log P) of the radiosonde (red) and forecast (brown) temperature and TD (dashed) profiles. In this case the use of a QC marker 9 and/or 15 appears to be interchangeable for upper level moisture, a pattern which led to the NESDIS protocol of not differentiating these markers for upper level TD. The radiosonde value at 302mb does not appear suspect, and the switch from a QC flag of 15 to 9 between 300mb and 250mb is not understood.

This example also shows a case of screened data in the boundary layer with a QC mark of 8 associated with large differences from NWP. The interpretation of QC mark 8 in this case is unclear. It is also curious that the QC mark changes from 8 to 2 (neutral) above 884mb even though large differences between the NWP and radiosonde persist. The QC mark of 8 for the first significant level is ignored.

The report for this case was not screened because, although the 850mb standard level is screened, there are unscreened observations between 850mb and 700mb. Similarly, the TD profile extends to 250mb even though the 300mb TD was screened, because there were unscreened TD in the designated standard layers (see Appendix 2) above and below 300mb.

Table 4 and Figure 5 illustrate a case of screened T data beginning at 225mb and extending to the top (30mb) of the report profile. Overall the report data look reasonable even through the tropopause region (between 300 and 150mb) where differences from NWP are relatively large (see Figure 5). The assignment of QC mark 14 (purge flag) for standard and significant level data from 250mb to 21mb appears heavy handed. Although this report was not screened, the T and TD profiles were capped at 300mb.

This example also illustrates the rare occurrence of TD data above 300mb (at 250mb) having QC marks other than 9 or 15.

Table 3: Standard (upper) and significant (lower) level radiosonde data which contains a QC mark of 8, 9 or 15 that was screened and not screened, respectively.

Figure 4: Skew-T Log P plot of observed radiosonde (red) and NWP profiles (brown) for T (solid) and TD (dashed) corresponding to Table 3. GRAW DFM-90 (Germany), Site ID 99877, 34.2N, 67.3E, 2/14/05, 17Z.

Table 4: Standard (upper) and significant (lower) level radiosonde data which contains a QC mark of 14 that were screened and 15 that were not screened.

Figure 5: Skew-T Log P plot of observed radiosonde (red) and NWP profiles (brown) for T (solid) and TD (dashed) corresponding to Table 4. Shanghai Radio (China), Site ID 57494, 30.6N, 114.1E, 2/14/05, 23Z.

The value of additional checks as provided by NESDIS in conjunction with the NCEP QC marks is shown in Tables 5 and 6.

Table 5 shows a case where the significant level temperature data has an errant value of 14.6°C at 622mb. Perhaps the value should have been 4.6°C; nevertheless, it was not screened by the QC marker, which shows a value of 2 (neutral), but was ultimately screened by the NESDIS climate test, which specifies a maximum temperature of approximately 12°C for this level, season and latitude zone (Appendix-3).

Table 5: Standard (upper) and significant (lower) level radiosonde data (at 6.8S) which contains bad data (622mb) not identified by the NCEP QC mark but ultimately screened by the NESDIS climate test.

A final case with multiple points of interest is shown in Table 6. First, the 700mb standard level TD (1.9°C) is greater than the radiation corrected (Radcor) T (1.8°C). Such observations led to the protocol for adjusting WVMR using equation (4) as presented in Section 2.2. Second, the combination of significant and standard level data from 986mb to 700mb suggests that all (or perhaps none) of the data are suspect, but curiously only the standard level data at the 850mb level was flagged (QC mark of 6). The approximately 3.7K inversion aloft from 925 to 700mb (the -1.7K at 850mb is screened) is considered marginal. This report was ultimately accepted with the TD at 700mb adjusted to 1.5°C. A plot of the radiosonde and corresponding NWP data below 500mb is illustrated in Figure 6. The value of these data is clearly ambiguous.

Table 6: Standard (upper) and significant (lower) level radiosonde data with TD greater than radiation corrected T (700mb) and associated NCEP QC marks.

Figure 6: Skew-T Log P plot of observed radiosonde (red) and NWP profiles (brown) for T (solid) and TD (dashed) corresponding to Table 6. Site 57749.

4.2 Radiosonde Tier-2

Tier-2 tests identify TD profiles that are either suspect or contain vertical structures that are typically not resolvable with existing satellite moisture sensors. The approach is to isolate TD profiles which do not portray a relatively smooth decrease in WVMR with height. Portions of individual profiles showing WVMR increase or excessive decrease with height, referred to as events, are analyzed as described in Section 2.3. Testing is done separately below (Part-1) and above 300mb (Part-2), with extrapolation of all moisture profiles to 200mb as necessary. On average, about 15% of the radiosondes passing the Tier-1 tests are flagged by the Tier-2 tests.

The idea is that profiles flagged by Tier-2 tests are not well suited for derived product tuning and validation since they are more a source of noise than information within the context of the sounder measurements. However, whether real or suspect, the subset of profiles flagged by Tier-2 tests comprises an interesting investigative dataset and for this reason is retained on all output and archive files. The following cases show examples of profiles identified for screening by Tier-2 tests.

Figure 7 illustrates two examples of profiles failing Tier-2, Part-1 moisture tests in which two or more events occurred. The corresponding interpolated WVMR (g/kg) at ATOVS pressure levels are listed below:

Pressure (mb)   Top     Bottom
1000            3.80    15.95
950             3.14    12.73
920             2.70    12.53
850             2.36    11.15
780             2.18     9.94
700             0.29     5.56
670             0.62     5.48
620             1.05     5.33
570             0.83     0.63
500             0.16     1.69
475             0.09     1.84
430             0.04     0.48
400             0.25     1.04
350             0.03     0.33
300             0.02     0.09

Figure 7: Skew-T Log P plots of radiosonde moisture (dashed) profiles which failed the Tier-2, Part-1 tests with two or more events. Top: Vaisala RS80 57H, Site 72659, 45N, 98W, 2/15/05, 0Z; Bottom: Vaisala RS80, DigiCora (Fin), Site 96471, 6N, 116E, 2/15/05, 0Z.

The top profile in Figure 7 shows three events. The first is from 780 to 700mb, where the WVMR at 700mb is less than the normalized threshold (.32) from equation (14). The second is an event sequence from 670mb to 570mb, where the increased WVMR values at 670mb and 620mb exceed the threshold from equation (10). Note that the event sequence does not end until 500mb, where the WVMR (.16) is less than the MR0 value (.29) initialized at 700mb. The third is an event sequence from 430mb to 350mb, where the increase in WVMR exceeds the threshold from equation (11). The sharp decrease from 570mb to 500mb embedded in the second event sequence does not exceed the normalized threshold (.11) from equation (14).

The bottom profile also shows three events. The first is from 620mb to 570mb, where the WVMR at 570mb is less than the normalized threshold (1.1). The second is an event sequence from 570mb to 430mb, where the increases in WVMR exceed the threshold from equation (10). The third is an event sequence from 430mb to 350mb, where the WVMR at 400mb exceeds the normalized threshold, also from equation (10).
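For reference, the level-to-level checks described above can be sketched as follows (a minimal sketch in Python, not the operational code). The threshold functions are hypothetical placeholders standing in for equations (10), (11), (14) and (15) of Section 2.2, which are not reproduced here; only the control flow, flagging WVMR increases and excessive decreases with height, follows the text.

    def find_part1_events(levels, increase_threshold, decrease_threshold):
        """Sketch of Tier-2, Part-1 'event' detection below 300mb.

        levels: (pressure_mb, wvmr_gkg) pairs on ATOVS levels, ordered from
            the bottom (high pressure) to the top (low pressure).
        increase_threshold(p, mr_below): allowed WVMR increase with height
            (placeholder for equations (10)/(11)).
        decrease_threshold(p, mr_below): minimum acceptable WVMR after a
            decrease (normalized threshold; placeholder for (14)/(15)).
        """
        events = []
        for (p_lo, mr_lo), (p_hi, mr_hi) in zip(levels, levels[1:]):
            if mr_hi > mr_lo + increase_threshold(p_hi, mr_lo):
                events.append(("increase", p_lo, p_hi))
            elif mr_hi < decrease_threshold(p_hi, mr_lo):
                events.append(("excessive decrease", p_lo, p_hi))
        return events

    # A profile with two or more events (as in Figure 7) is a candidate for
    # Tier-2 screening.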

Figure 8 illustrates examples of profiles failing the Tier-2, Part-1 moisture test criterion for event sequences which exceed six (6) consecutive ATOVS levels. The corresponding ATOVS pressure levels and mixing ratio profiles for each are listed below:

Pressure (mb)   Top     Bottom
1000            6.70    3.76
950             5.91    3.32
920             5.30    2.61
850             3.14    1.91
780             0.31    0.30
700             1.37    0.064
670             1.91    0.044
620             2.89    0.33
570             2.33    0.11
500             1.64    0.098
475             1.38    0.067
430             0.97    0.32
400             0.73    0.34
350             0.40    0.19
300             0.19    0.092

Both cases show event sequences comprising more than six (6) consecutive ATOVS levels for which a given MR0 mixing ratio value was exceeded.

The top case has an MR0 of .31 at 780mb, which was exceeded for ten (10) consecutive levels up to 300mb. This sequence also includes a separate event of decreasing WVMR between 850mb and 780mb (the normalized threshold from equation (15) at 780mb was .57 g/kg).

The bottom case shows the MR0 of .044 at 670mb being exceeded for at least eight (8) consecutive levels up to 300mb. This sequence also includes a separate event of decreasing WVMR between 850mb and 780mb (the normalized threshold from equation (15) at 780mb was .35 g/kg).
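The consecutive-level rule illustrated by these two cases can be sketched as follows; this is a simplified illustration in which the full MR0 bookkeeping of Section 2.2 (resetting and merging of event sequences) is not reproduced.

    def exceeds_consecutive_limit(levels, mr0_pressure, mr0_value, max_levels=6):
        """Flag an event sequence in which more than max_levels consecutive
        ATOVS levels above the sequence start exceed the MR0 mixing ratio
        initialized at that start level (the Figure 8 criterion).

        levels: (pressure_mb, wvmr_gkg) pairs ordered bottom to top.
        """
        count = 0
        for p, mr in levels:
            if p >= mr0_pressure:      # skip the start level and levels below it
                continue
            if mr > mr0_value:
                count += 1
                if count > max_levels:
                    return True
            else:
                break                  # the sequence ends when WVMR falls below MR0
        return False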


Figure 8: Examples of radiosonde moisture (dashed) profiles which failed the Tier-2, Part-1 test of six or more successive levels with mixing ratio values exceeding MR0. Top: Vaisala RS80, PCCORA (Fin), Site 93844, 46S, 168E, 2/14/05, 22Z; Bottom: Vaisala RS80, DigiCora III (Fin), Site 03005, 60N, 1.2W, 2/14/05, 23Z.

Figure 9 illustrates examples of profiles for which the Tier-2, Part-2 tests encountered increasing mixing ratios above 400mb.

The upper panel of Figure 9 typifies the case for which the Tier-2, Part-2 tests were primarily designed (similar to the upper panel of Figure 1). In this case, a level (PL = 350mb) was found above which "all" the interpolated mixing ratio values (MRI) exceed their projected power law value. Subsequently, replacement values consisting of projected 350mb power law mixing ratios were provided above 350mb. Two moisture (dashed) profiles are shown; the darker curve shows the original report values, and the red curve shows the corresponding replacement values. … slight error to be corrected …

Figure 9: The top profile had an original mixing ratio of .09 g/kg; the replacement value via the Tier-2, Part-2 tests is .008 g/kg. The lower profile is not replaced and also illustrates the "interpolation" of significant to ATOVS levels.

The lower panel of Figure 9 illustrates a case where replacement values were not provided. As can be seen, the red profile does show a slight increase in mixing ratio between 350 and 300mb. However, the projected power law mixing ratio at 200mb (Equation 16) is greater than the reported 200mb mixing ratio, so the original values are retained.
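The Part-2 replacement decision described above can be sketched as follows. The power-law form MR(p) = MR(PL) * (p / PL)**b is an assumed stand-in for equation (16), which is not reproduced in this section, and the exponent b is left unspecified here.

    def apply_part2_replacement(levels, pl_mb, mr_pl, exponent):
        """Sketch of the Tier-2, Part-2 replacement above level PL.

        levels: (pressure_mb, wvmr_gkg) pairs ordered bottom to top.
        pl_mb, mr_pl: the pressure and mixing ratio at level PL.
        Replacement is applied only when all reported values above PL exceed
        their power-law projection and the projected 200mb value does not
        already exceed the reported 200mb value (the retained case in the
        lower panel of Figure 9).
        """
        def projected(p):
            return mr_pl * (p / pl_mb) ** exponent

        above = [(p, mr) for p, mr in levels if p < pl_mb]
        all_exceed = all(mr > projected(p) for p, mr in above)
        reported_200 = next((mr for p, mr in levels if abs(p - 200.0) < 0.1), None)
        retain = reported_200 is not None and projected(200.0) > reported_200
        if not all_exceed or retain:
            return list(levels)                      # original values retained
        return [(p, projected(p)) if p < pl_mb else (p, mr) for p, mr in levels]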

The lower panel of Figure 9 also illustrates several aspects of the interpolation and extrapolation techniques to construct moisture profiles using available mandatory and significant level report data.

Beginning at 500mb, one can follow the construction of the interpolated profile (red curve, with ATOVS pressure levels indicated on the left-side axis) using the various mandatory and significant level points. As can be seen, the highest significant level (264mb) is close enough to 250mb (it spans more than .25 of the interval) and is therefore used to compute the extrapolated mixing ratio at 250mb.

Above 250mb the moisture was extrapolated using the lapse rate from the previous layer, which in this case provided a mixing ratio at 200mb that was less than the respective power law and constant dewpoint depression values (a relatively rare occurrence). However, since the TD lapse rate between 300mb and 250mb slightly exceeded adiabatic (blue curves), the extrapolated lapse rate from 250mb to 200mb was tempered to adiabatic.
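The interpolation portion of this construction can be sketched as follows. Linear interpolation of WVMR in ln(pressure) is assumed here (the report describes the construction only graphically); the .25-of-interval proximity rule, the upward extrapolation and the adiabatic tempering discussed above are not shown.

    import math

    def interpolate_to_atovs(report_levels, atovs_levels):
        """Interpolate report WVMR onto ATOVS pressure levels (sketch).

        report_levels: (pressure_mb, wvmr_gkg) mandatory/significant levels,
            ordered from high to low pressure.
        atovs_levels: target pressures (mb); only bracketed levels are filled.
        """
        out = {}
        for p_target in atovs_levels:
            for (p_lo, mr_lo), (p_hi, mr_hi) in zip(report_levels, report_levels[1:]):
                if p_hi <= p_target <= p_lo:         # bracketed (pressure decreases upward)
                    w = math.log(p_lo / p_target) / math.log(p_lo / p_hi)
                    out[p_target] = mr_lo + w * (mr_hi - mr_lo)
                    break
        return out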

4.3 Radiosonde Tier-3

Tier-3 tests identify profiles with tropospheric temperature inversions. Surface inversions which exceed 9 K or span more than two standard layers (see Appendix-2), and inversions aloft which exceed 5 K or span more than one standard layer, are identified as candidates for screening. Figure 10 illustrates examples of T profiles with inversions identified by the Tier-3 tests.

Figure 10: Inversions identified by the Tier-3 tests. Top: Site 71801 (47N, 52W, Sable Island?), 12Z, Vaisala RS80 DigiCora (Fin); Bottom: Site 71957 (68N, 133W), 00Z, Vaisala RS80 DigiCora (Fin).

The upper panel of Figure 10 suggests a case of warm and relatively moist air aloft overriding cold air at the surface that is capped (trapped) by a dry stable layer.

The lower panel of Figure 10 is a typical surface inversion. The lighter dashed plot shows the significant level moisture data, suggesting that above the inversion (to about 650mb) the moisture sensor became erratic, probably due to sensor icing in the frigid boundary layer.

Each of the moisture profiles for these cases contains two Tier-2 events. Real or not, the profiles in Figure 10 represent prime cases for which the satellite sounder measurements and radiative transfer model calculations are essentially void with respect to the ambient T and moisture structures below 600mb.
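For reference, the inversion criteria stated at the beginning of this subsection can be sketched as follows; the counting of spanned standard layers is an approximation of the operational bookkeeping.

    def flag_tier3_inversions(levels, standard_levels):
        """Sketch of the Tier-3 inversion tests.

        levels: (pressure_mb, temperature_K) pairs ordered from the surface upward.
        standard_levels: radiosonde standard pressures (mb) from Appendix-2.
        Surface inversions exceeding 9 K or spanning more than two standard
        layers, and inversions aloft exceeding 5 K or spanning more than one
        standard layer, are returned as screening candidates.
        """
        candidates = []
        i = 0
        while i < len(levels) - 1:
            if levels[i + 1][1] > levels[i][1]:      # temperature increasing with height
                j = i
                while j < len(levels) - 1 and levels[j + 1][1] > levels[j][1]:
                    j += 1
                (p_base, t_base), (p_top, t_top) = levels[i], levels[j]
                magnitude = t_top - t_base
                spanned = sum(1 for p in standard_levels if p_top <= p <= p_base)
                if i == 0 and (magnitude > 9.0 or spanned > 2):
                    candidates.append(("surface", p_base, p_top, magnitude))
                elif i > 0 and (magnitude > 5.0 or spanned > 1):
                    candidates.append(("aloft", p_base, p_top, magnitude))
                i = j
            else:
                i += 1
        return candidates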

4.4 Collocated Radiosonde and Satellite Observations

As discussed in Section 3.4, the combined sun-synchronous and synoptic observation time characteristics of polar satellite and radiosonde data, respectively, result in sampling bias among the collocations for each satellite. Figure 11 shows a sub-sampling of the collocations in Figure 3 covering the 30N to 30S region for NOAA-16 (upper) and NOAA-15 (lower), and the corresponding statistics of satellite sounding minus radiosonde temperature mean and standard deviation differences based on the land (green) collocations. Significant differences in the statistics are seen; however, these differences are misleading with respect to overall product performance, being much more indicative of the sampling differences. Embedded systematic errors, as illustrated by these statistics, lead to uncertainties which plague all validation protocols comprised of such observations (and ultimately their utility in weather and climate applications).

Another potential source of systematic sampling bias can emanate from the radiosonde screening protocols presented in this report. The question is whether screening causes the systematic removal of radiosonde observations on a regional basis. The three (3) panels of Figure 12 illustrate the global distribution of radiosondes which passed all the screening criteria (top), those designated for screening based on Tier-1 and Tier-2 tests (middle), and reports with significant inversions based on Tier-3 tests (bottom). Sampling was done over a 3-day period during February 2005. Of the over 1900 radiosonde observations, approximately 1300 passed screening, 500 were identified for screening by Tier-1 (240) or Tier-2 (260) tests, and 100 had significant inversions via Tier-3 tests.

As can be seen, other than for radiosondes on the subcontinent of India, the candidates for screening based on Tier-1 and Tier-2 tests show little if any regional pattern compared to unscreened observations. Radiosondes with significant inversions (Tier-3) are mainly observed in northern hemisphere polar regions, as expected given the February time period of the data.


Figure 11: Global distributions and associated statistics for collocated radiosonde and satellite observations from NOAA-16 (topmost) and NOAA-15 (bottommost) for the 30N to 30S region; statistics are for collocations with land satellite soundings (green).

Figure 12: Global distributions of radiosonde reports which passed all the screening criteria (top), were designated for screening based on Tier-1 and Tier-2 tests (middle), and had significant inversions based on Tier-3 tests (bottom), over a 3-day period during February 2005.

5. REPORT SUMMARY

This report summarizes NESDIS operational protocols for processing and screening radiosonde observations and compiling collocated radiosonde and satellite observations.

Radiosondes are accessed from the NCEP-supported "PREPBUFR" files, and candidates for collocation with satellite observations are selected. Processing steps include:
 Tier-1 tests invoking NCEP quality markers and report completeness requirements,
 Tier-2 tests of the TD (WVMR) profile,
 interpolation to ATOVS pressure levels,
 upward and downward extrapolation of WVMR and T profiles, and
 Tier-3 tests for T inversions.

Candidate report files include corrected (Radcor) and uncorrected observations, test results and collocated NWP forecast observations.

Radiosondes passing the Tier-1 tests are candidates for collocation with satellite observations. Collocations consist of one operational sounding per candidate radiosonde, typically the closest in time and distance, in that order. Collocations are temporarily stored on 5-day running files, and permanently on longer-term files referred to as the MDB. The MDB is archived by NCDC on a monthly basis.
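The collocation selection rule (closest in time, then in distance) can be sketched as follows; the time and distance windows shown are hypothetical placeholders, as the operational limits are defined elsewhere in the report.

    from math import asin, cos, radians, sin, sqrt

    def select_collocation(raob, soundings, max_hours=3.0, max_km=250.0):
        """Select one operational sounding for a candidate radiosonde (sketch).

        raob and each sounding are assumed to carry .lat, .lon (degrees) and
        .time (hours); these attribute names are illustrative only.
        """
        def distance_km(a, b):
            dlat = radians(b.lat - a.lat)
            dlon = radians(b.lon - a.lon)
            h = sin(dlat / 2) ** 2 + cos(radians(a.lat)) * cos(radians(b.lat)) * sin(dlon / 2) ** 2
            return 2.0 * 6371.0 * asin(sqrt(h))

        inside = [s for s in soundings
                  if abs(s.time - raob.time) <= max_hours and distance_km(raob, s) <= max_km]
        if not inside:
            return None
        # closest in time first, then closest in distance
        return min(inside, key=lambda s: (abs(s.time - raob.time), distance_km(raob, s)))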

Discussion of the collocated observations includes the sampling bias arising from the combined synoptic and sun-synchronous characteristics of most radiosonde and satellite data, respectively, and the subsequent impacts on product validation, tuning and user applications. Modifications and expansions to better satisfy requirements for the pending next-generation systems (NPP and NPOESS) are proposed.

The report concludes with results demonstrating the radiosonde processing and collocation strategies discussed in the report.

Appendix-1

Code table for EMC/NCEP/NOAA observation quality markers (Revised 7/13/05). (http://www.emc.ncep.noaa.gov/mmb/data_processing/prepbufr.doc/table_7.htm)

The quality markers and associated actions presented in Appendix-1 were copied from the web site indicated above. Please refer to this site for further definitions and clarifications.

Quality Marker Definitions:

0: All steps: Keep (always assimilate). Applies to pressure, height, wind, temperature, specific humidity, rainfall rate, precipitable water and cloud top pressure.
1: All steps: Good. Applies to pressure, height, wind, temperature, specific humidity, rainfall rate, precipitable water and cloud top pressure.
2: All steps: Neutral or not checked (default). Applies to pressure, height, wind, temperature, specific humidity, rainfall rate, precipitable water and cloud top pressure.
3: All steps: Suspect. Applies to pressure, height, wind, temperature, specific humidity, rainfall rate, precipitable water and cloud top pressure.

4-15: All steps: Rejected (don't assimilate), as defined below:

4: Step OIQC: An observation with pre-existing quality marker 0 (keep) is flagged. Applies to pressure, height, wind, temperature, specific humidity and precipitable water.
5: Step OIQC: An observation with pre-existing quality marker 1 (good) is flagged. Applies to pressure, height, wind, temperature, specific humidity and precipitable water.
6: Step OIQC: An observation with pre-existing quality marker 2 (neutral/default) is flagged. Applies to pressure, height, wind, temperature, specific humidity and precipitable water.
7: Step OIQC: An observation with pre-existing quality marker 3 (suspect) is flagged. Applies to pressure, height, wind, temperature, specific humidity and precipitable water.
8: Step PREVENT: An observed pressure on any level is > 100 mb below model (guess) surface pressure. Applies to pressure, height, wind, temperature and specific humidity.
   Step PREVENT: An observed pressure is < 0. Applies to pressure, height, wind, temperature and specific humidity.
   Step VIRTMP: A virtual temperature is generated from a specific humidity observation with a rejected quality marker (> 3). Applies to temperature.


9: Step PREVENT: An observed surface pressure is > 100 mb above or below model (guess) surface pressure. Applies to pressure, height, wind, temperature and specific humidity.
   Step PREVENT: An observation error is missing (applies only for AVN, FNL and CDAS networks). Applies to pressure, height, wind, temperature, specific humidity and precipitable water.
   Step PREVENT: A non-pressure observation fails a limit check. Applies to pressure, height, wind, temperature, specific humidity and precipitable water.
   Step PREVENT: A moisture observation is above 300 mb. Applies to moisture.
10: Step SYNDATA: A non-synthetic bogus mass report is in the vicinity of a tropical storm. Applies to pressure (only) on all levels of the mass report.
    Step SYNDATA: A dropwinsonde wind report is in the vicinity of a tropical storm. Applies to wind (only) on all levels of the wind report.
    Step PREPACQC: An AIREP or PIREP aircraft report in Table A entry AIRCFT is used to generate a superob report. Applies to wind and temperature.
    Data Producer: A wind profiler report in Table A entry PROFLR fails both median and shear checks performed by NOAA/FSL. Applies to wind.
11: NCEP/SDM: An observation with pre-existing quality marker 3 (suspect) is flagged. Applies to pressure, height, wind, temperature and specific humidity. This is currently not used.
    Data Producer: A wind profiler report in Table A entry PROFLR does not pass median and shear checks performed by NOAA/FSL (results are inconclusive). Applies to wind.
12: NCEP/SDM: A non-profiler observation is on the reject list. Applies to pressure, height, wind, temperature and specific humidity.
    Data Producer: A wind profiler report in Table A entry PROFLR fails shear check performed by NOAA/FSL, but passes median check. Applies to wind.


13: All automated quality control steps: A non-wind profiler observation failed one or more checks. Applies to pressure, height, wind, temperature, specific humidity and precipitable water.
    Step CQCPROF: A wind profiler report in Table A entry PROFLR failed one or more checks. Applies to wind.
    Data Producer: A wind profiler report in Table A entry PROFLR fails median check performed by NOAA/FSL, but passes shear check. Applies to wind.
14: NCEP/SDM: An observation is assigned a purge flag. Applies to pressure, height, wind, temperature and specific humidity.
15: Step PREPRO: An observation is flagged for non-use by the analysis. Applies to pressure, height, wind, temperature, specific humidity, rainfall rate, precipitable water and cloud top pressure.
: Missing

Appendix-2

ATOVS and Radiosonde Standard Pressure Levels

A. The 42 ATOVS Pressure Levels (mb):

Levels 1 through 10:  0.1, 0.2, 0.5, 1.0, 1.5, 2.0, 3.0, 4.0, 5.0, 7.0

Levels 11 through 20: 10, 15, 20, 25, 30, 50, 60, 70, 85, 100

Levels 21 through 30: 115, 135, 150, 200, 250, 300, 350, 400, 430, 475

Levels 31 through 40: 500, 570, 620, 670, 700, 780, 850, 920, 950, 1000

Levels 41 and 42:     1012, 1030

B. The 18 Radiosonde Standard Levels (mb):

Levels 1 through 10:  5, 7, 10, 20, 30, 50, 70, 100, *150, 200

Levels 11 through 18: *250, 300, 400, 500, 700, 850, *925, 1000

* indicates pressure levels omitted for gap tests

Appendix-3

Climatological Limits for Temperature

The following Tables show climatological profiles of maximum and minimum temperature as a function of standard pressure, 15-degree latitude zone and season (1 indicating June, July, August, etc).

The actual limits used for testing are linearly interpolated to the month and pressure of the observation.

The interpolation to the month uses the seasonal profile for observations occurring in the center months (i.e., July, October, January, April). Observations occurring in all other months are linearly interpolated as 2/3rds of the given season plus 1/3rd of the nearest adjacent season. This results in a unique standard level profile for each latitude belt and month.

Report standard level temperatures are tested directly against these profiles; significant level temperatures are not tested directly against the tabulated values.

Instead, report significant level temperatures are tested against standard level limits interpolated to the significant level pressure using equation (1A):

TI = TS- + (dT/dP)I * DELP        (1A)

where:
  S-  indicates the standard level limit immediately below level I,
  S+  indicates the standard level limit immediately above level I,
  DELP = ln(PI / PS-),
  (dT/dP)I = (TS+ - TS-) / ln(PS+ / PS-).

Levels outside the range of standard values are tested against the closest standard level.
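The month blending and the equation (1A) interpolation described above can be sketched as follows; the mapping of calendar months onto the four seasons is an assumption consistent with the definition given above (season 1 = June, July, August, with July as its center month).

    import math

    def limit_at_pressure(p_obs, p_below, lim_below, p_above, lim_above):
        """Equation (1A): interpolate a climatological limit (in log pressure)
        from the bracketing standard levels to a significant level pressure."""
        slope = (lim_above - lim_below) / math.log(p_above / p_below)
        return lim_below + slope * math.log(p_obs / p_below)

    def limit_for_month(month, season_limits):
        """Blend seasonal limits to a calendar month: the seasonal value is used
        for center months (July, October, January, April); other months use
        2/3 of their season plus 1/3 of the nearest adjacent season.

        season_limits: limit values indexed by season 1..4."""
        season_of = {6: 1, 7: 1, 8: 1, 9: 2, 10: 2, 11: 2,
                     12: 3, 1: 3, 2: 3, 3: 4, 4: 4, 5: 4}
        if month in (7, 10, 1, 4):                    # center months
            return season_limits[season_of[month]]
        adjacent = {6: 4, 8: 2, 9: 1, 11: 3, 12: 2, 2: 4, 3: 3, 5: 1}
        return (2.0 * season_limits[season_of[month]] + season_limits[adjacent[month]]) / 3.0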

Standard Level 1: 1000 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -20.  12.   -40.   8.   -60.   5.   -46.   6.
  2   -20.  29.   -35.  25.   -60.  20.   -42.  25.
  3    -8.  29.   -28.  29.   -50.  29.   -34.  29.
  4    -4.  44.   -10.  40.   -20.  35.   -16.  39.
  5     5.  55.     2.  40.    -8.  36.     2.  40.
  6     5.  45.     5.  45.     5.  45.     5.  45.
  7     5.  45.     5.  45.     5.  45.     5.  45.
  8    -8.  36.     2.  40.     5.  55.     2.  40.
  9   -20.  35.   -16.  39.    -4.  48.   -10.  40.
 10   -50.  29.   -34.  29.    -8.  35.   -28.  29.
 11   -60.  20.   -42.  25.   -20.  29.   -35.  25.
 12   -60.   5.   -46.   6.   -20.  17.   -40.   8.

Standard Level 2: 925 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -25.  10.   -38.   6.   -55.   3.   -43.   5.
  2   -25.  23.   -35.  18.   -55.  14.   -40.  18.
  3   -13.  26.   -32.  24.   -48.  22.   -34.  23.
  4    -5.  38.   -18.  35.   -24.  30.   -19.  35.
  5     1.  43.    -3.  36.   -10.  34.    -3.  36.
  6     3.  38.     3.  38.     3.  38.     3.  38.
  7     3.  38.     3.  38.     3.  38.     3.  38.
  8   -10.  34.    -3.  36.     1.  45.     3.  36.
  9   -24.  30.   -19.  35.    -5.  42.   -18.  35.
 10   -48.  22.   -34.  23.   -13.  31.   -32.  24.
 11   -55.  14.   -40.  18.   -25.  25.   -35.  18.
 12   -55.   3.   -43.   5.   -25.  12.   -38.   6.

Standard Level 3: 850 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -30.   8.   -35.   3.   -50.   0.   -40.   3.
  2   -30.  16.   -35.  10.   -50.   8.   -38.  10.
  3   -17.  22.   -35.  19.   -45.  15.   -34.  16.
  4    -5.  32.   -25.  30.   -27.  25.   -22.  30.
  5    -3.  31.    -8.  31.   -12.  31.    -8.  31.
  6     0.  31.     0.  31.     0.  31.     0.  31.
  7     0.  31.     0.  31.     0.  31.     0.  31.
  8   -12.  31.    -8.  31.    -3.  35.    -8.  31.
  9   -27.  25.   -22.  30.    -5.  36.   -25.  30.
 10   -45.  15.   -34.  16.   -17.  27.   -35.  19.
 11   -50.   8.   -38.  10.   -30.  20.   -35.  10.
 12   -50.   0.   -40.   3.   -30.  12.   -35.   3.

Standard Level 4: 700 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -45.   2.   -45.  -3.   -55.  -5.   -45.  -3.
  2   -45.   8.   -45.   5.   -50.   7.   -45.  10.
  3   -25.  12.   -42.  10.   -50.  10.   -34.  10.
  4   -15.  17.   -35.  18.   -40.  15.   -29.  15.
  5   -10.  18.   -13.  18.   -20.  20.   -18.  19.
  6   -10.  18.   -10.  19.   -10.  20.   -10.  19.
  7   -10.  20.   -10.  19.   -10.  18.   -10.  19.
  8   -20.  20.   -18.  19.   -10.  20.   -13.  18.
  9   -40.  15.   -29.  15.   -15.  22.   -35.  18.
 10   -50.  10.   -34.  10.   -25.  14.   -42.  10.
 11   -50.   7.   -45.  10.   -45.  10.   -45.   5.
 12   -50.  -5.   -45.  -3.   -45.   2.   -45.  -3.

Standard Level 5: 500 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -54. -10.   -55. -10.   -65. -10.   -57. -10.
  2   -54. -10.   -55. -10.   -65. -10.   -50. -10.
  3   -40.  -4.   -52.  -4.   -59.  -4.   -46.  -4.
  4   -32.   1.   -45.   0.   -50.   1.   -43.   1.
  5   -20.   4.   -24.   4.   -35.   4.   -25.   4.
  6   -24.   4.   -24.   4.   -24.   4.   -24.   4.
  7   -24.   4.   -24.   4.   -24.   4.   -24.   4.
  8   -35.   4.   -25.   4.   -20.   4.   -24.   4.
  9   -50.   1.   -43.   1.   -32.   4.   -45.   0.
 10   -59.  -4.   -46.  -4.   -40.  -4.   -52.  -4.
 11   -59. -10.   -50. -10.   -54.  -6.   -55. -10.
 12   -57. -10.   -57. -10.   -54. -10.   -55. -10.

Standard Level 6: 400 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -67. -29.   -65. -21.   -70. -21.   -65. -21.
  2   -67. -21.   -65. -20.   -70. -21.   -65. -25.
  3   -50. -14.   -60. -14.   -62. -14.   -55. -14.
  4   -45. -10.   -50.  -8.   -55.   0.   -50.  -8.
  5   -32.  -2.   -35.  -2.   -40.  -2.   -35.  -2.
  6   -29.  -3.   -29.  -2.   -29.  -3.   -29.  -2.
  7   -29.  -3.   -29.  -2.   -29.  -3.   -29.  -2.
  8   -40.  -2.   -35.  -2.   -32.  -2.   -35.  -2.
  9   -55.   0.   -50.  -8.   -45.  -6.   -50.  -8.
 10   -62. -14.   -55. -14.   -50. -14.   -60. -14.
 11   -65. -21.   -65. -25.   -67. -21.   -65. -20.
 12   -60. -21.   -65. -21.   -67. -27.   -65. -21.

Standard Level 7: 300 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -72. -40.   -70. -35.   -76. -27.   -72. -35.
  2   -70. -37.   -70. -32.   -75. -27.   -70. -33.
  3   -60. -30.   -68. -22.   -70. -25.   -65. -30.
  4   -55. -25.   -62. -22.   -66.  -4.   -64. -20.
  5   -47. -17.   -50. -17.   -54.  -8.   -50. -17.
  6   -45. -16.   -45. -17.   -45. -16.   -45. -17.
  7   -45. -16.   -45. -17.   -45. -16.   -45. -17.
  8   -54.  -8.   -50. -17.   -47. -17.   -50. -17.
  9   -66.  -4.   -64. -20.   -55. -21.   -62. -22.
 10   -70. -25.   -65. -30.   -60. -30.   -68. -22.
 11   -72. -27.   -70. -33.   -70. -37.   -70. -32.
 12   -72. -27.   -72. -35.   -72. -40.   -70. -35.

Standard Level 8: 250 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -73. -40.   -74. -35.   -76. -30.   -75. -35.
  2   -73. -37.   -74. -35.   -76. -30.   -74. -35.
  3   -65. -35.   -70. -31.   -76. -28.   -72. -35.
  4   -62. -34.   -68. -31.   -76.  -9.   -72. -32.
  5   -60. -20.   -60. -31.   -66. -18.   -60. -32.
  6   -54. -20.   -54. -33.   -54. -20.   -54. -32.
  7   -54. -20.   -54. -32.   -54. -20.   -54. -33.
  8   -66. -18.   -60. -32.   -60. -20.   -60. -31.
  9   -76.  -9.   -72. -32.   -62. -30.   -68. -31.
 10   -76. -28.   -72. -35.   -65. -35.   -70. -31.
 11   -76. -30.   -74. -35.   -73. -37.   -74. -35.
 12   -76. -30.   -75. -35.   -73. -38.   -74. -35.

Standard Level 9: 200 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -70. -39.   -75. -35.   -81. -32.   -77. -35.
  2   -70. -36.   -75. -35.   -82. -32.   -77. -35.
  3   -70. -35.   -75. -35.   -78. -28.   -73. -37.
  4   -66. -35.   -75. -35.   -76. -20.   -74. -37.
  5   -66. -29.   -70. -42.   -75. -28.   -75. -42.
  6   -72. -29.   -70. -44.   -72. -29.   -78. -45.
  7   -72. -29.   -78. -45.   -72. -29.   -70. -44.
  8   -75. -28.   -75. -42.   -66. -29.   -70. -42.
  9   -76. -20.   -74. -37.   -66. -35.   -75. -35.
 10   -78. -28.   -73. -37.   -70. -35.   -75. -35.
 11   -82. -32.   -77. -35.   -70. -36.   -75. -35.
 12   -81. -32.   -77. -35.   -70. -36.   -75. -35.

Standard Level 10: 150 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -67. -30.   -75. -35.   -84. -37.   -79. -35.
  2   -70. -33.   -80. -35.   -84. -35.   -79. -35.
  3   -74. -35.   -80. -35.   -85. -28.   -77. -39.
  4   -75. -42.   -80. -40.   -85. -33.   -76. -40.
  5   -76. -42.   -81. -50.   -84. -35.   -80. -50.
  6   -76. -42.   -78. -50.   -78. -42.   -80. -50.
  7   -78. -42.   -80. -50.   -76. -42.   -78. -50.
  8   -84. -35.   -80. -50.   -76. -42.   -81. -50.
  9   -85. -33.   -76. -40.   -75. -42.   -80. -40.
 10   -85. -28.   -77. -39.   -74. -35.   -80. -35.
 11   -84. -35.   -79. -35.   -70. -33.   -80. -35.
 12   -84. -37.   -79. -35.   -67. -30.   -75. -35.

Standard Level 11: 100 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -65. -30.   -80. -35.   -87. -38.   -79. -35.
  2   -62. -30.   -80. -35.   -87. -36.   -79. -35.
  3   -72. -40.   -80. -35.   -87. -28.   -78. -39.
  4   -82. -46.   -80. -45.   -87. -39.   -81. -45.
  5   -90. -57.   -90. -59.   -91. -50.   -90. -60.
  6   -95. -54.   -95. -60.   -95. -54.   -95. -60.
  7   -95. -54.   -95. -60.   -95. -54.   -95. -60.
  8   -91. -50.   -90. -60.   -90. -57.   -90. -59.
  9   -87. -39.   -81. -45.   -82. -46.   -80. -45.
 10   -87. -28.   -78. -39.   -72. -40.   -80. -35.
 11   -85. -36.   -79. -35.   -62. -30.   -80. -35.
 12   -85. -38.   -79. -35.   -65. -30.   -80. -35.

Standard Level 12: 70 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -56. -29.   -80. -35.   -91. -38.   -79. -30.
  2   -54. -22.   -80. -35.   -89. -32.   -79. -30.
  3   -70. -36.   -80. -35.   -95. -32.   -75. -38.
  4   -80. -46.   -80. -43.   -95. -36.   -76. -40.
  5   -88. -57.   -90. -56.   -95. -48.   -84. -57.
  6   -95. -55.   -95. -58.   -95. -55.   -95. -58.
  7   -95. -55.   -95. -58.   -95. -55.   -95. -58.
  8   -95. -48.   -84. -57.   -88. -57.   -90. -56.
  9   -95. -36.   -76. -40.   -80. -46.   -80. -43.
 10   -95. -32.   -75. -38.   -70. -36.   -80. -35.
 11   -89. -32.   -79. -30.   -54. -22.   -80. -35.
 12   -89. -38.   -79. -30.   -56. -29.   -80. -35.

Standard Level 13: 50 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -50. -22.   -80. -28.   -95. -34.   -81. -25.
  2   -52. -18.   -80. -35.   -95. -28.   -81. -25.
  3   -65. -33.   -80. -35.   -95. -28.   -76. -38.
  4   -75. -45.   -90. -40.   -95. -28.   -76. -40.
  5   -80. -54.   -80. -51.   -90. -45.   -78. -46.
  6   -85. -50.   -85. -51.   -90. -52.   -85. -51.
  7   -90. -52.   -85. -51.   -85. -50.   -85. -51.
  8   -90. -45.   -78. -46.   -80. -54.   -80. -51.
  9   -95. -28.   -76. -40.   -75. -45.   -90. -40.
 10   -95. -28.   -76. -38.   -65. -33.   -80. -35.
 11   -89. -28.   -81. -25.   -52. -18.   -80. -35.
 12   -89. -34.   -81. -25.   -50. -22.   -80. -28.

Standard Level 14: 30 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -46. -16.   -80. -28.   -95. -34.   -84. -20.
  2   -51. -18.   -80. -28.   -95. -25.   -84. -20.
  3   -60. -30.   -90. -31.   -95. -25.   -78. -35.
  4   -67. -42.   -90. -35.   -90. -25.   -78. -40.
  5   -70. -43.   -70. -41.   -90. -30.   -78. -41.
  6   -79. -41.   -70. -41.   -79. -41.   -70. -41.
  7   -79. -41.   -70. -41.   -79. -41.   -70. -41.
  8   -90. -30.   -78. -41.   -70. -43.   -70. -41.
  9   -90. -25.   -78. -40.   -67. -42.   -90. -35.
 10   -95. -25.   -78. -35.   -60. -30.   -90. -31.
 11   -90. -25.   -84. -20.   -51. -18.   -80. -28.
 12   -90. -34.   -84. -20.   -46. -16.   -80. -28.

Standard Level 15: 20 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -44.  -7.   -85. -28.   -95. -31.   -84. -20.
  2   -47.  -7.   -85. -28.   -95. -21.   -84. -18.
  3   -55. -25.   -84. -28.   -92. -20.   -79. -28.
  4   -60. -34.   -80. -32.   -80. -20.   -67. -32.
  5   -65. -34.   -65. -32.   -80. -28.   -62. -32.
  6   -78. -35.   -63. -35.   -78. -30.   -62. -35.
  7   -78. -30.   -62. -35.   -78. -35.   -63. -35.
  8   -80. -28.   -62. -32.   -65. -34.   -65. -32.
  9   -80. -20.   -67. -32.   -60. -34.   -80. -32.
 10   -92. -20.   -79. -28.   -55. -25.   -84. -28.
 11   -89. -21.   -84. -18.   -47.  -7.   -85. -28.
 12   -89. -31.   -84. -20.   -44.  -7.   -85. -28.

Standard Level 16: 10 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -39.  17.   -80. -20.   -92. -30.   -75. -18.
  2   -41.  12.   -88. -18.   -92. -18.   -75. -18.
  3   -45.  -6.   -82. -18.   -90. -18.   -73. -18.
  4   -52. -20.   -75. -20.   -78. -20.   -64. -20.
  5   -56. -23.   -60. -24.   -78. -24.   -55. -24.
  6   -77. -26.   -53. -26.   -78. -26.   -53. -26.
  7   -78. -26.   -53. -26.   -77. -26.   -53. -26.
  8   -78. -24.   -55. -24.   -56. -23.   -60. -24.
  9   -78. -20.   -64. -20.   -52. -20.   -75. -20.
 10   -90. -18.   -73. -18.   -45.  -6.   -82. -18.
 11   -88. -18.   -75. -18.   -41.  12.   -88. -18.
 12   -88. -30.   -75. -18.   -39.  17.   -80. -20.

Standard Level 17: 7 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -35.  22.   -70. -18.   -87. -31.   -70. -18.
  2   -38.  17.   -75. -18.   -85. -24.   -70. -18.
  3   -40.  -2.   -80. -18.   -80. -21.   -66. -18.
  4   -45. -15.   -70. -18.   -73. -19.   -60. -14.
  5   -58. -19.   -60. -20.   -73. -20.   -50. -20.
  6   -76. -20.   -48. -20.   -76. -20.   -48. -20.
  7   -76. -20.   -48. -20.   -76. -20.   -48. -20.
  8   -73. -20.   -50. -20.   -58. -19.   -60. -20.
  9   -73. -19.   -60. -14.   -45. -15.   -70. -18.
 10   -80. -21.   -66. -18.   -40.  -2.   -80. -18.
 11   -77. -24.   -70. -18.   -38.  17.   -75. -18.
 12   -79. -31.   -70. -18.   -35.  22.   -70. -18.

Standard Level 18: 5 mb
Lat Zone   Season 1     Season 2     Season 3     Season 4
           min   max    min   max    min   max    min   max
  1   -35.  25.   -66. -19.   -81. -42.   -70. -16.
  2   -35.  20.   -70. -17.   -81. -20.   -70. -11.
  3   -38.   0.   -75. -15.   -80. -15.   -66. -12.
  4   -40.  -3.   -70. -12.   -71. -15.   -60.  -9.
  5   -58.  -8.   -60. -10.   -71. -12.   -45.  -9.
  6   -71.  -8.   -45.  -8.   -71.  -8.   -45.  -8.
  7   -71.  -8.   -45.  -8.   -71.  -8.   -45.  -8.
  8   -71. -12.   -45.  -9.   -58.  -8.   -60. -10.
  9   -71. -15.   -60.  -9.   -40.  -3.   -70. -12.
 10   -75. -15.   -66. -12.   -38.   0.   -75. -15.
 11   -75. -20.   -70. -11.   -35.  20.   -70. -17.
 12   -75. -42.   -70. -16.   -35.  25.   -66. -19.
