Quality Assurance Procedures in the Oklahoma Mesonetwork

Total Pages: 16

File Type: PDF, Size: 1,020 KB

MARK A. SHAFER, CHRISTOPHER A. FIEBRICH, AND DEREK S. ARNDT
Oklahoma Climatological Survey, Norman, Oklahoma

SHERMAN E. FREDRICKSON*
National Severe Storms Laboratory, Norman, Oklahoma

TIMOTHY W. HUGHES
Environmental Verification and Analysis Center, Norman, Oklahoma

(Manuscript received 6 April 1999, in final form 14 June 1999)

ABSTRACT

High quality data sources are critical to scientists, engineers, and decision makers alike. The models that scientists develop and test with quality-assured data eventually become used by a wider community, from policy makers' long-term strategies based upon weather and climate predictions to emergency managers' decisions to deploy response crews. The process of developing high quality data in one network, the Oklahoma Mesonetwork (Mesonet), is detailed in this manuscript.

The Oklahoma Mesonet quality-assurance procedures consist of four principal components: an instrument laboratory, field visits, automated computer routines, and manual inspection. The instrument laboratory ensures that all sensors that are deployed in the network measure up to high standards established by the Mesonet Steering Committee. Routine and emergency field visits provide a manual inspection of the performance of the sensors and replacement as necessary. Automated computer routines monitor data each day, set data flags as appropriate, and alert personnel of potential errors in the data. Manual inspection provides human judgment to the process, catching subtle errors that automated techniques may miss.

The quality-assurance (QA) process is tied together through efficient communication links. A QA manager serves as the conduit through whom all questions concerning data quality flow. The QA manager receives daily reports from the automated system, issues trouble tickets to guide the technicians in the field, and issues summary reports to the broader community of data users. Technicians and other Mesonet staff remain in contact through cellular communications, pagers, and the World Wide Web. Together, these means of communication provide a seamless system: from identifying suspicious data, to field investigations, to feedback on action taken by the technician.

1. Introduction

The Oklahoma Mesonetwork (Mesonet), developed through a partnership between the University of Oklahoma and Oklahoma State University, is a permanent mesoscale weather observation network. Care was taken along every step of the process to ensure that the Oklahoma Mesonet would provide research-quality data. The procedures documented in this manuscript are designed to ensure this quality, from the earliest planning stages of the network through operational data monitoring and long-term analyses. This manuscript details quality assurance (QA) procedures developed through the course of building the Mesonet and employed operationally in May 1999.

The Oklahoma Mesonet operates 115 stations on a continuous basis (Fig. 1). Thirteen atmospheric and subsurface variables (hereafter, parameters) are recorded every 5 min at each site, producing 288 observations of each parameter per station per day (Elliott et al. 1994; Brock et al. 1995). Several other parameters are observed every 15 or 30 min. From its commissioning in March 1994 through May 1999, the Oklahoma Mesonet has successfully collected and archived 99.9% of over 75 million possible observations. Because of this continuous observation cycle, a need existed to ensure the quality of data coming from over 2500 instruments. A comprehensive QA system was developed to complement the network's efficient collection and transmission of environmental observations. The system utilizes feedback from an instrumentation laboratory, field comparisons, automated tests, and visual analyses to recognize and catalog suspect and/or erroneous observations. Efficient communication between all components in the QA system is essential to quickly replace questionable instruments.

* Additional affiliation: Oklahoma Climatological Survey, Mesonet Project, Norman, Oklahoma.

Corresponding author address: Mark A. Shafer, Oklahoma Climatological Survey, 100 E. Boyd St., Suite 1210, Norman, OK 73019. E-mail: [email protected]

© 2000 American Meteorological Society
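The sampling figures quoted in the introduction are easy to verify: a 5-min interval yields 288 records per parameter per station per day, and 13 parameters at 115 stations imply roughly 430,000 parameter observations network-wide each day. The minimal back-of-the-envelope sketch below is not part of the paper; the interval, parameter, station, and archival figures are taken from the text above, and everything else is simple arithmetic for illustration.

```python
# Back-of-the-envelope check of the observation counts quoted in the introduction.
# The 5-min interval, 13 parameters, 115 stations, and 99.9% archival rate come
# from the paper; the derived numbers are illustrative arithmetic only.

MINUTES_PER_DAY = 24 * 60
SAMPLE_INTERVAL_MIN = 5          # core parameters are recorded every 5 minutes
PARAMETERS = 13                  # atmospheric and subsurface parameters per site
STATIONS = 115                   # stations operating continuously
ARCHIVE_RATE = 0.999             # fraction of possible observations archived

obs_per_param_per_day = MINUTES_PER_DAY // SAMPLE_INTERVAL_MIN   # 288
network_obs_per_day = obs_per_param_per_day * PARAMETERS * STATIONS

print(f"Observations per parameter per station per day: {obs_per_param_per_day}")
print(f"Network-wide parameter observations per day:    {network_obs_per_day:,}")
print(f"Expected daily shortfall at 99.9% archival:      "
      f"{network_obs_per_day * (1 - ARCHIVE_RATE):.0f} observations")
```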
FIG. 1. Map of Mesonet site locations. Land ownership for each of the sites is according to the symbols defined in the legend. There are 42 sites located in the ARS Micronet in southwest Oklahoma.

2. Network design considerations

The Mesonet Steering Committee established 11 subcommittees, each drawing upon the experience of experts within their respective fields. Seven of these subcommittees (see appendix) offered recommendations pertaining to the quality of data recorded within the network. The subcommittees represented three focus areas. The Site Standards and Site Selection committees developed a consistent set of criteria for site selection and provided guidance in locating sites. The Parameter Selection and Sensor Specification committees evaluated many diverse requests for instrumentation to be installed on the Mesonet towers and developed recommendations for instrument purchases. The Station Maintenance, Quality Assurance, and Data Management committees developed guidelines to maintain the quality of and access to data once the network was established.

The Site Standards Committee recommended that sites should be located in rural areas, representative of as large an area as possible, and flat, with all obstacles being at a distance of more than 300 m away from the wind sensors (Shafer et al. 1993). These guidelines were followed closely; however, in a few cases, some guidelines could not be met. These cases resulted from 1) the nature of the terrain; 2) a lack of suitable sites offered; 3) the a priori decision to locate on particular parcels of land, such as existing agricultural research stations. Site photographs and documentation are available online at http://okmesonet.ocs.ou.edu/ so data users can "visit" a site.

The Site Standards Committee also provided guidance for site layout and parameter selection. Wind speed and direction are measured at a height of 10 m to match World Meteorological Organization (WMO) standards, and temperature and relative humidity are measured at 1.5 m for consistency with existing National Oceanic and Atmospheric Administration cooperative observations and airport stations. The characteristics of Mesonet sites are also generally consistent with standards recommended by the American Association of State Climatologists (AASC) for automated weather station networks. The layout for all Mesonet sites is depicted in Fig. 2. The tower stands nearly in the center of a 10 m x 10 m enclosure. It is surrounded by a cattle-panel fence, 1.3 m high, to secure the area from animals and nearby human activity.

One difference between Mesonet and AASC recommendations is in the height of the wind monitor. Meyer and Hubbard (1992) note that the AASC's recommended height of 3 m for wind measurements is a compromise between the expense of installing 10-m towers and problems with exposure affecting wind measurements at lower heights. Because the Site Standards Committee recommended installation of 10-m towers for wind measurements, the AASC's concerns for exposure are mitigated. A second difference concerns the height at which the rain gauge is mounted. The AASC recommends a height of 1.0 m to reduce splash effects, while the WMO recommends a height of 0.3 m to reduce wind effects. The AASC's concerns of splash effects were resolved by modifications to the rain gauge made by Mesonet staff (discussed in section 4b). Installation of wind screens around the rain gauges to reduce turbulence in the vicinity of the gauge orifice addresses the WMO concerns.
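To make the siting and mounting standards above concrete, the sketch below encodes them as a small, checkable record. The class, field names, and checks are hypothetical (this is not Mesonet software); the numeric values (10-m wind height, 1.5-m temperature/humidity height, 300-m obstacle clearance, 1.3-m fence) come from the text above.

```python
# Illustrative encoding of the siting/mounting guidelines described above.
# This is a hypothetical structure, not the Mesonet's actual metadata schema.
from dataclasses import dataclass

@dataclass
class SiteMetadata:
    site_id: str
    wind_sensor_height_m: float        # 10 m, per WMO guidance
    temp_rh_height_m: float            # 1.5 m, for consistency with NOAA coop/airport obs
    nearest_obstacle_m: float          # distance from wind sensor to nearest obstacle
    enclosure_fence_height_m: float    # cattle-panel fence around the 10 m x 10 m plot

    def siting_issues(self) -> list[str]:
        """Return a list of guideline departures (an empty list means compliant)."""
        issues = []
        if self.wind_sensor_height_m != 10.0:
            issues.append("wind sensor not at the 10-m standard height")
        if self.temp_rh_height_m != 1.5:
            issues.append("temperature/RH sensors not at 1.5 m")
        if self.nearest_obstacle_m < 300.0:
            issues.append("obstacle closer than 300 m to the wind sensor")
        return issues

# Example: a hypothetical site that meets the layout guidelines.
site = SiteMetadata("NRMN", 10.0, 1.5, 450.0, 1.3)
print(site.siting_issues())   # -> []
```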
In addition to general guidelines for the network, the Site Selection Committee recommended [...] projects. The intercomparison site also allows evaluation of proposed instrument changes, the addition of new instruments, and changes in network configuration, without affecting data collected from operational sites.

The Sensor Specification Committee developed performance specifications for each instrument individually, designing the specifications to meet criteria for both research and operational purposes. The make and model of each instrument were selected separately for each parameter, allowing uniformity among those sensors deployed in the field. By equipping operational sites with similar sensors, the potential of measurement bias, when comparing measurements between sites, is reduced. This strategy also allows technicians to draw from a common stock of spare sensors when sensor replacement is required. By using similar instruments at all field sites, a user does not need to be concerned by different instrument error characteristics when comparing data between sites.

The remaining committees provided guidance relating to collection and archival of data. In particular, the Quality Assurance Committee recommended the following steps to develop a quality data stream:

  • laboratory calibrations to test sensors at delivery and at routine intervals thereafter;
  • field intercomparisons, both during technician visits and through site intercomparisons;
  • real-time, automated data-monitoring software;
  • documentation of sites and processes;
  • independent review; and
  • publication of data quality assessment.

While not all recommendations have been fully implemented, the following sections document steps taken toward meeting these recommendations.

FIG. 2. Schematic of Mesonet instrument site layout. The perspective is a side view of the station looking northward.

3. Overview of Mesonet QA processes

The need for thorough and efficient data archival and retrieval, combined with the need to optimize daily performance of the network, dictated a dual purpose for
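The "real-time, automated data-monitoring software" recommended above is the subject of later sections of the paper. As a rough illustration of the idea only, the sketch below applies two generic checks (a climatological range test and a step test between consecutive 5-min values) and attaches a flag to each observation; the thresholds, flag names, and function are assumptions for this example and are not the Mesonet's operational QA algorithms.

```python
# Generic sketch of automated QA flagging for a 5-min time series.
# The range limits and step threshold below are made-up illustration values,
# not the Oklahoma Mesonet's operational tests or thresholds.

def flag_series(values, lower, upper, max_step):
    """Return a QA flag for each observation: 'ok', 'missing', 'range', or 'step'."""
    flags = []
    previous = None
    for value in values:
        if value is None:
            flags.append("missing")
        elif not (lower <= value <= upper):
            flags.append("range")          # outside plausible climatological limits
        elif previous is not None and abs(value - previous) > max_step:
            flags.append("step")           # implausible jump between consecutive samples
        else:
            flags.append("ok")
        previous = value if value is not None else previous
    return flags

# Example: 1.5-m air temperature (deg C) with one spike and one out-of-range value.
temps = [21.3, 21.4, 35.0, 21.6, None, 60.2]
print(flag_series(temps, lower=-30.0, upper=50.0, max_step=5.0))
# -> ['ok', 'ok', 'step', 'step', 'missing', 'range']
```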
Recommended publications
  • University of Oklahoma
    UNIVERSITY OF OKLAHOMA GRADUATE COLLEGE INVESTIGATION OF POLARIMETRIC MEASUREMENTS OF RAINFALL AT CLOSE AND DISTANT RANGES A DISSERTATION SUBMITTED TO THE GRADUATE FACULTY in partial fulfillment of the requirements for the degree of Doctor of Philosophy By SCOTT EDWARD GIANGRANDE Norman, Oklahoma 2007 UMI Number: 3291249 UMI Microform 3291249 Copyright 2008 by ProQuest Information and Learning Company. All rights reserved. This microform edition is protected against unauthorized copying under Title 17, United States Code. ProQuest Information and Learning Company 300 North Zeeb Road P.O. Box 1346 Ann Arbor, MI 48106-1346 INVESTIGATION OF POLARIMETRIC MEASUREMENTS OF RAINFALL AT CLOSE AND DISTANT RANGES A DISSERTATION APPROVED FOR THE SCHOOL OF METEOROLOGY BY ____________________________________ Dr. Michael Biggerstaff ____________________________________ Dr. Alexander Ryzhkov ____________________________________ Dr. Jerry Straka ____________________________________ Dr. Guifu Zhang ____________________________________ Dr. Mark Yeary © Copyright by SCOTT EDWARD GIANGRANDE 2007 All Rights Reserved. ACKNOWLEDGEMENTS I would like to extend my sincerest thanks to the numerous individuals who have helped me complete this work. To begin, this work would not have been possible without the guidance of my primary research advisor, Dr. Alexander Ryzhkov. His leadership and patience were instrumental throughout this process. I would also like to extend my gratitude to the other members of my committee: Drs. Michael Biggerstaff (Co-Chair and primary OU School of Meteorology advisor), Guifu Zhang, Jerry Straka, and Mark Yeary. The reviews performed by these individuals strengthened this work. I also thank the members of the Radar Research and Development Division (RRDD) at the National Severe Storms Laboratory (NSSL), which includes Drs. Douglas Forsyth, Dusan Zrnic, Dick Doviak, Allen Zahrai, Terry Schuur, Pam Heinselman, Valery Melnikov, Sebastian Torres, Pengfei Zhang and Svetlana Bachmann.
  • 2020 Infra Surface Weather Observations
    Surface Weather Observations: Comparison of Various Observing Systems. Scott Landolt & Matthias Steiner, National Center for Atmospheric Research, [email protected]. USHST Infrastructure Summit, 12-13 March 2020, Washington, DC.
    Surface Stations & Reporting Frequency: Automated Surface Observing System (ASOS), 5 minutes (limited access to 1-minute data); Automated Weather Observing System (AWOS), 20 minutes; Road Weather Information System (RWIS), 15 minutes (standard), can be more frequent but varies state to state and even site to site; Mesonet, 5-15 minutes, can vary from station to station (Iowa station network shown).
    Reporting Variables: Temperature, Relative Humidity/Dewpoint, Wind Speed/Direction, Barometric Pressure, Ceiling Height, Visibility, Present Weather, Precipitation Accumulation, and Road Condition, tabulated for ASOS, AWOS, RWIS, and Mesonet with a legend indicating whether all, some, or no stations of each type report the variable.
    Station Siting Requirements: ASOS and AWOS, airport grounds, unobstructed, areal representativeness of miles (varies depending on local conditions & weather); RWIS, next to roadways, can be in canyons, valleys, mountain
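The excerpt above compares how often each class of surface station reports. As a rough illustration of what those nominal intervals mean for daily data volume, the sketch below converts them into reports per station per day; the dictionary and loop are assumptions for this example, and where the excerpt gives a range (e.g., Mesonet 5-15 min) a representative value is used.

```python
# Rough comparison of daily report counts implied by the nominal reporting
# intervals quoted in the excerpt above (representative values only).
nominal_interval_min = {
    "ASOS": 5,        # 1-min data exist but with limited access
    "AWOS": 20,
    "RWIS": 15,       # standard; varies by state and site
    "Mesonet": 5,     # 5-15 min depending on the network
}

for station_type, interval in nominal_interval_min.items():
    reports_per_day = (24 * 60) // interval
    print(f"{station_type:8s} every {interval:2d} min -> {reports_per_day:4d} reports/day")
```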
  • Relative Forecast Impact from Aircraft, Profiler, Rawinsonde, VAD, GPS-PW, METAR and Mesonet Observations for Hourly Assimilation in the RUC
    16.2 Relative forecast impact from aircraft, profiler, rawinsonde, VAD, GPS-PW, METAR and mesonet observations for hourly assimilation in the RUC Stan Benjamin, Brian D. Jamison, William R. Moninger, Barry Schwartz, and Thomas W. Schlatter NOAA Earth System Research Laboratory, Boulder, CO 1. Introduction A series of experiments was conducted using the Rapid Update Cycle (RUC) model/assimilation system in which various data sources were denied to assess the relative importance of the different data types for short-range (3h-12h duration) wind, temperature, and relative humidity forecasts at different vertical levels. This assessment of the value of 7 different observation data types (aircraft (AMDAR and TAMDAR), profiler, rawinsonde, VAD (velocity azimuth display) winds, GPS precipitable water, METAR, and mesonet) on short-range numerical forecasts was carried out for a 10-day period from November-December 2006. 2. Background Observation system experiments (OSEs) have been found very useful to determine the impact of particular observation types on operational NWP systems (e.g., Graham et al. 2000, Bouttier 2001, Zapotocny et al. 2002). This new study is unique in considering the effects of most of the currently assimilated high-frequency observing systems in a 1-h assimilation cycle. The previous observation impact experiments reported in Benjamin et al. (2004a) were primarily for wind profiler and only for effects on wind forecasts. This new impact study is much broader than the previous study, now for more observation types, and for three forecast fields: wind, temperature, and moisture. Here, a set of observational sensitivity experiments (Table 1) was carried out for a recent winter period using 2007 versions of the Rapid Update Cycle assimilation system and forecast model.
  • Weatherscope Weatherscope Application Information: Weatherscope Is a Stand-Alone Application That Makes Viewing Weather Data Easier
    User Guide - Macintosh http://earthstorm.ocs.ou.edu WeatherScope WeatherScope Application Information: WeatherScope is a stand-alone application that makes viewing weather data easier. To run WeatherScope, Mac OS X version 10.3.7, a minimum of 512MB of RAM, and an accelerated graphics card with 32MB of VRAM are required. WeatherScope is distributed freely for noncommercial and educational use and can be used on both Apple Macintosh and Windows operating systems. How do I Download WeatherScope? To download the application, go to http://earthstorm.ocs.ou.edu, select Data, Software, Download, or go to http://www.ocs.ou.edu/software. There will be three options: WeatherBuddy, WeatherScope, and WxScope Plugin. You will want to choose WeatherScope. There are two options under the application: Macintosh or Windows. Choose Macintosh to download the application. The installation wizard will automatically save to your desktop. Go to your desktop and double click on the icon that says WeatherScope-x.x.x.pkg. Several dialog messages will appear. The first message will inform you that you are about to install the application. The next message tells you about computer system requirements in order to download the application. The following message is the Software License Agreement. It is strongly suggested that you read this agreement. If you agree, click Agree. If you do not agree, click Disagree and the software will not be installed onto your computer. The next message asks you to select a destination drive (usually your hard drive). The setup will run and install the software on your computer. You may then press Close.
  • Storm Preparedness Kits –By Sparky Smith HELLO
    www.mesonet.org Volume 5 — Issue 7 — July 2014 connection Storm Preparedness Kits –by Sparky Smith HELLO. MY NAME IS SPARKY SMITH, and I am an pieces everywhere, but never the complete thing. I compiled Eagle Scout. First I will talk a little bit about myself. Then I all the lists and added a few things I thought were needed will tell you about how I came up with the idea for my project, in every emergency. The best way to get your project and actually show you how to build your own. I will also talk approved is to make it relatively inexpensive with easy to about the constraints and guidelines that I had to deal with obtain materials, and a personal aspect to show that you while doing the project. actually put thought into it. I have always been interested in weather. I went to the very One of the first things I did was go to a graphics design store first Mesonet Weather Camp, where I learned more about and had a nice looking label made to put on the storm kits, what kind of disasters occur and what other dangers they so people would know it was more than a five gallon bucket. bring. A storm kit has to last at least 1 day because many Thankfully, most of the products were donated, including the other dangers can arise after storms. For instance, you label. I made 20 storm kits, which would have cost about could be trapped in your shelter by debris or even have $3000, but it ended up costing about $1000-$1200.
  • Reading Mesonet Rain Maps
    Oklahoma’s Weather Network Lesson 2 - Rain Maps Reading Mesonet Rain Maps Estimated Lesson Time: 30 minutes Introduction Every morning when you get ready for school, you decide what you are going to wear for the day. Often you might ask your parents what the weather is like or check the weather yourself before getting dressed. Then you can decide if you will wear a t-shirt or sweater, flip-flops or rain boots. The Oklahoma Mesonet, www.mesonet.org, is a weather and climate network covering the state. The Mesonet collects measurements such as air temperature, rainfall, wind speed, and wind direction, every five minutes. These measurements are provided free to the public online. The Mesonet has 120 remote weather stations across the state collecting data. There is at least one in every county, which means there is one located near you. Our data is used by people across the state. Farmers use our data to grow their crops, and firefighters use it to help put out a fire. Emergency managers in your town use it to warn you of tornadoes, and sound the town’s sirens. Mesonet rainfall data gives a statewide view, updated every five minutes. When reading the Mesonet rainfall accumulation maps, notice each Mesonet site displays accumulated rainfall. The map also displays the National Weather Service (NWS) River Forecast Center’s rainfall estimates (in color) across Oklahoma based on radar (an instrument that can locate precipitation and its motion). For example, areas in blue have lower rainfall than areas in red or purple.
  • The Effect of Climatological Variables on Future UAS-Based Atmospheric Profiling in the Lower Atmosphere
    remote sensing Article The Effect of Climatological Variables on Future UAS-Based Atmospheric Profiling in the Lower Atmosphere Ariel M. Jacobs 1,2 , Tyler M. Bell 1,2,3,4 , Brian R. Greene 1,2,5 and Phillip B. Chilson 1,2,5,* 1 School of Meteorology, University of Oklahoma, 120 David L. Boren Blvd, Ste 5900, Norman, OK 73072, USA; [email protected] (A.M.J.); [email protected] (T.M.B.); [email protected] (B.R.G.) 2 Center for Autonomous Sensing and Sampling, University of Oklahoma, 120 David L. Boren Blvd., Ste 4600, Norman, OK 73072, USA 3 Cooperative Institute for Mesoscale Meteorological Studies, 120 David L. Boren Blvd., Ste 2100, Norman, OK 73072, USA 4 NOAA/OAR National Severe Storms Laboratory, 120 David L. Boren Blvd., Norman, OK 73072, USA 5 Advanced Radar Research Center, University of Oklahoma, 3190 Monitor Ave., Norman, OK 73019, USA * Correspondence: [email protected] Received: 7 August 2020; Accepted: 9 September 2020; Published: 11 September 2020 Abstract: Vertical profiles of wind, temperature, and moisture are essential to capture the kinematic and thermodynamic structure of the atmospheric boundary layer (ABL). Our goal is to use weather observing unmanned aircraft systems (WxUAS) to perform the vertical profiles by taking measurements while ascending through the ABL and subsequently descending to the Earth’s surface. Before establishing routine profiles using a network of WxUAS stations, the climatologies of the flight locations must be studied. This was done using data from the North American Regional Reanalysis (NARR) model. To begin, NARR data accuracy was verified against radiosondes.
  • Alternative Earth Science Datasets for Identifying Patterns and Events
    Alternative Earth Science Datasets For Identifying Patterns and Events. Kaylin Bugbee (1), Robert Griffin (1), Brian Freitag (1), Jeffrey Miller (1), Rahul Ramachandran (2), and Jia Zhang (3). (1) University of Alabama in Huntsville, (2) NASA MSFC, (3) Carnegie Mellon University. (https://ntrs.nasa.gov/search.jsp?R=20190002267) Earth Observation Big Data • Earth observation data volumes are growing exponentially • NOAA collects about 7 terabytes of data per day • Adds to existing 25 PB archive • Upcoming missions will generate another 5 TB per day • NASA's Earth observation data is expected to grow to 131 TB of data per day by 2022 • NISAR and other large data volume missions • Other agencies like ESA expect data volumes to continue to grow • How do we effectively explore and search through these large amounts of data? (Figure caption: Over the next five years, the daily ingest of data into the EOSDIS archive is expected to grow significantly, to more than 131 terabytes (TB) of forward processing. NASA EOSDIS image.) Alternative Data • Data which are extracted or generated from non-traditional sources • Social media data • Point of sale transactions • Product reviews • Logistics • Idea originates in investment world • Include alternative data sources in investment decision making process • Earth observation data is a growing alternative data source for investing (Image Credit: NASA) • DMSP and VIIRS nightlight data Alternative Data for Earth Science • Are there alternative data sources in the Earth sciences that can be used in a similar manner? •
  • The Center for Analysis and Prediction of Storms
    CAPS Mission Statement • The Center for Analysis and Prediction of Storms (CAPS) was established at the University of Oklahoma in 1989 as one of the first 11 National Science Foundation Science and Technology Centers. Its mission was, and remains, the development of techniques for the computer-based prediction of high-impact local weather, such as individual spring and winter storms, with the NEXRAD (WSR-88D) Doppler radar serving as a key data source. Forecast Funnel • Large Scale - provide synoptic flow patterns and boundary conditions to the regional scale flow. • Regional Forecast - provide improved resolution for predicting regional scale events (large thunderstorm complexes, squall lines, heavy precipitation events). • Storm Scale - predict individual thunderstorms and groups of thunderstorms as well as initiation of convection. ARPS System (schematic; labels include): lateral boundary conditions from large-scale models; ARPS Data Assimilation System (ARPSDAS); gridded first guess; data acquisition of incoming data (Mobile Mesonet, rawinsondes, CLASS, ACARS, SAO data, satellite, profilers, ASOS/AWOS, Oklahoma Mesonet, WSR-88D wideband); ARPS Data Analysis System (ADAS) with ingest, quality control, objective analysis, and archival; parameter retrieval and 4DDA; Single-Doppler Velocity Retrieval (SDVR); 4-D variational velocity adjustment and thermodynamic retrieval; data support system; forecast generation; product generation (ARPSPLT and ARPSVIEW).
  • TC Modelling and Data Assimilation
    Tropical Cyclone Modeling and Data Assimilation. Jason Sippel, NOAA AOML/HRD, 2021 WMO Workshop at NHC. Outline • History of TC forecast improvements in relation to model development • Ongoing developments • Future direction: a new model. History: Error trends (chart: Official TC Track Forecast Errors, 1990-2020) • Hurricane track forecasts have improved markedly • The average Day-3 forecast location error is now about what the Day-1 error was in 1990 • These improvements are largely tied to improvements in large-scale forecasts. History: Error trends (chart: Official TC Intensity Forecast Errors, 1990-2020) • Hurricane intensity forecasts have only recently improved • Improvement in intensity forecasts largely corresponds with commencement of the Hurricane Forecast Improvement Project (HFIP era). History: Error trends (chart: HWRF Intensity Skill, Day 1 to Day 5) • A significant focus of HFIP has been the development of the HWRF model • As a result, HWRF intensity has improved significantly over the past decade; HWRF skill has improved up to 60%! Michael. Talk focus: How better use of data, particularly from recon, has helped improve forecasts. History: Using TC Observations
  • Collaborative Research on Road Weather Observations and Predictions by Universities, State Departments of Transportation, and Na
    Collaborative Research on Road Weather Observations and Predictions by Universities, State Departments of Transportation, and National Weather Service Forecast Offices Publication No. FHWA-HRT-04-109 OCTOBER 2004 Research, Development, and Technology Turner-Fairbank Highway Research Center 6300 Georgetown Pike McLean, VA 22101-2296 FOREWORD This report documents the results of five research projects to improve the sensing, predictions and use of weather-related road conditions in road maintenance and operations. The primary purpose for these projects was to evaluate the use of weather observations and modeling systems to improve highway safety and to support effective decisions made by the various jurisdictions that manage the highway system. In particular, the research evaluated how environmental sensor station data, particularly Road Weather Information System (RWIS) data, could best be used for both road condition forecasting and weather forecasting. The collaborative efforts also included building better relations for training and sharing information between the meteorological and transportation agencies. These projects are unique because they each involved collaborated partnerships between National Weather Service (NWS) Weather Forecast Offices (WFOs), State departments of transportation (DOTs), and universities. Lessons learned from these projects can help all State DOTs improve how they manage RWIS networks and achieve maximum utility from RWIS investments. Sufficient copies are being distributed to provide one copy to each Federal Highway Administration (FHWA) Resource Center, two copies to each FHWA Division, and one copy to each State highway agency. Direct distribution is being made to the FHWA Divisions Offices. Additional copies may be purchased from the National Technical Information Service (NTIS), 5285 Port Royal Road, Springfield, VA 22161.
  • Evaluating NEXRAD Multisensor Precipitation Estimates for Operational Hydrologic Forecasting
    Evaluating NEXRAD Multisensor Precipitation Estimates for Operational Hydrologic Forecasting. C. BRYAN YOUNG, A. ALLEN BRADLEY, WITOLD F. KRAJEWSKI, AND ANTON KRUGER, Iowa Institute of Hydraulic Research and Department of Civil and Environmental Engineering, The University of Iowa, Iowa City, Iowa; MARK L. MORRISSEY, Environmental Verification and Analysis Center, University of Oklahoma, Norman, Oklahoma. (Manuscript received 9 August 1999, in final form 10 January 2000.) ABSTRACT Next-Generation Weather Radar (NEXRAD) multisensor precipitation estimates will be used for a host of applications that include operational streamflow forecasting at the National Weather Service River Forecast Centers (RFCs) and nonoperational purposes such as studies of weather, climate, and hydrology. Given these expanding applications, it is important to understand the quality and error characteristics of NEXRAD multisensor products. In this paper, the issues involved in evaluating these products are examined through an assessment of a 5.5-yr record of multisensor estimates from the Arkansas-Red Basin RFC. The objectives were to examine how known radar biases manifest themselves in the multisensor product and to quantify precipitation estimation errors. Analyses included comparisons of multisensor estimates based on different processing algorithms, comparisons with gauge observations from the Oklahoma Mesonet and the Agricultural Research Service Micronet, and the application of a validation framework to quantify error characteristics. This study reveals several complications to such an analysis, including a paucity of independent gauge data. These obstacles are discussed and recommendations are made to help to facilitate routine verification of NEXRAD products. 1. Introduction WSR-88D radars and a network of gauges that report observations to the RFC in near-real time (Fulton et al.
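The excerpt above describes comparing NEXRAD multisensor precipitation estimates against Oklahoma Mesonet and ARS Micronet gauge observations and quantifying error characteristics. The sketch below shows the kind of paired-sample statistics such a comparison might start from (mean bias, mean absolute error, and RMSE of collocated radar and gauge totals); the function name and sample values are assumptions for this example and do not reproduce the validation framework used in the paper.

```python
# Generic paired radar-gauge comparison statistics (illustrative only; this is
# not the validation framework of Young et al. 2000).
import math

def error_stats(radar_mm, gauge_mm):
    """Mean bias, mean absolute error, and RMSE of radar vs. gauge totals (mm)."""
    pairs = [(r, g) for r, g in zip(radar_mm, gauge_mm)
             if r is not None and g is not None]
    n = len(pairs)
    bias = sum(r - g for r, g in pairs) / n
    mae = sum(abs(r - g) for r, g in pairs) / n
    rmse = math.sqrt(sum((r - g) ** 2 for r, g in pairs) / n)
    return {"n": n, "bias_mm": bias, "mae_mm": mae, "rmse_mm": rmse}

# Hypothetical storm-total accumulations (mm) at collocated points.
radar = [12.1, 30.5, 4.2, None, 22.8]
gauge = [10.4, 33.0, 5.0, 18.2, 21.1]
print(error_stats(radar, gauge))
```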