DATA MANAGEMENT FOR MARINE GEOLOGY AND GEOPHYSICS
Tools for Archiving, Analysis, and Visualization
WORKSHOP REPORT LA JOLLA, CALIFORNIA MAY 14-16, 2001
TABLE OF CONTENTS
Executive Summary
1. Overview
   1.1 Motivation
   1.2 Objectives
   1.3 Agenda
   1.4 Evaluation
   1.5 Relevant URLs
2. Working Group Summaries and Recommendations
   2.1 Working Group 1 - Structure of a Data Management System
   2.2 Working Group 2 - Data Archiving and Access
   2.3 Working Group 3 - Data Documentation
Appendices
   Appendix 1: List of Attendees
   Appendix 2: Final Agenda
   Appendix 3: Workshop Evaluation
   Appendix 4: Relevant URLs
Support for this workshop was provided by the National Science Foundation and the Office of Naval Research.

ORGANIZING COMMITTEE
Steve Cande
SIO/UCSD, 9500 Gilman Dr., La Jolla, CA 92093-0220
[email protected]
Tel: 858-534-1552, Fax: 858-534-0784

William Ryan
LDEO, 61 Rte 9W, Palisades, NY 10964
[email protected]
Tel: 845-365-8312, Fax: 845-365-8156

Suzanne Carbotte, Co-Chair
LDEO, 61 Rte 9W, Palisades, NY 10964
[email protected]
Tel: 845-365-8895, Fax: 845-365-8168

Deborah Smith, Co-Chair
Department of Geology and Geophysics, MS #22, WHOI, Woods Hole, MA 02543
[email protected]
Tel: 508-289-2472, Fax: 508-457-2187

Stephen Miller
SIO/UCSD, 9500 Gilman Dr., La Jolla, CA 92093-0220
[email protected]
Tel: 858-534-1898, Fax: 858-534-0784

Dawn Wright
Department of Geosciences, Oregon State University, Corvallis, OR 97331-5506
[email protected]
Tel: 541-737-1229, Fax: 541-737-1200

EXECUTIVE SUMMARY
ON MAY 14-16, 2001, the National Science Foundation and the Office of Naval Research sponsored a workshop on Data Management for Marine Geology and Geophysics: Tools for Archiving, Analysis, and Visualization. The workshop was held at the Sea Lodge Hotel in La Jolla, CA. The workshop's objective was to bring together researchers, data collectors, data users, engineers, and computer scientists to assess the state of existing data management efforts in the marine geology and geophysics (MG&G) community, share experiences in developing data management projects, and help determine the direction of future efforts in data management.
The workshop agenda was organized around presentations, plenary discussions, and working group discussions. The presentations provided examples of the needs of data users, the needs of large, multidisciplinary MG&G projects, existing data management projects in the community, tools that have been developed for data access and analysis, examples of organizations with centralized databases, and current topics in information technology.

Working groups addressed questions concerning three different themes: (1) the structure of a data management system, (2) data archiving and access, and (3) data documentation. The working groups were also asked to recommend strategies to permit MG&G data management to move forward in each of these areas.

The Working Groups came up with 20 recommendations:

OVERARCHING RECOMMENDATIONS
1. Create permanent, active archives for all MG&G data.
2. Create a centralized and searchable on-line metadata catalog.

STRUCTURE OF A DATA MANAGEMENT SYSTEM
3. Manage data using a distributed system with a central coordination center.
4. Manage different data types with user-defined centers.
5. Support area- or problem-specific databases if scientifically justified, but these databases should link to rather than duplicate data holdings within discipline-specific data centers.
6. Fund core operating costs of the distributed data centers as 3-5 year facility cooperative agreements.
7. Evaluate the data management system using oversight and advisory committees, in-depth peer reviews at renewal intervals, and ad hoc panels to assess each data center's contribution to science.

DATA ARCHIVING AND ACCESS
8. Always archive raw data. Archive derived data for high-demand products.
9. Store data in open formats.
10. Develop standardized tools and procedures to ensure quality control at all steps from acquisition through archiving.
11. Improve access to common tools for data analysis and interpretation for the benefit of the community.
12. Build data centers to address the needs of a diverse user community, which will be primarily scientists.
13. Enforce timely data distribution through funding agency actions.
14. Promote interactions among federal agencies and organizations, and international agencies to define data and metadata exchange standards and policies.

DATA DOCUMENTATION
15. Require ship operators and principal investigators (P.I.s) to submit level 1 metadata* and cruise navigation to the centralized metadata catalog at the end of each cruise as part of the cruise-reporting process.
16. Generate a standard digital cruise report form and make it available to all chief scientists for cruise reporting (level 2 metadata*).
17. Require individual P.I.s to complete and submit standard forms for level 1 and 2 metadata* for field programs carried out aboard vessels not in the University-National Oceanographic Laboratory System (UNOLS) fleet (e.g., foreign, commercial, other academic platforms).
18. Generate a standardized suite of level 1 and level 2 metadata* during operation of seafloor observatories and other national facilities (e.g., the Deep Submergence Laboratory, Ocean Bottom Seismograph (OBS) Instrument Pool), and submit it to the central metadata catalog.
19. Require level 3 metadata* within each discipline-specific data center. Archiving of publications related to the data should also be included (level 4 metadata*).
20. Follow nationally accepted metadata standards (particularly for levels 1 and 3 metadata*).

A clear top priority of the workshop participants is to immediately define and establish a centralized metadata catalog. The metadata catalog should be broad, containing information on as many data types as possible. It should support geospatial, temporal, keyword, and expert-level searches of each data type. By definition, metadata are information about data that can evolve. The catalog should be a circular system that allows feedback from the user/originator. The metadata catalog should serve as the central link to the distributed network of data centers where the actual data reside.

To move forward, funding agencies must establish a small working group or advisory board to develop the structure and implementation of a metadata catalog. Additional working groups for each of the high-priority, discipline-specific data centers also need to be assembled. It is critical to obtain the active involvement of scientists in all aspects of this process through all operational phases, including data collection, processing, archiving, and distribution.

Section 2 of this report discusses these recommendations further.

*Metadata levels: Level 1. Basic description of the field program including location, program dates, data types, collecting institutions, collecting vessel, and P.I.s. Level 2. A digital cruise report and data inventory. Level 3. Data object and access information including data formats, quality, processing, etc. Level 4. Publications derived from the data.

1. OVERVIEW
1.1 MOTIVATION
MG&G SCIENTIFIC data collections are growing at a rapid rate (Figures 1 and 2). Processed and analyzed data made available to a broad community of scientists, educators, and the general public can be used for discovering and distributing new knowledge. A significant problem is how to provide data users with the means to effectively access these data and the tools to analyze and interpret them.

Advances in data storage technology have eliminated practical constraints on storing large data volumes and have permitted data collection at increasingly finer sample rates. New high-resolution systems provide digital images of the seafloor at sub-meter pixel resolution and generate data at rates on the order of Gigabytes per day (Figures 3 and 4). Seismic acquisition capabilities have greatly expanded with long-term deployment of bottom sensors. High-resolution multichannel seismic reflection (MCS) systems are now available for shallow-water problems, and digital acquisition of 480-channel data is currently routine for deep-ocean work (Figure 5).

With these new technologies it is becoming increasingly difficult for individual investigators to synthesize and organize data sets collected on single cruises, let alone manage them in a manner that allows data to be accessed efficiently by a larger user pool. National archiving of some marine geoscience data is carried out, and there have been several attempts by individual P.I.s to establish geographic- or data-specific databases. However, access to many data types remains difficult and incomplete (e.g., MCS, multibeam bathymetry, sidescan sonar, camera, and video imagery). Large quantities of data are under-utilized by primary users, and gaining access to these data is virtually impossible for secondary users. At the same time, our scientific interests are increasingly interdisciplinary and require easy access to the broad spectrum of data collected. Throughout the marine geoscience community, scientists want access to data, the ability to compare data of different types, and tools to manipulate and interpret these data (Figures 6, 7, 8).

With these concerns in mind, the National Science Foundation (NSF) and Office of Naval Research (ONR) sponsored a workshop on MG&G data management on May 14-16, 2001 in La Jolla, California. The coordinating committee advertised the workshop; participation was open. Approximately 80 representatives from science, engineering, computer science, government, and industry attended (Appendix 1). The workshop provided a forum for a focused interchange of ideas among data users, data providers, and technical experts in information technology.

Fig. 1. Growth in the Incorporated Research Institutions for Seismology (IRIS) data archive since 1992. Data holdings from a variety of different seismic networks are shown (GSN - Global Seismographic Network, FDSN - Federation of Digital Broad-Band Seismograph Networks, PASSCAL - Program for the Array Seismic Studies of the Continental Lithosphere). Figure provided by Tim Ahern (IRIS).

Fig. 2. Growth in the digital data holdings of marine geophysical data at the National Geophysical Data Center (NGDC) since 1990. Dashed lines show data doubling times of 9, 17 and 33 months. Figure provided by George Sharman (NGDC, NOAA).

Figure 3. Sun-illuminated perspective view of the Eel River margin of Northern California. Multibeam bathymetry (EM1000, Hydrosweep and Seabeam data) are merged with the USGS 30 m DEM for the adjacent land. Three-dimensional visualization of the merged topography is carried out using Fledermaus from Interactive Visualization Systems and Analysis. (Fonseca, L., Mayer, L. and Paton, M., ArcView Objects in the Fledermaus Interactive 3-D Visualization System: An example from the STRATAFORM GIS, in Wright, D.J. (ed.), Undersea With GIS, Redlands, CA: ESRI Press, in press, 2001).

Figure 4. Three-dimensional shaded relief map of the East Pacific Rise near 9°-10°N. This is currently the best-studied section of fast-spreading mid-ocean ridge. Figure courtesy of Dawn Wright (OSU).

Figure 5. Example of a multichannel seismic reflection record from the northwest shelf of Australia. Seismic interpretation of various reflectors is superimposed with various colors. Data are stored and displayed within the GEOQUEST IESX seismic-well log integrator/data browser. The power engine of IESX is an Oracle-based database system that organizes seismic, well log and geographical data in a local environment for interpretation. Figure courtesy of Garry Karner (LDEO).

Figure 6. Example of capability provided by MapAp, a web-based, map-driven database interface developed for the RIDGE Multibeam Synthesis project (see Appendix 4). Figure shows a multibeam bathymetry map of Axial Seamount, NE Pacific, with a user-defined profile location and corresponding bathymetry profile displayed. Figure provided by Bill Haxby (LDEO).

Figure 7. The Virtual Jason Control Van is a web-based application that takes real-time snapshots of information occurring inside the control van during vehicle operations and makes this information immediately available for shipboard scientists and for collaboration and post-cruise analysis on shore. Features include monitoring real-time operations, searching for events, dates, etc. Figure courtesy of Steven Lerner (WHOI).

1.2 OBJECTIVES
The overall goal of the workshop was to develop a strategy for MG&G data management that would meet scientists' needs for improved data access and improved tools for data analysis and interpretation. Accomplishing this goal will lead to greater use of data by the MG&G community and the education and outreach community.

Another workshop objective was to provide NSF and ONR with recommendations on how to implement this data management strategy. The organizing committee thus created three thematic working groups: (1) structure of a data management system, (2) data archiving and access, and (3) data documentation. Each group discussed key problems within their theme and provided a list of recommendations on how to solve them. Critical to implementing these recommendations is active involvement of scientists in the entire process, including collecting, processing, archiving, and distributing data.

Figure 8. Schematic diagram of scientific workflow. There are three stages that modelers go through when developing a computational result: (1) experimental data processing; (2) parameter adjustment; and (3) publication of the result. It is important that the modeler be able to link tools easily to the output of computational applications (e.g., to visualize the data). Figure courtesy of Dawn Wright (OSU).
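The three-stage workflow that Figure 8 describes (process experimental data, adjust model parameters, publish the result) can be sketched as a simple loop. The functions below are hypothetical stand-ins for whatever processing, modeling, and visualization tools a modeler actually links together; they are not part of any system described in this report:

```python
# Sketch of the Figure 8 workflow: process experimental data, adjust model
# parameters until the misfit is acceptable, then publish the result.
# All function bodies are illustrative placeholders.

def process_experimental_data(raw):
    # Stage 1: e.g., filter and grid the ocean data (stand-in transform).
    return [x * 0.5 for x in raw]

def run_model(params, data):
    # Stage 2 stand-in: misfit between the data and a one-parameter model.
    return sum(abs(x - params["scale"]) for x in data)

def workflow(raw, tolerance=1.0):
    data = process_experimental_data(raw)
    step = 0
    params = {"scale": 0.0}
    misfit = run_model(params, data)
    while misfit > tolerance:       # adjust parameters, add physics, rerun
        step += 1
        params["scale"] = step / 10
        misfit = run_model(params, data)
    return params, misfit           # Stage 3: publish/visualize the result

params, misfit = workflow([1.0, 2.0, 3.0])
print(params["scale"], misfit)  # → 1.0 1.0
```

The point of the sketch is the loop structure: the modeler needs tools that link easily to each stage's output, exactly as the caption argues.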
1.3 AGENDA
The first day of the workshop was devoted to short talks, each followed by a brief discussion. In addition, a longer discussion followed each group of subject-specific talks. The longer discussion was led by a pre-assigned discussion leader. Our intent was to engage the participants in the meeting right from the start through the discussion.

The workshop began with presentations from data users. The talks focused on problems that P.I.s have had in the past with gaining access to data, and possible solutions to these problems. Representatives from large, multidisciplinary MG&G programs gave overviews on anticipated database needs for new program initiatives. Individual P.I.s made presentations on database projects which they initiated, providing working examples of data access and functionality over a range of disciplines. In the late afternoon of the first day, workshop participants presented models for data access. The format of this session was somewhat different as the talks served as introductions for demonstrations that were part of the evening poster session. In addition to these invited demonstrations, the evening session included posters and demonstrations contributed by workshop participants.

The second day of the workshop began with presentations by representatives of organizations with large central databases. Talks focused on anticipated future directions in data access and database design as well as insights on successes and major obstacles encountered during their efforts to date. The final set of talks focused on current developments in information technology, including data mining issues and designing databases to serve real-time data.

The presentations on days 1 and 2 served as catalysts for discussions that were held within the theme working groups, each of which consisted of an interdisciplinary group of scientists, engineers, and computer scientists. These working groups addressed a number of questions that formed the basis for presentations in the morning of the third day of the workshop.

The full agenda is given in Appendix 2. Presentations and poster abstracts can be obtained through the workshop conveners.

1.4 EVALUATION

An evaluation form was included in the workshop packet that participants received. The forms were collected at the conclusion of the workshop. The responses to the questions have been compiled and are presented in Appendix 3.

1.5 RELEVANT URLS

During the meeting, participants were asked to provide links to web sites that are relevant to MG&G database efforts. This URL list is provided in Appendix 4.

2. WORKING GROUP SUMMARIES
2.1 WORKING GROUP 1 - Structure of a Data Management System
Working Group 1 considered how to structure a MG&G data management system. Currently, some data are archived at the National Geophysical Data Center (NGDC), such as the suite of underway geophysical data (navigation, gravity, magnetics, topography) collected on most large UNOLS vessels. However, it is not standard practice to submit to NGDC all data collected by the MG&G community.

Several ship-operating institutions have archived data at some level. However, no standardization across institutions exists and these efforts have been carried out at the discretion of the individual institutions. The need for a sound data management system is recognized, and a few workshops have been held to address this problem for specific data types (e.g., MCS Workshop, La Jolla, CA, 1999). In addition, individual P.I.s have made specific data types available to the broader community (see Appendix 4). It is evident that there is no community-wide strategy in place to solve MG&G data management problems.

QUESTIONS CONSIDERED
Working Group 1 addressed the following questions:
• What model is appropriate for a data management system (e.g., distributed versus centralized)?
• How do we fund the data management system?
• How do we evaluate the system?
• Do we need a permanent archive?

There was clear agreement within the group that the MG&G community needs a distributed data management system with a coordination center to facilitate communication among different data centers. The working group session started with a discussion of metadata, indicating the importance of metadata to attendees and the MG&G community in general. The consensus of Working Group 1 is that the community must begin taking small, concrete steps towards establishing a metadata catalog. From there the community should move towards a discipline-oriented, distributed data management system that will improve data use by a broad community. Development of the discipline-oriented data centers should be handled through the normal competitive proposal process. Although participants agreed that significant resources are needed for new database efforts, exact details of the level of government agency funds for the management system were not determined.

RECOMMENDATIONS

WG1_1. Create permanent, active archives for all MG&G data.
It is very important that the funding agencies maintain and strengthen their commitment to long-term data archiving. As noted at the meeting, data collected by the HMS Challenger are still being used. Permanent archives for all types of MG&G data must be established. The community must continually add to and update these permanent archives.

WG1_2. Manage data using a distributed system with a central coordinating center.
The management system should operate as close to the data generators as possible. Scientists must be actively involved in data management, placing the responsibility for and authority over the data as close as possible to where the expertise resides. Data quality control should be provided by those generating the data. Mechanisms should be developed to enable users to easily provide feedback on data quality.

A coordination center is necessary to facilitate communications among the distributed data centers, and to ensure that everyone works together. A good example of central coordination is the OBS Instrument Pool. The individual instrument centers provide quality control and write standard format data. The Incorporated Research Institutions for Seismology (IRIS) then archives the data and provides community access to them.

WG1_3. Manage different data types with user-defined centers.
Examples of different data types and their management status are given below. The list is not all-inclusive.
1. Ocean bottom seismograph/hydrophone (OBS/OBH) data. Quality control is provided by the three OBS instrument centers, and archival and community access are provided by IRIS.
2. Rock petrology/geochemistry data. A web-served database is being developed to provide metadata and processed results for rock samples. This effort is ready for migration to permanent support.
3. Core/rock collections. Sample curation appears to be in good shape. NGDC maintains a central catalog of the existence of physical samples and some sample metadata.
4. Ocean Drilling Program (ODP) data. ODP developed the JANUS database based on community recommendations that came out of several workshops. It appears to be in good shape. There are plans in place to transition the database from ODP to IODP in 2003.
5. Single channel and multichannel seismic data. A workshop was held in 1999 to determine the needs of the community for database management, and recommendations were made from that workshop. An interested subgroup of the MCS community needs to define the model details and submit a proposal to NSF.
6. Multibeam sonar data (bathymetry, sidescan, backscatter, LIDAR, etc.). This community needs a user/generator workshop or working group to define the problems and solutions to their database needs. There is a critical need for one-stop access quality-control and processing centers with tools to generate higher level products. There does not seem to be a quality-control process in place, although the MB-System software provides tools for reading a broad suite of multibeam data.
7. Deep submergence data collected by near-bottom instruments (submarines, remotely operated vehicles (ROVs), autonomous underwater vehicles (AUVs), etc.). In principle, these data should be managed in the same way that other shipboard data are managed. A data management plan must be defined and overseen, perhaps through the Deep Submergence Science Committee (DESSC), the existing operators and user group.
8. Gravity/magnetic data. NGDC maintains archives of these data, but there are major quality-control and user-interface problems. The community concerned about these data needs to be defined. Value-added products (derived products) should be archived and made available to the broad community.
9. Sedimentology, paleontology data. Although it was noted that problems exist, there were too few representatives from these communities at the workshop to define the issues and possible solutions. NSF should encourage mini-workshops or working groups for these data.

WG1_4. Support area- or problem-specific databases if scientifically justified, but these databases should link to rather than duplicate data holdings within discipline-specific data centers.
Working Group 1 recognizes that there might be a future need to set up databases for specific oceanic regions or for specific scientific goals. Examples of area-specific databases are those for the 9°N area of the East Pacific Rise and the Juan de Fuca region. Examples of problem-specific databases are those that will develop from the MARGINS and RIDGE programs. These databases should be supported, but they should serve as links to discipline-specific databases and should not duplicate data holdings within these databases.

WG1_5. Evaluate the data management system using oversight and advisory committees, in-depth peer reviews at renewal intervals, and ad hoc panels to assess each data center's contribution to science.
The data management system should undergo regularly scheduled peer review. A new set of advisory groups representing the broad spectrum of the MG&G research community should be established. This will ensure that the recommendations regarding data sets and models will be responsive to the community's needs. Selection of data centers should be determined through competition, and a data center should not expect to be funded permanently.

WG1_6. Fund core operating costs of the distributed data centers as 3-5 year facility cooperative agreements.
This is a corollary to recommendation WG1_5 in that funds should cover a finite number of years, after which each of the data centers should be evaluated for effectiveness and responsiveness to users' needs.
2.2 WORKING GROUP 2 - Data Archiving and Access
Working Group 2 focused primarily on data archiving form data set for a region. Multibeam bathymetric sys- issues. Problems associated with current MG&G tems are currently operated on most deep-ocean ves- archiving efforts range from complete absence of an sels within the academic fleet, but standards do not archive for many important data types, to lack of qual- exist for navigation editing, beam editing, or even the ity control and inadequate data delivery to archives. nature of the final data product (corrected or uncor- While ship operators deliver underway geophysical rected meters). Some multibeam data are archived with data to the NGDC (Figure 9), there are no standards the NGDC, some with the ship operating institutions, for data quality and it can be difficult to obtain a uni- and some within problem-specific archives (e.g., the
Figure 9. World map showing over 15 million miles of ship tracks with underway geophysical data inventoried within NGDC’s Marine Trackline Geophysics Database. Bathymetry, magnetics, gravity, and seismic reflection data along these tracks from 4600 cruises were collected from 1939 to 2000. Figure provided by John Campagnoli, NGDC.
9 10
6 Data Storage RIDGE Multibeam Synthesis Web Site). However, de- livery to these data archives is largely at the discre- 5 tion of the P.I., and access to these data and many
4 other data types is often difficult. Data ownership issues continue to be significant 3 obstacles to archiving efforts. Although NSF policy per- mits a two-year proprietary hold on data collected by Log (PB/yr) 2 a P.I., data are commonly held well past this time pe- EOS riod. 1 NOAA Moore’s Law Recognition for contribution to data archives is
0 an important issue that has not been well addressed 2001 2004 2007 2010 by any existing archives. Figure 10. Predicted future growth in data holdings at NGDC com- pared with data storage capability predicted by Moore’s law for the QUESTIONS CONSIDERED next 10 years. Figure demonstrates that expected data storage ca- pability should be more than adequate to handle expected data Working Group 2 considered the following questions: volumes. Figure courtesy of Herbert Kroehl, NGDC. • What data need to be archived? • Should we archive raw data, processed data and/ or interpretations? • How can we ensure quality control? Worldwide Users Query • Local Analysis • How do we provide broad data access for scien- • Reproducibility • Comparability tists and the public?
Data • What tools are needed for data interpretation and Retrieval Digital Library (Curation) • Authorization Metadata analysis? • Cataloging • Search and Retrieval • How do we enforce timely data distribution? • How do we reward data contribution to archives? QA/QC Data Repository Persistent Names (Archival) (Accession Numbers) Anomaly Detection Overall priorities were defined as: (1) save the data, and Reporting ADO Data Upload (2) provide a catalog of all of the data, and (3) provide easy access to the data. With advances in storage tech- Worldwide QA/QC Data nology and web access, the bulk archiving of data is Contributors Publication feasible, but our community will be challenged in the
areas of quality control and metadata generation (Figures 10 and 11). A centralized metadata catalog of all data-collection activities was viewed by this group as a very high priority. To build this catalog, the working group recommended the development of easy-to-use tools for automatic metadata generation aboard UNOLS vessels. The data types to be archived include, but are not limited to:
• Underway geophysical data, including time, position, magnetics and gravity, multibeam bathymetry, sidescan sonar, and single and multichannel seismics.
• Standard supporting data, including sea state, XBT, CTD, sea surface temperature and salinity, and derived sound velocity profiles, as well as calibration data for each sensor.
• Station information for dredging, coring, trawling, and other over-the-side operations, with complete data or metadata, as appropriate.
• Some individual-investigator instrumental data need not be saved, but metadata with time, position, and contact need to be archived. For example, it may not be appropriate to archive test data collected from a prototype sensor, but the existence of these data should be documented and preserved.

Figure 11. Arbitrary Digital Objects (ADO) are produced when contributed data are uploaded to the data repository. The ADO is assigned a persistent and unique name within the repository. These and other metadata are passed to the digital library function, where they can be searched using a catalogue database. Key elements of this process are the continuing involvement of the authors of the data and the maintenance of a dialogue between data users and authors or their successors. Another important consideration is the separation of the metadata catalogue search function from the data repository and delivery function. Both become more portable and reliable when functionally separated. Figure from J. Helly, T. T. Elvins, D. Sutton, D. Martinez, S. Miller, S. Pickett, and A. M. Ellison, Controlled Publication of Digital Scientific Data, Communications of the ACM (accepted October 3, 2000).

RECOMMENDATIONS

WG2_1. Always archive raw data. Archive derived data for high-demand products.
Current data storage capability is adequate for on-the-fly generation of some types of derived data products. However, some types of derived data should be archived for high-demand products such as multibeam bathymetry grids or maps, or when nontrivial processing steps are required (e.g., MCS data). Easy retrieval of these derived, value-added "products" (e.g., images of reflection profiles, grids, bathymetric maps, graphs) must be developed. Everyone benefits from maximum use of the data, including scientists, the government, and the public.

WG2_2. Store data in open formats.
Tested, portable, and noncommercial software for data translation must be freely available to all users.

WG2_3. Develop standardized tools and procedures to ensure quality control at all steps from acquisition through archiving.
Standardized shipboard tools are the first step toward ensuring quality control. Easy-to-use, real-time data quality monitoring tools are needed for UNOLS vessels, as well as cost-effective ship-to-shore communication of sufficient compressed data for quality assessment and troubleshooting. Before data are allowed to enter archives, common sanity and geographic-bounds checks need to be applied. Circular archives are needed to permit content to be updated as errors are found by users, with appropriate notations in metadata. The peer-review process in electronic journals can provide broad-based quality assessment, and the publication of data briefs in electronic journals is encouraged.

WG2_4. Improve access to common tools for data analysis and interpretation for the benefit of the community.
A combination of public domain and commercial tools is widely used, including GMT, MB-System, ArcView, Fledermaus, and Matlab. A data center should maintain a list of suitable software for viewing and analyzing each data type; instructions on installation, data exchange, and usage for our community; and contacts for further assistance. Custom, open-source development may be needed for special tools and interfaces for community-wide use.

WG2_5. Build data centers with the goal of addressing the needs of a diverse user community, which will be primarily scientists.
Platform-independent, browser-based data extraction tools are needed. The authoritative metadata catalog should support geospatial, temporal, keyword, and expert-level searches of each data type. A federated system of distributed data centers, easily updated and synchronized, will provide reliable and efficient delivery for each data type. Data should be archived in a form easily used by other disciplines. Experience has shown that well-organized, image-rich, searchable databases will serve the needs of both researchers and the public (Figure 12).

WG2_6. Enforce timely data distribution through funding agency actions.
Raw data should be delivered to the designated data center immediately following each cruise. The designated center will restrict data access to the P.I. and identified collaborators for an initial proprietary hold period. This lock is released when the period expires. The standard period is two years, although some circumstances may warrant an extension granted by the cognizant funding agency.
Auditing access to data will provide usage statistics and facilitate interdisciplinary collaboration, as well as the communication of future updates, within the restrictions of privacy requirements.
The NSF Final Report could include a field to describe how the P.I. complied with NSF data-distribution policies. Noncompliance might have a negative effect on future proposals. Data publication in citable journals and in technical briefs such as USGS open-file reports should be encouraged.

WG2_7. Promote interactions among federal agencies and organizations, and international agencies, to define data and metadata exchange standards and policies.
The community would benefit from the standardization of forms, such as an end-of-cruise digital data form, as well as from metadata content and exchange standards. We encourage collaboration among the federally mandated agencies (NSF, ONR, USGS, NOAA, NAVO, etc.) to review marine database standards. International discussions should be encouraged to define exchange standards and policies. At a minimum, exchange of cruise tracks and sample locations would be a major benefit for cruise planning.
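The sanity and geographic-bounds checks recommended under WG2_3 can be sketched in a few lines of code. This is a minimal illustration only; the field names, units, and plausibility limits below are assumptions for the example, not part of any adopted standard.

```python
# Minimal sketch of pre-archive sanity checks for one underway
# geophysics record. Field names ("lat", "lon", "mag_total_nT")
# and the magnetic-field plausibility window are illustrative.

def sanity_check(record):
    """Return a list of problems found in one navigation record."""
    problems = []
    lat, lon = record.get("lat"), record.get("lon")
    if lat is None or not -90.0 <= lat <= 90.0:
        problems.append("latitude out of bounds")
    if lon is None or not -180.0 <= lon <= 180.0:
        problems.append("longitude out of bounds")
    # Earth's total magnetic field at sea level is roughly
    # 20,000-70,000 nT; values far outside that range usually
    # indicate a sensor or logging problem.
    mag = record.get("mag_total_nT")
    if mag is not None and not 20000.0 <= mag <= 70000.0:
        problems.append("magnetic total field implausible")
    return problems

good = {"lat": 32.87, "lon": -117.25, "mag_total_nT": 45210.0}
bad = {"lat": 132.87, "lon": -117.25, "mag_total_nT": 99999.0}
print(sanity_check(good))  # []
print(sanity_check(bad))   # two problems flagged
```

Checks of this kind are cheap enough to run in real time aboard ship, so bad records can be flagged before they ever reach an archive.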
2.3 WORKING GROUP 3 - Data Documentation
Working Group 3 focused on metadata issues. The development of appropriate metadata and metadata standards for ocean floor and other types of oceanographic data is an extremely important issue. The growth in information technology has led to an explosion in the amount of information that is available to researchers in many fields. This is the case in the marine environment, where a state-of-the-art "visual presence" (e.g., through long-term monitoring by cameras and other instruments) may result in the acquisition of data that quickly outpaces the rate at which the data can be interpreted. The paradox is that as the amount of potentially useful and important data grows, it becomes increasingly difficult to know what data exist, the exact location where the data were collected (particularly when navigating at sea with no "landmarks"), and how the data can be accessed. In striving to manage this ever-increasing amount of data, and to facilitate their effective and efficient use, compiling metadata becomes an urgent issue.

Although metadata are contained within some of the digital data file formats commonly used to store MG&G data (e.g., the MGD77 and SEGY formats), no uniform metadata are collected during federally funded MG&G field programs. Basic information regarding cruise location, date, project P.I.s, and data types collected can be difficult to obtain, and no central and comprehensive catalog is available. Cruise reports often contain detailed information regarding general experiment configuration, data calibration, and data quality, all of which are of great importance for subsequent data analysis. In many instances, the cruise report may be the only record of this information, but no easily accessible digital archive of these reports exists.

QUESTIONS CONSIDERED

Working Group 3 addressed the following questions:
• How do we move toward metadata standards?
• How do we standardize data collection procedures?
• What is the role of the ship operating institutions in the archiving of data and generation of metadata?
• What existing software and structures should we take advantage of?
• How should we deal with real-time data acquisition?
To aid the development of a standardized procedure for generating metadata during MG&G studies, four levels of metadata were defined, each corresponding to a particular stage of the data-acquisition to publication process:
• Level 1. Basic description of the field program, including location, program dates, data types, collecting institutions, collecting vessel, and P.I.s.
• Level 2. A digital cruise report and data inventory.

[Figure: Howland and Tokelau Seamounts. Contour interval = 125 meters; grid size = 180 meters.]
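A Level 1 record of the kind described above could be captured as a simple structured document suitable for a central catalog. The sketch below is illustrative only: the key names, the cruise identifier, and the vessel name are hypothetical assumptions for the example, not a proposed schema.

```python
import json

# Illustrative sketch of a Level 1 metadata record (basic description
# of a field program). All keys and values are hypothetical examples,
# not an adopted standard.
level1 = {
    "cruise_id": "EXAMPLE01",          # hypothetical identifier
    "location": "Eastern Pacific",
    "start_date": "2001-05-14",
    "end_date": "2001-05-16",
    "data_types": ["multibeam bathymetry", "gravity", "magnetics"],
    "collecting_institution": "SIO/UCSD",
    "vessel": "R/V Example",           # hypothetical vessel name
    "principal_investigators": ["A. Researcher"],
}

# Serializing to a text format such as JSON yields a portable catalog
# entry that can be indexed and searched by date, location, or data type.
print(json.dumps(level1, indent=2))
```

Because every field here is short, structured text, records like this could be generated automatically at the end of a cruise and merged into the centralized catalog with little manual effort.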