Getting Data Into VisIt

Total Pages: 16

File Type: PDF, Size: 1020 KB

Getting Data Into VisIt
LLNL-SM-446033, July 2010, Version 2.0.0
Brad Whitlock, Lawrence Livermore National Laboratory

DISCLAIMER: This document was prepared as an account of work sponsored by an agency of the United States government. Neither the United States government nor Lawrence Livermore National Security, LLC, nor any of their employees makes any warranty, expressed or implied, or assumes any legal liability or responsibility for the accuracy, completeness, or usefulness of any information, apparatus, product, or process disclosed, or represents that its use would not infringe privately owned rights. Reference herein to any specific commercial product, process, or service by trade name, trademark, manufacturer, or otherwise does not necessarily constitute or imply its endorsement, recommendation, or favoring by the United States government or Lawrence Livermore National Security, LLC. The views and opinions of authors expressed herein do not necessarily state or reflect those of the United States government or Lawrence Livermore National Security, LLC, and shall not be used for advertising or product endorsement purposes. This work was performed under the auspices of the U.S. Department of Energy by Lawrence Livermore National Laboratory in part under Contract W-7405-Eng-48 and in part under Contract DE-AC52-07NA27344.

Table of Contents

Introduction: Manual chapters (2); Manual conventions (2); Strategies (2); Picking a strategy (3); Definition of terms (4).

Creating compatible files: Creating a conversion utility or extending a simulation (7); Survey of database reader plug-ins (9); BOV file format (9); X-Y Curve file format (12); Plain text ASCII files (12); NETCDF files (13); HDF5 files (13); Writing Silo files (13); Using the Silo library (14); Inspecting Silo files (16); Silo files and parallel codes (16); Creating a new Silo file (16); Dealing with time (18); Option lists (19); Writing a rectilinear mesh (20); Writing a curvilinear mesh (23); Writing a point mesh (26); Writing an unstructured mesh (29); Writing a scalar variable (35); Single precision vs. double precision (47); Writing expressions (47); Creating a master file for parallel (48); Writing VTK files (54); Getting started with visit_writer (55); Regular meshes with data (56); Rectilinear meshes with data (58); Curvilinear meshes with data (60); Point meshes with data (63); Unstructured meshes with data (64); Creating a master file for parallel (.visit file) (66).

Creating compatible files II, Advanced topics: Writing vector data (69); Adding metadata for performance boosts (72); Writing data extents (72); Writing spatial extents (75); Ghost zones (76); Writing ghost zones to your files (78); Materials (83).

Creating a database reader plug-in: Structure of VisIt (89); Plug-ins (91); Starting your plug-in (92); Picking a database reader plug-in interface (92); Using XMLEdit (92); Generating a plug-in code skeleton (96); Building your plug-in (97); Calling your plug-in for the first time (100); Implementing your plug-in (101); Required plug-in methods (101); Debugging your plug-in (103); Opening your file (105); Returning file metadata (106); Returning a mesh (113); Returning a scalar variable (126); Returning a vector variable (127); Using a VTK reader class (129); Advanced topics (129); Returning cycles and times (129); Auxiliary data (134); Returning ghost zones (139); Parallelizing your reader (141).

Instrumenting a simulation code: Architecture (143); Using libsim (146); Getting libsim (146); Building in libsim support ...
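
The table of contents above shows the manual's main path for getting data into VisIt: create a Silo file, write a mesh, then write variables on that mesh. For orientation only, here is a minimal sketch in C of that sequence for a 2D rectilinear mesh with one zone-centered scalar, using the standard Silo calls the chapter covers (DBCreate, DBPutQuadmesh, DBPutQuadvar1, DBClose). The file name, mesh name, variable name, and values are illustrative, not taken from the manual.

    /* Minimal sketch: write a 2D rectilinear (collinear) mesh and a
     * zone-centered scalar to a Silo file. Names and values are examples. */
    #include <silo.h>

    int main(void)
    {
        /* Node-centered coordinate arrays: 4 x 3 nodes -> 3 x 2 zones. */
        float x[] = {0.0f, 1.0f, 2.0f, 3.0f};
        float y[] = {0.0f, 1.0f, 2.0f};
        float *coords[] = {x, y};
        int   nodedims[] = {4, 3};
        int   zonedims[] = {3, 2};
        int   ndims = 2;

        /* One value per zone. */
        float pressure[] = {1.0f, 2.0f, 3.0f, 4.0f, 5.0f, 6.0f};

        DBfile *dbfile = DBCreate("rect2d.silo", DB_CLOBBER, DB_LOCAL,
                                  "2D rectilinear example", DB_HDF5);
        if (dbfile == NULL)
            return 1;

        /* Write the mesh, then a scalar defined on that mesh. */
        DBPutQuadmesh(dbfile, "quadmesh", NULL, coords, nodedims, ndims,
                      DB_FLOAT, DB_COLLINEAR, NULL);
        DBPutQuadvar1(dbfile, "pressure", "quadmesh", pressure, zonedims,
                      ndims, NULL, 0, DB_FLOAT, DB_ZONECENT, NULL);

        DBClose(dbfile);
        return 0;
    }

Compiling this needs the Silo headers and a link against the Silo library (commonly -lsiloh5 for an HDF5-enabled build); the manual's "Writing Silo files" sections remain the authoritative walk-through, including the curvilinear, point, and unstructured cases and the parallel master files.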
Recommended publications
  • Feature Tracking & Visualization in 'VisIt'
    FEATURE TRACKING & VISUALIZATION IN ‘VISIT’ By NAVEEN ATMAKURI A thesis submitted to the Graduate School—New Brunswick Rutgers, The State University of New Jersey in partial fulfillment of the requirements for the degree of Master of Science Graduate Program in Electrical and Computer Engineering written under the direction of Professor Deborah Silver and approved by New Brunswick, New Jersey October 2010 ABSTRACT OF THE THESIS Feature Tracking & Visualization in VisIt by Naveen Atmakuri Thesis Director: Professor Deborah Silver The study and analysis of large experimental or simulation datasets in the field of science and engineering pose a great challenge to the scientists. These complex simulations generate data varying over a period of time. Scientists need to glean large quantities of time-varying data to understand the underlying physical phenomenon. This is where visualization tools can assist scientists in their quest for analysis and understanding of scientific data. Feature Tracking, developed at Visualization & Graphics Lab (Vizlab), Rutgers University, is one such visualization tool. Feature Tracking is an automated process to isolate and analyze certain regions or objects of interest, called ‘features’ and to highlight their underlying physical processes in time-varying 3D datasets. In this thesis, we present a methodology and documentation on how to port ‘Feature Tracking’ into VisIt. VisIt is a freely available open-source visualization software package that has a rich feature set for visualizing and analyzing data. VisIt can successfully handle massive data quantities in the range of tera-scale. The technology covered by this thesis is an improvement over the previous work that focused on Feature Tracking in VisIt.
  • Python for Economists Alex Bell
    Python for Economists. Alex Bell, [email protected], http://xkcd.com/353. This version: October 2016. If you have not already done so, download the files for the exercises here. Contents: 1 Introduction to Python (3): 1.1 Getting Set-Up (3); 1.2 Syntax and Basic Data Structures (3): 1.2.1 Variables: What Stata Calls Macros (4); 1.2.2 Lists (5); 1.2.3 Functions (6); 1.2.4 Statements (7); 1.2.5 Truth Value Testing (8); 1.3 Advanced Data Structures (10): 1.3.1 Tuples (10); 1.3.2 Sets (11); 1.3.3 Dictionaries (also known as hash maps) (11); 1.3.4 Casting and a Recap of Data Types (12); 1.4 String Operators and Regular Expressions (13): 1.4.1 Regular Expression Syntax (14); 1.4.2 Regular Expression Methods (16); 1.4.3 Grouping RE's (18); 1.4.4 Assertions: Non-Capturing Groups (19); 1.4.5 Portability of REs (REs in Stata) (20); 1.5 Working with the Operating System ...
  • Lecture 1: September 8 Lecturer: Emery Berger Scribes: Bruno Silva, Gene Novark, Jim Partan
    CMPSCI 377 Operating Systems, Fall 2009. Lecture 1: September 8. Lecturer: Emery Berger. Scribes: Bruno Silva, Gene Novark, Jim Partan. 1.1 General information: This class is taught by Prof. Emery Berger in ELAB 323, every Tuesday and Thursday from 9:30am to 11:00am. ELAB is the older brick building next to the computer science building. For additional information related to the course (syllabus, projects, etc.), please visit www.cs.umass.edu/~emery. Although the class name is Operating Systems, this course will deal mostly with large-scale computer systems. The main question to be studied is how to build correct, reliable and high performance computer systems. In order to answer this question, problems such as memory management, concurrency, scheduling, etc., will be studied. 1.1.1 Getting help: Prof. Berger is available by appointment in his office (CS344). The TA's office hours are Mondays and Wednesdays from 4:00 PM to 5:00 PM in the Edlab (LGRT 223, 225). In addition to office hours, Discussion Sections will be led by the TA on Wednesdays from 12:20pm to 1:10pm in CMPSCI 142. Attendance is very important since sample test problems will be discussed, and the main concepts presented during the last couple of classes will be reinforced. Also, students looking for help can access the course's newsgroup (look for the link at Prof. Berger's website). Lecture notes will also be available. Although this class requires no textbook, interested students might look for classics of the area, such as Operating System Concepts (Silberschatz et al.), 7th Edition.
  • NetCDF-4: A New Data Model, Programming Interface, and Format Using HDF5
    NetCDF-4: A New Data Model, Programming Interface, and Format Using HDF5. Russ Rew, Ed Hartnett, John Caron (UCAR Unidata Program Center); Mike Folk, Robert McGrath, Quincey Kozial (NCSA and The HDF Group, Inc.). Final Project Review, August 9, 2005.
    Motivation: why is this area of work important? While the commercial world has standardized on the relational data model and SQL, no single standard or tool has critical mass in the scientific community. There are many parallel and competing efforts to build these tool suites, at least one per discipline. Data interchange outside each group is problematic. "In the next decade, as data interchange among scientific disciplines becomes increasingly important, a common HDF-like format and package for all the sciences will likely emerge." (Jim Gray, Distinguished Engineer at Microsoft and 1998 Turing Award winner, in "Scientific Data Management in the Coming Decade," Jim Gray, David T. Liu, Maria A. Nieto-Santisteban, Alexander S. Szalay, Gerd Heber, David DeWitt, Cyberinfrastructure Technology Watch Quarterly, Volume 1, Number 2, February 2005.)
    Preservation of scientific data: "… the ephemeral nature of both data formats and storage media threatens our very ability to maintain scientific, legal, and cultural continuity, not on the scale of centuries, but considering the unrelenting pace of technological change, from one decade to the next. … And that's true not just for the obvious items like images, documents, and audio files, but also for scientific images, … and simulations. In the scientific research community, standards are emerging here and there—HDF (Hierarchical Data Format), NetCDF (network Common Data Form), FITS (Flexible Image Transport System)—but much work remains to be done to define a common cyberinfrastructure." (MacKenzie Smith, Associate Director for Technology at the MIT Libraries, Project director at MIT for DSpace, a groundbreaking …)
  • Using netCDF and HDF in ArcGIS
    Using netCDF and HDF in ArcGIS. Nawajish Noman, Dan Zimble, Kevin Sigwart. Outline: NetCDF and HDF in ArcGIS; visualization and analysis; sharing; customization using Python; demo; future directions. Scientific data and Esri: direct support for NetCDF and HDF; OPeNDAP/THREDDS, a framework for scientific data networking, integrated for use by our customers. Users of Esri technology include the National Climate Data Center, National Weather Service, National Center for Atmospheric Research, U.S. Navy (NAVO), Air Force Weather, USGS, Australian Navy, Australian Bureau of Meteorology, and UK Met Office.
    NetCDF support in ArcGIS: ArcGIS reads/writes netCDF since version 9.2. NetCDF is an array-based data structure for storing multidimensional data, with N-dimensional coordinate systems (X, Y, Z, time, and other dimensions), multiple variables (temperature, humidity, pressure, salinity, etc.), and implicit or explicit geometry (regular grid implicit; irregular grid; points). NetCDF data is accessed in ArcGIS as raster, feature, or table, via direct read, and GIS data can be exported to netCDF.
    CF Convention: the Climate and Forecast (CF) convention (http://cf-pcmdi.llnl.gov/) was initially developed for climate and forecast data (atmosphere, surface and ocean model-generated data) and also for observational datasets. The CF conventions generalize and extend the COARDS (Cooperative Ocean/Atmosphere Research Data Service) convention. CF is now the most widely used convention for geospatial netCDF data and has the best coordinate system handling. NetCDF and coordinate systems: for Geographic Coordinate Systems (GCS), the X dimension has units degrees_east and the Y dimension has units degrees_north; for Projected Coordinate Systems (PCS), the X dimension has standard_name projection_x_coordinate, the Y dimension has standard_name projection_y_coordinate, and the variable has a grid_mapping attribute. (A minimal CF/netCDF sketch in C is shown after this list.)
  • Introduction CFD General Notation System (CGNS)
    CGNS Tutorial: Introduction. CFD General Notation System (CGNS). Christopher L. Rumsey, NASA Langley Research Center. Outline: introduction; overview of CGNS (what it is, history, how it works and how it can help, the future); basic usage (getting it and making it work for you, a simple example, aspects for data longevity).
    Introduction: CGNS provides a general, portable, and extensible standard for the description, storage, and retrieval of CFD analysis data. Its principal target is data normally associated with computed solutions of the Navier-Stokes equations and its derivatives, but it is applicable to computational field physics in general (with augmentation of data definitions and storage conventions). What is CGNS? A standard for defining and storing CFD data: self-descriptive, machine-independent, very general and extendable, and administered by an international steering committee. It is an AIAA recommended practice (AIAA R-101A-2005), free and open software, and well-documented. Discussion forum: [email protected]. Website: http://www.cgns.org.
    History: CGNS was started in the mid-1990s as a joint effort between NASA, Boeing, and McDonnell Douglas, under NASA's Advanced Subsonic Technology (AST) program. It arose from the need for a common CFD data format for improved collaborative analyses between multiple organizations; existing formats, such as PLOT3D, were incomplete, cumbersome to share between different platforms, and not self-descriptive (poor for archival purposes). Initial development was heavily influenced by McDonnell Douglas' "Common … (A minimal sketch of the first CGNS library calls in C is shown after this list.)
  • A COMMON DATA MODEL APPROACH to NETCDF and GRIB DATA HARMONISATION Alessandro Amici, B-Open, Rome @Alexamici
    A Common Data Model Approach to NetCDF and GRIB Data Harmonisation. Alessandro Amici, B-Open, Rome, @alexamici, http://bopen.eu. Workshop on developing Python frameworks for earth system sciences, 2017-11-28, ECMWF, Reading.
    Motivation, forecast data and tools: ECMWF weather forecasts, re-analyses, satellite and in-situ observations; N-dimensional gridded data in GRIB; archive: Meteorological Archival and Retrieval System (MARS); a lot of tools: Metview, Magics, ecCodes. Motivation, climate data: Copernicus Climate Change Service (C3S / ECMWF); re-analyses, seasonal forecasts, climate projections, satellite and in-situ observations; N-dimensional gridded data in many dialects of NetCDF and GRIB; archive: Climate Data Store (CDS).
    Harmonisation strategic choices: Metview Python Framework and CDS Toolbox projects; Python 3 programming language; scientific ecosystems; xarray data structures; NetCDF data model (variables and coordinates); support for arbitrary metadata; CF Conventions support on IO; label-matching broadcast rules on coordinates; parallelized and out-of-core computations with dask.
    ECMWF NetCDF dialect: >>> import xarray as xr >>> ta_era5 = xr.open_dataset('ERA5-t-2016-06.nc', chunks={}).t >>> ta_era5 <xarray.DataArray 't' (time: 60, level: 3, latitude: 241, longitude: 480)> dask.array<open_dataset-..., shape=(60, 3, 241, 480), dtype=float64, chunksize=(60, 3, Coordinates: * longitude (longitude) float32 0.0 0.75 1.5 2.25 3.0 3.75 4.5 5.25 6.0 ... * latitude (latitude) float32 90.0 89.25 88.5 87.75 87.0 86.25 85.5 ... * level (level) int32 250 500 850 * time (time) datetime64[ns] 2017-06-01 2017-06-01T12:00:00 ..
  • Development of a Coupling Approach for Multi-Physics Analyses of Fusion Reactors
    Development of a coupling approach for multi-physics analyses of fusion reactors. Dissertation approved by the Faculty of Mechanical Engineering of the Karlsruhe Institute of Technology (KIT) for the academic degree of Doktor der Ingenieurwissenschaften (Dr.-Ing.), by Yuefeng Qiu. Date of the oral examination: 12.05.2016. Referee: Prof. Dr. Stieglitz; co-referee: Prof. Dr. Möslang. This document is licensed under the Creative Commons Attribution – Share Alike 3.0 DE License (CC BY-SA 3.0 DE): http://creativecommons.org/licenses/by-sa/3.0/de/.
    Abstract: Fusion reactors are complex systems which are built of many complex components and sub-systems with irregular geometries. Their design involves many interdependent multi-physics problems which require coupled neutronic, thermal hydraulic (TH) and structural mechanical (SM) analyses. In this work, an integrated system has been developed to achieve coupled multi-physics analyses of complex fusion reactor systems. An advanced Monte Carlo (MC) modeling approach has been first developed for converting complex models to MC models with hybrid constructive solid and unstructured mesh geometries. A Tessellation-Tetrahedralization approach has been proposed for generating accurate and efficient unstructured meshes for describing MC models. For coupled multi-physics analyses, a high-fidelity coupling approach has been developed for the physically conservative data mapping from MC meshes to TH and SM meshes. Interfaces have been implemented for the MC codes MCNP5/6, TRIPOLI-4 and Geant4, the CFD codes CFX and Fluent, and the FE analysis platform ANSYS Workbench. Furthermore, these approaches have been implemented and integrated into the SALOME simulation platform. Therefore, a coupling system has been developed, which covers the entire analysis cycle of CAD design, neutronic, TH and SM analyses.
  • MATLAB Tutorial - Input and Output (I/O)
    MATLAB Tutorial - Input and Output (I/O). Mathieu Dever. NOTE: at any point, typing help functionname in the Command window will give you a description and examples for the specified function. 1 Importing data: There are many different ways to import data into MATLAB, mostly depending on the format of the datafile. Below are a few tips on how to import the most common data formats into MATLAB. • MATLAB format (*.mat files): this is the easiest to import, as it is already in MATLAB format. The function load(filename) will do the trick. • ASCII format (*.txt, *.csv, etc.): an easy strategy is to use MATLAB's GUI for data import. To do that, right-click on the datafile in the "Current Folder" window in MATLAB and select "Import data ...". The GUI gives you a chance to select the appropriate delimiter (space, tab, comma, etc.), the range of data you want to extract, how to deal with missing values, etc. Once you have selected the appropriate options, you can either import the data, or you can generate a script that imports the data. This is very useful in showing the low-level code that takes care of reading the data. You will notice that textscan is the key function that reads the data into MATLAB. Higher-level functions exist for the different data types: csvread, xlsread, etc. As a rule of thumb, it is preferable to use lower-level functions as they are "simpler" to understand (i.e., no bells and whistles). "Black-box" functions are dangerous! • NetCDF format (*.nc, *.cdf): MATLAB comes with a netcdf library that includes all the functions necessary to read netcdf files.
  • R Programming for Climate Data Analysis and Visualization: Computing and Plotting for NOAA Data Applications
    Citation of this work: Shen, S.S.P., 2017: R Programming for Climate Data Analysis and Visualization: Computing and plotting for NOAA data applications. The first revised edition. San Diego State University, San Diego, USA, 152pp. R Programming for Climate Data Analysis and Visualization: Computing and plotting for NOAA data applications SAMUEL S.P. SHEN Samuel S.P. Shen San Diego State University, and Scripps Institution of Oceanography, University of California, San Diego USA Contents Preface page vi Acknowledgements viii 1 Basics of R Programming 1 1.1 Download and install R and R-Studio 1 1.2 R Tutorial 2 1.2.1 R as a smart calculator 3 1.2.2 Define a sequence in R 4 1.2.3 Define a function in R 5 1.2.4 Plot with R 5 1.2.5 Symbolic calculations by R 6 1.2.6 Vectors and matrices 7 1.2.7 Simple statistics by R 10 1.3 Online Tutorials 12 1.3.1 Youtube tutorial: for true beginners 12 1.3.2 Youtube tutorial: for some basic statistical summaries 12 1.3.3 Youtube tutorial: Input data by reading a csv file into R 12 References 14 Exercises 14 2 R Analysis of Incomplete Climate Data 17 2.1 The missing data problem 17 2.2 Read NOAAGlobalTemp and form the space-time data matrix 19 2.2.1 Read the downloaded data 19 2.2.2 Plot the temperature data map of a given month 20 2.2.3 Extract the data for a specified region 22 2.2.4 Extract data from only one grid box 23 2.3 Spatial averages and their trends 23 2.3.1 Compute and plot the global area-weighted average of monthly data 23 2.3.2 Percent coverage of the NOAAGlobalTemp 24 2.3.3 Difference from the
  • Oracle Solaris Containers Data Sheet
    Oracle Data Sheet: Oracle Solaris 8 Containers and Oracle Solaris 9 Containers. Key benefits: quickly deploy Oracle Solaris 8 or Oracle Solaris 9 applications on Oracle Solaris 10; lower costs through consolidation onto newer systems; mitigate risks with expert assistance; implement your solution for a single application or as part of a larger virtualization strategy.
    Oracle Solaris 8 Containers and Oracle Solaris 9 Containers let you migrate Oracle Solaris 8 and Oracle Solaris 9 applications and environments to the latest SPARC systems running Oracle Solaris 10. These environments can later be updated, at your own pace, to Oracle Solaris 10 Containers. With migration assistance from Oracle Advanced Customer Services, take advantage of Oracle Solaris 8 Containers and Oracle Solaris 9 Containers to reduce the effort and risk associated with migrating to the latest network computing environments.
    Deploy Previous-Version Oracle Solaris Applications on Oracle Solaris 10: Oracle Solaris 8 Containers and Oracle Solaris 9 Containers serve as transition tools, decoupling your hardware upgrade from the operating system (OS) and applications, giving you more time to upgrade your applications to Oracle Solaris 10. Oracle guarantees application compatibility between OS releases, and builds on that by adding the ability to capture and transfer the surrounding configuration, data, and other elements of the OS. You can rapidly upgrade to new Oracle hardware and Oracle Solaris 10 while providing the Oracle Solaris 8 or Oracle Solaris 9 runtime environment as needed. Lower Costs by Consolidating Systems: Today's systems can be many times more powerful, space-efficient, and cost-efficient than earlier generations. However, hardware upgrades often demand OS as well as application upgrades.
  • Developing for Industrial IoT with Embedded Linux OS on DragonBoard™ 410c by Timesys University
    Qualcomm Developer Network presents: Developing for Industrial IoT with Embedded Linux OS on DragonBoard™ 410c, by Timesys University. Co-sponsored by Qualcomm Technologies, Inc. and Arrow Electronics. Session 3: Building a Cutting-Edge User Interface with Qt®. Maciej Halasz, Vice President of Technology, Timesys Corporation; Maurice Kalinowski, Principal Software Engineer, Qt Company.
    Webinar series: Session 1: Introduction to DragonBoard 410 SoC and Starting Development of Your Embedded Linux based "Industrial Internet of Things" (IIoT) Device (setup for designing IIoT products; how to assemble and deploy an initial BSP). Session 2: Application Development for Embedded Linux (application development environment setup; how to reflect product requirements in the BSP; communication in the IIoT system). Session 3: Building a Cutting-Edge User Interface with Qt® (developing modern, rich UIs for factory terminals). Session 4: Embedded Products Security (designing security-rich devices).
    Session 2 recap, what we did: reflected API requirements in the OpenEmbedded RPB Linux BSP; talked about BSP customizations (new meta-layer, new recipe, modified image). Application development: SDK setup on a host; used an IDE to develop/deploy/debug code on the DragonBoard 410c; looked briefly at a BLE protocol; programmed an application that received info from a sensor board (temperature, luminosity, free fall). Key takeaways: it is very straightforward to develop C/C++ application code for the 410c; IDEs accelerate the development process; when working with the Linaro BSP, customers can leverage the meta layer library on OpenEmbedded (https://layers.openembedded.org). Session 3 agenda: sharing information in an IoT system ...
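
The "Using netCDF and HDF in ArcGIS" entry above notes that CF-aware tools recognize geographic coordinates by their units (degrees_east / degrees_north) and projected coordinates by a standard_name plus a grid_mapping attribute. As a hedged illustration of how such attributes get into a file with the standard netCDF C library (nc_create, nc_def_dim, nc_def_var, nc_put_att_text), here is a minimal sketch; the file name, dimension sizes, and variable names are made up for the example.

    /* Minimal sketch: declare CF-style geographic coordinate variables so
     * that CF-aware tools can detect the coordinate system. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>
    #include <netcdf.h>

    /* Abort with a readable message if a netCDF call fails. */
    static void check(int status)
    {
        if (status != NC_NOERR) {
            fprintf(stderr, "netCDF error: %s\n", nc_strerror(status));
            exit(1);
        }
    }

    int main(void)
    {
        int ncid, lon_dim, lat_dim, lon_var, lat_var, temp_var;
        int temp_dims[2];

        check(nc_create("cf_example.nc", NC_CLOBBER, &ncid));
        check(nc_def_dim(ncid, "lon", 360, &lon_dim));
        check(nc_def_dim(ncid, "lat", 180, &lat_dim));

        /* CF geographic coordinate variables are identified by their units.
         * For a projected coordinate system one would instead set
         * standard_name = "projection_x_coordinate" / "projection_y_coordinate"
         * and attach a grid_mapping attribute to the data variable. */
        check(nc_def_var(ncid, "lon", NC_FLOAT, 1, &lon_dim, &lon_var));
        check(nc_put_att_text(ncid, lon_var, "units",
                              strlen("degrees_east"), "degrees_east"));
        check(nc_def_var(ncid, "lat", NC_FLOAT, 1, &lat_dim, &lat_var));
        check(nc_put_att_text(ncid, lat_var, "units",
                              strlen("degrees_north"), "degrees_north"));

        /* A 2D data variable on (lat, lon). */
        temp_dims[0] = lat_dim;
        temp_dims[1] = lon_dim;
        check(nc_def_var(ncid, "temperature", NC_FLOAT, 2, temp_dims, &temp_var));
        check(nc_put_att_text(ncid, temp_var, "units", strlen("K"), "K"));

        check(nc_enddef(ncid));
        /* Coordinate and data values would follow, e.g. with nc_put_var_float(). */
        check(nc_close(ncid));
        return 0;
    }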
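
The CGNS tutorial entry above mentions a "simple example" under basic usage. The sketch below, also hedged, shows roughly what the first calls of the CGNS Mid-Level Library look like in C (cg_open, cg_base_write, cg_close); the file name and base name are placeholders, and a real case would go on to write zones, coordinates, and solutions (cg_zone_write, cg_coord_write, and so on), for which the tutorial itself is the reference.

    /* Minimal sketch of a CGNS "hello world": open a new CGNS file and
     * create one base node. Names are illustrative. */
    #include <cgnslib.h>

    int main(void)
    {
        int file_index, base_index;
        int cell_dim = 3;   /* dimensionality of the cells (3D volume cells) */
        int phys_dim = 3;   /* dimensionality of the coordinate space */

        if (cg_open("example.cgns", CG_MODE_WRITE, &file_index))
            cg_error_exit();

        if (cg_base_write(file_index, "Base", cell_dim, phys_dim, &base_index))
            cg_error_exit();

        if (cg_close(file_index))
            cg_error_exit();
        return 0;
    }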