
TetFusion: An Algorithm For Rapid Tetrahedral Mesh Simplification

Prashant Chopra and Joerg Meyer*
Mississippi State University Engineering Research Center

Figure 1: Four macroLoDs (defined in section 3.1) of the 12,936-element spx dataset, created in less than 1 second on an SGI R10000 194 MHz with 2048 MB RAM. (a) The original mesh (b) 20.06% reduced (c) 31.20% reduced (d) 38.29% reduced. Only a portion of the mesh (cells that intersect a vertical cutting plane in the XY plane at a specific Z value) is rendered to show the interior elements. (Dataset courtesy: Peter Williams, Lawrence Livermore National Laboratory.)

Abstract

This paper introduces an algorithm for rapid progressive simplification of tetrahedral meshes: TetFusion. We describe how a simple geometry decimation operation steers a rapid and controlled progressive simplification of tetrahedral meshes, while also taking care of complex mesh-inconsistency problems. The algorithm features a high decimation ratio per step, and inherently discourages cases of self-intersection of the boundary, element-boundary intersection at concave boundary regions, and negative tetrahedra (flipping). We achieved rigorous reduction ratios of up to 98% for meshes consisting of 827,904 elements in less than 2 minutes, progressing through a series of level-of-details (LoDs) of the mesh in a controlled manner. We describe how the approach supports a balanced redistribution of space between tetrahedral elements, and explain some useful control parameters that make it faster and more intuitive than 'edge collapse'-based decimation methods for volumetric meshes [3, 18, 20, 21]. Finally, we discuss how this approach can be employed for rapid LoD prototyping of large time-varying datasets as an aid to interactive visualization.

CR Categories: I.3.5 [Computer Graphics]: Computational Geometry and Object Modeling – surfaces and object representations.

Keywords: mesh simplification, multi-resolution, level-of-detail, unstructured meshes.

____________________________
* {prash | jmeyer}@erc.msstate.edu
  http://www.cs.msstate.edu/~graphics

1 INTRODUCTION

Large and highly detailed volumetric models are very common now in research environments due to improvements in hardware technology and computational methods. Lately, they have found their place in research areas like computational vector field simulation, finite element analysis, medical imaging, range scanning, and large-scale visualization. Scientists employ a number of techniques like contour detection and iso-contouring, contour connection, iso-surface generation, voxel-based volume tessellation, etc., to create these models, mostly to simulate a natural or engineering phenomenon. However, since these techniques do not always take the requirements for efficient rendering into account during the creation stages, the resulting models often turn out to be very complex in terms of the number of geometric primitives and their spatial relationships. Thus, when these models are broken down into rendering primitives like triangles or quadrilaterals for the purposes of visualization, the large number of primitives poses a challenge to interactive rendering. Each of these primitives might have associated attributes like scalar values, vector tuples, etc., as results of scientific simulations [2, 11, 12], or direct visual attributes like color, transparency, texture, normal vector information, etc. It is well known that such meshes are often inefficient in terms of storage, access, and transmission over computer network media.

To add to these complexities, some scientific and engineering simulations output four- or higher-dimensional (e.g., time-varying) datasets in the form of higher-order simplicial complexes (polyhedral meshes), both homogeneous and heterogeneous. The most common examples of homogeneous polyhedral meshes are tetrahedral grids, because of factors like the convex shape of the tetrahedral elements, which makes them suitable for volume rendering. A typical example is an earthquake simulation dataset (11,800,639 vertices, 69,448,288 tetrahedra), which takes about 3.3 gigabytes of storage space just for the basic node geometry and the tetrahedral connectivity information, plus an additional 141 megabytes of per-node vector information for each of the 120 time-steps. Visualizing such large-scale volumetric meshes with time-varying attributes interactively is still an open research arena.
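For concreteness, the following is a minimal sketch of an indexed tetrahedral mesh carrying per-node simulation attributes of the kind described above. This is our own illustration rather than a structure taken from the paper or from any particular file format; the names TetMesh, scalars, and vectors are ours.

```python
from dataclasses import dataclass, field
from typing import List, Tuple

Vec3 = Tuple[float, float, float]

@dataclass
class TetMesh:
    # Node geometry: one (x, y, z) position per vertex.
    vertices: List[Vec3]
    # Connectivity: four vertex indices per tetrahedral element.
    tetrahedra: List[Tuple[int, int, int, int]]
    # Optional per-node attributes, e.g. simulation output.
    scalars: List[float] = field(default_factory=list)
    vectors: List[Vec3] = field(default_factory=list)

    def num_elements(self) -> int:
        return len(self.tetrahedra)

# A single tetrahedron with a scalar value at each node:
mesh = TetMesh(
    vertices=[(0.0, 0.0, 0.0), (1.0, 0.0, 0.0), (0.0, 1.0, 0.0), (0.0, 0.0, 1.0)],
    tetrahedra=[(0, 1, 2, 3)],
    scalars=[0.0, 0.25, 0.5, 1.0],
)
```

Every additional time-step multiplies the attribute arrays while the geometry and connectivity stay fixed, which is why the storage figures quoted above grow so quickly.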
Mesh simplification has been addressed in a number of publications in recent years as a preprocessing step to reduce the geometric complexity of both polygonal and polyhedral meshes. Both refinement- and decimation-based strategies have been explored. Most of these publications extend 'edge collapse'-based decimation methods to volumetric meshes, focusing on accurate error evaluation metrics [3, 21], while also preventing mesh inconsistencies [3, 18, 21]. These methods work very well as metrics-guided simplification tools. However, the time complexities of these algorithms (see section 2.2) discourage their application to massive datasets on the order of one million or more elements.

This paper describes a rapid tetrahedral mesh simplification framework that enforces constraints on mesh consistency and bounds the geometric and attribute-field errors, while also providing better spatial control over LoDs. Specifically, we employ a symmetric reduction operation, TetFuse, which suits the natural redistribution of volume among a reduced number of elements at every decimation step. We present TetFusion as a progressive simplification method to create macroLoDs of both static and time-varying tetrahedral meshes. Finally, we discuss how this approach can assist in interactive visualization of time-varying volumetric datasets of tetrahedral nature.
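TetFuse is defined in detail in section 3; as a rough preview, a single fusion step can be sketched as follows. This is our own illustrative code, assuming the four vertices of the target element are fused at its barycenter; the error bounds and mesh-consistency tests that the algorithm applies before accepting a fusion are omitted, and the function name tet_fuse and its signature are ours.

```python
def tet_fuse(vertices, tets, target):
    """Fuse the four vertices of tets[target] at their barycenter.

    vertices -- list of [x, y, z] node positions (modified in place)
    tets     -- list of 4-tuples of vertex indices
    Returns the decimated tetrahedra list.
    """
    fused = tets[target]  # indices of the four vertices being fused
    bx, by, bz = (sum(vertices[v][k] for v in fused) / 4.0 for k in range(3))

    # Keep one representative index, move it to the barycenter, and
    # redirect the other three fused vertices to it.
    rep = fused[0]
    vertices[rep] = [bx, by, bz]
    remap = {v: rep for v in fused}

    new_tets = []
    for tet in tets:
        mapped = tuple(remap.get(v, v) for v in tet)
        # An element that referenced two or more fused vertices now has
        # repeated indices: it is degenerate and is deleted. Elements
        # sharing exactly one fused vertex survive, stretched toward
        # the barycenter.
        if len(set(mapped)) == 4:
            new_tets.append(mapped)
    return new_tets
```

Because every element sharing an edge or a face with the target disappears in the same step, several tetrahedra are decimated per operation, which is consistent with the high decimation ratio per step claimed in the abstract.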
2 RELATED WORK

We classify recent publications in the area of mesh simplification broadly into two categories: those based upon polygonal mesh reduction techniques, and those employing polyhedral mesh reduction techniques.

2.1 Polygonal Mesh Simplification

The class of algorithms that simplify polygonal (2-manifold) surface meshes falls into this category. Most of the work discussed here is based on geometry reduction techniques that eliminate one or more geometric primitives at a time based on specific constraining criteria, and then consolidate the mesh using a smaller number of primitives. The area of polygonal mesh simplification has seen some significant improvements lately.

Schroeder et al. [16] propose a simple multi-pass vertex decimation algorithm for triangular meshes. The hole resulting from a vertex removal is re-triangulated with a smaller number of triangles. Hinker and Hansen [7] merge coplanar and nearly coplanar faces over the mesh in form of patches, which are re-triangulated with a smaller number of primitives. Their greedy face-merging algorithm has a time complexity of O(N), N being the number of polygon faces in the original mesh. Further, this algorithm is more effective in reducing the polygon count of a given polygonal mesh, because it decimates multiple primitives per step.

Hugues Hoppe [8] proposes a novel progressive continuous-resolution representation of 2-manifold triangular meshes. He claims that of the three basic primitive mesh-simplification operations (edge collapse, edge split, and edge swap), an 'edge collapse' is all that is needed for effective progressive simplification of meshes. The primitive operation of 'edge collapse' is presented as a completely reversible operation with 'vertex split' as its counter-operation. This work shows how a mesh can be represented as a base mesh (the simplified version) and a series of 'vertex split' records that can be applied to refine the base mesh back to the original representation at full resolution. This representation permits geomorphing and progressive transmission, along with significant compression and support for selective refinement.

Schroeder [17] extends his previous work on triangle mesh decimation [16]. He proposes a new algorithm that guarantees a specified level of reduction, but modifies the topology while performing local decimation to achieve the result. He includes two additional primitive operations in the algorithm: vertex split and vertex merge. Compression ratios as high as 200:1 can be achieved for some models. Finally, Garland and Heckbert [5] suggest a more organized way of representing error metrics in terms of quadrics for efficient calculation of a hierarchy for view-dependent simplification using edge collapses.

2.2 Polyhedral Mesh Simplification

Only a few attempts have been made towards polyhedral mesh simplification lately. Both refinement- and decimation-based strategies have been explored in this pursuit. For a detailed note on these strategies, we suggest a reference to the survey report on multi-resolution modeling by M. Garland [4]. This section focuses specifically on decimation strategies because they closely relate to our methods described in the next section.

Trotts et al. [20] extend polygonal geometry reduction techniques to tetrahedral meshes. The authors propose a tetrahedral collapse operation as a sequence of three edge collapses, while keeping the overall error (based on a unique spline-segment representation of each tetrahedron) below a tolerance range. The paper discusses problems and difficulties specific to tetrahedral mesh simplification, and presents a framework that employs edge collapse to decimate tetrahedra. Because the decimation strategy is based upon successive 'edge
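To make the reversible primitive discussed in section 2.1 concrete, here is a small sketch; it is our own code, not Hoppe's implementation. For simplicity it performs a half-edge collapse in which vertex v is merged into vertex u and u keeps its position, whereas the operation in [8] may also reposition the surviving vertex.

```python
def edge_collapse(triangles, u, v):
    """Collapse edge (u, v) by merging v into u.

    Returns the new triangle list and a record that vertex_split can
    use to undo the operation exactly.
    """
    record = {
        "v": v,
        # Triangles containing both endpoints become degenerate and vanish.
        "removed": [t for t in triangles if u in t and v in t],
        # Triangles containing only v must be re-pointed to u.
        "repointed": [t for t in triangles if v in t and u not in t],
    }
    survivors = [t for t in triangles if not (u in t and v in t)]
    collapsed = [tuple(u if i == v else i for i in t) for t in survivors]
    return collapsed, record

def vertex_split(triangles, u, record):
    """Reverse an edge_collapse: restore v and the deleted triangles."""
    v = record["v"]
    undo = {tuple(u if i == v else i for i in t): t for t in record["repointed"]}
    return [undo.get(t, t) for t in triangles] + record["removed"]

# Collapse and then exactly restore a small fan of triangles:
tris = [(0, 1, 2), (1, 2, 3), (2, 3, 4)]
coarse, rec = edge_collapse(tris, u=1, v=2)      # -> [(1, 3, 4)]
assert sorted(vertex_split(coarse, 1, rec)) == sorted(tris)
```

A progressive-mesh stream in the sense of [8] then amounts to a base mesh plus the stored split records, replayed to recover finer levels of detail.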