AUTOMATED TEXTURE MAPPING OF LASER BASED RANGE IMAGES

by

RATTASAK SRISINROONGRUANG, B.S.

A THESIS IN COMPUTER SCIENCE

Submitted to the Graduate Faculty of Texas Tech University in Partial Fulfillment of the Requirements for the Degree of MASTER OF SCIENCE

Approved: Eric D. Sinzinger (Chairperson of the Committee), Gopal D. Lakhani, Hector J. Hernandez
Accepted: John Borrelli, Dean of the Graduate School
August, 2005

ACKNOWLEDGEMENTS

I would like to thank Mom and Dad for everything they have done and had to go through to give their children the best kind of life that they could. I only hope I will be as kind and giving as you throughout my life.

I would like to thank Dr. Eric Sinzinger for all the help he has given me during my research. The suggestions and advice made the completion of this work possible.

CONTENTS

ABSTRACT
LIST OF FIGURES
LIST OF TABLES
1 INTRODUCTION
2 RELATED WORK
  2.1 Texture Mapping
  2.2 Laser Range Data
  2.3 Data Registration
  2.4 Image Segmentation
3 BACKGROUND
  3.1 3D Transformations
  3.2 Camera Model
  3.3 Texture Mapping Overview
  3.4 Texture Mapping Techniques
  3.5 Texture Mapping Types
  3.6 Texture Mapping Effects
  3.7 Aliasing and Filtering
  3.8 Image Segmentation
4 METHODOLOGY
  4.1 Automated Mesh Alignment
  4.2 Stencil Calculation
    4.2.1 Translation Alignment
    4.2.2 Scale Alignment
    4.2.3 Rotation Alignment
    4.2.4 Alignment Metric
    4.2.5 Field of View Alignment
    4.2.6 Combined Transform Alignment
  4.3 Texture Coordinate Mapping
    4.3.1 Orthographic Projection
    4.3.2 Perspective Projection
5 RESULTS
6 CONCLUSION AND FUTURE WORK
  6.1 Advantages and Disadvantages
  6.2 Future Work and Improvements
    6.2.1 Stencil calculations
    6.2.2 Extended borders
    6.2.3 Lighting and shading
REFERENCES
ABSTRACT

Texture mapping is the process of applying a 2D image onto a 3D planar surface. This requires the generation of a mapping that defines the relationship between the 2D coordinates of the image and the 3D coordinates of the surface. The goal of this research is to provide a method of automatically generating this mapping given a 3D object at arbitrary orientation and a 2D image that may contain unwanted background information. A review of the current methods of texture mapping, image segmentation, and basic 3D viewing transforms is given. An algorithm to compute this alignment given the proper segmentation of the 2D image is then proposed and tested with five different models. The results of the generated alignment and mapping are then discussed, showing the level of accuracy of the final texture mapped model.

LIST OF FIGURES

3.1 Viewing frustum
3.2 Texture Mapping Example
3.3 Segmentation using clustering with color. Reprinted from "Computer Vision: A Modern Approach," by Forsyth and Ponce, Prentice Hall, 2003.
3.4 Segmentation using clustering with color and position. Reprinted from "Computer Vision: A Modern Approach," by Forsyth and Ponce, Prentice Hall, 2003.
5.1 Segmentation Images
5.2 Teapot Alignment Results for δr = 45°
5.3 Teapot Alignment Results for δr = 20°
5.4 Teapot Initial Orientation and Result Error for δr = 20°
5.5 Face Initial Orientation and Result Error for δr = 20°
5.6 Face Alignment Results for δr = 45°
5.7 Face Alignment Results for δr = 20°
5.8 Mechanical Part Initial Orientation and Result Error for δr = 20°
5.9 Mechanical Part Alignment Results for δr = 45°
5.10 Mechanical Part Alignment Results for δr = 20°
5.11 Cessna Initial Orientation and Result Error for δr = 20°
5.12 Cessna Alignment Results for δr = 45°
5.13 Cessna Alignment Results for δr = 20°
5.14 Will Rogers Initial Orientation and Result Error for δr = 20°
5.15 Will Rogers Alignment Results for δr = 45°
5.16 Will Rogers Alignment Results for δr = 20°

LIST OF TABLES

5.1 Model Sizes
5.2 Alignment Calculations with δr = 45°
5.3 Alignment Calculations with δr = 20°

CHAPTER 1
INTRODUCTION

Texture mapping has become an integral component of computer generated scenes, whether in movies, games, or other forms of graphical rendering. Previously, when meshes composed of thousands of polygons were considered complex, manual assignment of texture coordinates to 3D model coordinates, though time consuming, was a manageable task. With increased processing power and memory, however, 3D models can now be composed of hundreds of thousands of polygons. An automated method of assigning texture coordinates with quick visual feedback would expedite the artistic pipeline.

Texture mapping is the process of applying an image (usually 2D) to a planar surface (usually embedded in 3D space). This can be used to increase the visual interest and complexity of a scene without adding geometric data. The most common component to add is surface color. However, texture mapping is also used to apply lighting, shadow, reflection, and surface irregularity effects onto a surface. The process of applying and warping a texture on a planar surface is computationally inexpensive compared to the cost of transforming a geometrically complex scene. This project involves the automated mapping of textures to 3D geometric data, requiring as little user input as possible.
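The core relationship described above, in which a 2D texture coordinate attached to a surface point indexes into an image, can be sketched as a simple nearest-texel lookup. The tiny color grid below is a hypothetical stand-in for a real texture image, not data from this thesis:

```python
def sample_nearest(texture, u, v):
    """Look up the texel nearest to normalized coordinates (u, v) in [0, 1]."""
    h = len(texture)
    w = len(texture[0])
    # Clamp coordinates into the valid range, then map to a texel index.
    u = min(max(u, 0.0), 1.0)
    v = min(max(v, 0.0), 1.0)
    x = min(int(u * w), w - 1)
    y = min(int(v * h), h - 1)
    return texture[y][x]

# A hypothetical 2x2 "texture" of color names (row 0 is the top of the image).
tex = [["red", "green"],
       ["blue", "white"]]
print(sample_nearest(tex, 0.1, 0.9))  # bottom-left region -> "blue"
```

In a renderer the same lookup runs per pixel, with (u, v) interpolated across each polygon from the per-vertex assignments that this research aims to generate automatically.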
The traditional process of applying a non-tiled texture map to a 3D model requires extensive user input to select the proper "binding" of texture coordinate points to 3D model vertices. This binding is traditionally created in one of two ways. One method requires the user to manually assign each vertex in the model to a specific point in the 2D texture. For increasingly complex 3D models, this process takes a long time. The other method is "reverse-skinning", which involves obtaining the unwrapped "outline" of the 3D model on a 2D texture. The user can then apply color properties onto the 2D texture as desired, knowing where each point on the 2D texture is mapped on the 3D model. Both methods require significant user input and can take a long time to accomplish. In the former case, the process does not scale well with increasing geometric complexity in terms of the amount of input needed from the user. In the latter case, working around the 3D model outline usually means a stock piece of texture cannot be applied to the model, as the outline (usually generated automatically by an unwrapping tool) will have an orientation not in alignment with the texture of the object.

If the process of texture mapping can be automated as proposed in this research, the time needed to correctly apply a texture to a 3D model may be significantly reduced, and the problems of the two traditional methods discussed above can be mitigated. This automated process should scale well with increased geometric complexity of models in terms of time, because the user will not be required to manually set the mapping between texture points and model points for every vertex. The issue with "reverse-skinning", where the orientation of the model outline does not align with the stock texture, may also be eliminated.
Given a stock texture containing an image of the object whose 3D representation the user wishes to have mapped, this new process of texture mapping would ideally determine the transformation needed to align the 3D model so that its orientation closely matches that present in the texture. From this, the mapping of texture coordinates to 3D model vertices can be automatically computed.

CHAPTER 2
RELATED WORK

2.1 Texture Mapping

Texture mapping is the process by which a 2D texture is applied to a 3D object. The process involves the parametrization of a surface and the subsequent application of the texture onto the surface [3]. Texture mapping gained popularity because of its ability to add realism and interest to a scene without the increased geometric complexity that would lead to significantly longer processing time. Such effects include color, reflection, shadow, and surface irregularities. The reflection effect is not a true reflection; it requires an environment map that is "wrapped" around the 3D object as rays are cast outward from the object to determine the intersection point with the environment map [3]. Surface irregularities are modeled with a bump map that defines surface normal offsets. These offsets are used during lighting calculations to give the illusion of surface irregularities [2].

Combined transforms of 3D points are facilitated by encoding the transformations as matrix operations. This requires that 3D points be represented as the 4 components of a homogeneous coordinate system so that the points can be manipulated by multiplication with the transformation matrix [15]. With the advent of modern graphics processing units that allow efficient matrix operations, this has the added bonus of allowing the transformations to be done on specialized hardware.
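The matrix encoding described above can be sketched in a few lines: a 3D point becomes the homogeneous 4-vector [x, y, z, 1], individual transforms become 4x4 matrices, and composing transforms is just matrix multiplication. This is an illustrative sketch, not code from the thesis:

```python
def mat_mul(A, B):
    """Multiply two 4x4 matrices (composes the transforms they encode)."""
    return [[sum(A[i][k] * B[k][j] for k in range(4)) for j in range(4)]
            for i in range(4)]

def transform(M, p):
    """Apply a 4x4 matrix to 3D point p via homogeneous coordinates."""
    x, y, z = p
    v = [x, y, z, 1.0]
    out = [sum(M[i][k] * v[k] for k in range(4)) for i in range(4)]
    # Divide by the homogeneous component to return to 3D.
    return [out[0] / out[3], out[1] / out[3], out[2] / out[3]]

def translation(tx, ty, tz):
    return [[1, 0, 0, tx], [0, 1, 0, ty], [0, 0, 1, tz], [0, 0, 0, 1]]

def scale(sx, sy, sz):
    return [[sx, 0, 0, 0], [0, sy, 0, 0], [0, 0, sz, 0], [0, 0, 0, 1]]

# Scale then translate, folded into a single combined matrix.
M = mat_mul(translation(1, 2, 3), scale(2, 2, 2))
print(transform(M, (1, 1, 1)))  # scales to (2, 2, 2), then shifts to [3.0, 4.0, 5.0]
```

Note that translation cannot be expressed as a 3x3 matrix acting on plain (x, y, z); the fourth homogeneous component is what lets all of these transforms share one uniform matrix representation.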
The most common type of texture mapping is perspective texture mapping, a transformation related to the perspective transformation used when manipulating the point of view [8]. This type of mapping limits the distortions apparent when using an affine texture mapping transformation.

2.2 Laser Range Data

Laser range scanner devices emit a beam that is transmitted and recaptured to determine the distance to an object.
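The distortion difference between affine and perspective texture mapping noted in Section 2.1 can be illustrated by interpolating a texture coordinate u along an edge whose endpoints lie at different depths w. The standard perspective-correct scheme interpolates u/w and 1/w linearly and divides at the end; the numbers below are illustrative, not results from this thesis:

```python
def affine_u(u0, u1, t):
    """Affine interpolation: linear in screen space, ignores depth."""
    return (1 - t) * u0 + t * u1

def perspective_u(u0, w0, u1, w1, t):
    """Perspective-correct interpolation: linear in u/w and 1/w."""
    num = (1 - t) * (u0 / w0) + t * (u1 / w1)
    den = (1 - t) * (1 / w0) + t * (1 / w1)
    return num / den

# Midpoint of an edge whose far endpoint is 4x deeper than the near one:
print(affine_u(0.0, 1.0, 0.5))                  # 0.5, depth plays no role
print(perspective_u(0.0, 1.0, 1.0, 4.0, 0.5))   # 0.2, biased toward the near end
```

The gap between the two values (0.5 versus 0.2 here) is exactly the affine distortion that perspective texture mapping removes: with affine interpolation, texels on the far half of a receding surface are stretched incorrectly.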