ABSTRACT

LIGHTMAP GENERATION AND PARAMETERIZATION FOR REAL-TIME 3D INFRA-RED SCENES

by Meisam Amjad

Having high-resolution Infra-Red (IR) imagery of a cluttered battlespace environment is crucial for capturing intelligence in search and target acquisition tasks, such as determining whether a vehicle (or any heat source) has been moved or used, and in which direction. While 3D graphics simulation of large scenes helps with retrieving information and training analysts, traditional 3D rendering techniques are not enough: an additional parameter must be solved because visibility has a different meaning in IR scenes. In 3D rendering of IR scenes, what a participant of the simulation can currently see does not depend only on the thermal energy emitted by objects; visibility also depends on previous scenes, because thermal energy is slowly retained and diffused over time. Therefore, time must be included as an additional factor, since the aggregation of heat energy in the scene depends on its past. Our solution uses lightmaps to store the energy that reaches surfaces over time. We modify the lightmaps to solve the problem of lightmap parameterization between 3D surfaces and a 2D mapping, and add the ability to periodically update only the necessary areas based on the dynamic aspects of the scene.

LIGHTMAP GENERATION AND PARAMETERIZATION IN REAL-TIME 3D INFRA-RED SCENES

A Thesis

Submitted to the Faculty of Miami University in partial fulfillment of the requirements for the degree of Master of Science

by Meisam Amjad
Miami University
Oxford, Ohio
2019

Advisor: Dr. John Femiani
Reader: Dr. Eric Bachmann
Reader: Dr. Vijayalakshmi Ramasamy

© 2019 Meisam Amjad

This Thesis titled LIGHTMAP GENERATION AND PARAMETERIZATION IN REAL-TIME 3D INFRA-RED SCENES by Meisam Amjad has been approved for publication by the College of Arts and Science and the Department of Computer Science and Software Engineering.

Dr. John Femiani
Dr. Eric Bachmann
Dr. Vijayalakshmi Ramasamy

Contents

LIST OF TABLES
LIST OF FIGURES
ACKNOWLEDGEMENTS
1 INTRODUCTION
2 CONTRIBUTION
3 BACKGROUND
  3.1 Parameterization
    3.1.1 Non-shape-preserving embedding
    3.1.2 Angle preserving or conformal parameterization
    3.1.3 Distance preserving parameterization
    3.1.4 Area preserving parameterization
    3.1.5 Trade-off between metrics
  3.2 Cutting
    3.2.1 Mesh segmentation
    3.2.2 Chart packing
    3.2.3 Seam Cutting
4 AREA TARGETING PARAMETERIZATION
  4.1 Generating Initial Mesh
    4.1.1 Mesh Data Structure
    4.1.2 Identifying Non-Manifold errors
    4.1.3 Identifying Seams
    4.1.4 Merging patches and scaling
  4.2 Per-part area-preserving parameterization
    4.2.1 Detecting outer boundary for mapping to a disk
    4.2.2 Area-preserving parameterization of the mesh
5 EVALUATION
  5.1 Metrics
    5.1.1 Metrics for a Complete System
    5.1.2 Metrics for the Parametrization Component
  5.2 Experiments
    5.2.1 Testing different objects
    5.2.2 Testing in a dynamic scene
  5.3 Results
6 CONCLUSION
References
Appendices
Appendix: LIGHTMAPPER LIBRARY

List of Tables

3.1 Summary of basic approaches in parameterization
5.1 Result of parameterization for different 3D objects
5.2 Result of simulating a dynamic scene where a heat source is moving

List of Figures

1.1 An Example of Thermal Inertia
1.2 Demonstration of the problem: Input and Desired Output
1.3 Solution overview
1.4 Example of results (generated lightmap)
1.5 Snapshot from the created library
4.1 Sample 3D object that is used for demonstration and explanation
4.2 Demonstration of a non-manifold edge
4.3 Demonstration of the first generated lightmap atlas
4.4 Demonstration of folded surfaces
4.5 Demonstration of overlapping
4.6 Demonstration of different kinds of edges
4.7 UV mesh before scaling in 3D space
4.8 Complete UV mesh after scaling in 3D space
4.9 A sample island for parameterization
4.10 Demonstration of different boundaries
4.11 Filled-in image of the island
4.12 Topological disk mesh in 3D and UV space
4.13 Three equal-area disk patch examples
4.14 Polygon explaining the weights used by Sibson's interpolant
4.15 Phantom triangles for relaxing the boundary
4.16 Island without added phantom triangles to relax the boundary
4.17 Squared error in area-preserving parameterization
4.18 Final UV mesh
4.19 Final result after generating the lightmap atlas for our 3D object
5.1 3D Cube object
5.2 Generated lightmap from the Cube before parameterization
5.3 Generated lightmap from the Cube before parameterization
5.4 Parameterized island from the Cube
5.5 3D Monkey object
5.6 Final lightmap for the Monkey
5.7 Errors after parameterizing two of the islands from the Monkey
5.8 3D Bunny object
5.9 Final lightmap for the Bunny
5.10 Errors after parameterizing two of the islands from the Bunny
5.11 Different frames of the created dynamic scene
ACKNOWLEDGEMENTS

I would first like to thank my advisor, Dr. John Femiani, for presenting me with this interesting problem and for many conversations providing critical guidance throughout the project. I also thank him for being patient and helping me to grow during my time as a graduate student. I also thank Drs. Eric Bachmann and Vijayalakshmi Ramasamy for taking the time to read this thesis and offer their insights, as well as for a great deal of friendly conversation and enjoyable classroom instruction during my time at Miami University. Last, but certainly not least, I offer a great deal of thanks to my wife, without whom I would have lacked crucial support and guidance in maintaining my interest and desire to finish this graduate program.

Chapter 1

INTRODUCTION

High-resolution IR imagery is currently acquired from a variety of mobile sensors, including cameras mounted on aircraft and drones. A key benefit of IR sensors is that they can capture thermal energy that is retained, diffused, and slowly re-emitted over time by a surface, a process we call thermal inertia. Thermal inertia makes it possible for an analyst looking at sensor imagery to make deductions such as whether machinery has been used recently or whether a vehicle has been moved, because the heat emitted, or blocked, by the object leaves a persistent thermal signature in the scene.

In order to train analysts, it is important to simulate IR sensors, including the effect of heat inertia, in physically accurate large scenes. Existing 3D simulation and training solutions can simulate IR sensors and allow operators to search for objects that are warmer or cooler than their surroundings. However, prior to this work we are unaware of any work that attempts to model the important effect of heat inertia.

A key challenge when rendering IR with thermal inertia is the visibility problem in computer graphics. For traditional 3D scene rendering, one need only solve the challenging problem of what can currently be seen by a participant in the simulation. For thermal inertia situations, an additional dimension of time must be included because the appearance of a scene depends on its past. Heat slowly diffuses through and between surface materials, so in order to predict the IR signature of a surface one must aggregate the effects of energy diffusion over a window in time.

Our solution uses an existing technique for rendering diffuse lighting, called a lightmap. A lightmap is a pre-computed irradiance texture that can be stored in memory and reused during rendering to capture complex diffuse lighting effects; we use it to store view-independent irradiance in a real-time 3D IR scene. For static scenes, it is useful to solve the radiosity equations to find a steady-state lighting solution that does not need to be modified unless the light sources change or the objects move. If the lighting in a scene changes, however, then the lightmap must be updated.

We propose to modify lightmaps to record and accumulate the incident energy that reaches a surface over time; unlike lightmaps used for radiosity, the thermal energy will not generally reach a steady state, because the simulations involve moving heat sources. In our approach, a real-time photon-mapping solution is used to determine the incoming thermal energy from dynamic objects, as well as the energy that is absorbed and re-emitted by static objects in the scene. The lightmap data is periodically updated based on the incoming heat energy from dynamic objects and a heat-diffusion equation. A key challenge of this approach is that the lightmap must be recomputed for all dynamic objects in a scene, not just the ones that are currently visible to a player in the simulation.
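To make this kind of update concrete, the following minimal sketch (not the thesis implementation) shows one way a thermal lightmap could accumulate incident energy and diffuse it between neighboring texels each step. The array names, the diffusion coefficient alpha, and the time step dt are illustrative assumptions.

```python
import numpy as np

def update_thermal_lightmap(energy, incident, alpha=0.1, dt=0.016):
    """One illustrative simulation step for a 2D thermal lightmap.

    energy   -- H x W array of stored thermal energy per texel
    incident -- H x W array of incoming thermal power per texel
                (e.g., deposited by a photon-mapping pass)
    alpha    -- diffusion coefficient (assumed value)
    dt       -- time step in seconds (assumed value)
    """
    # Accumulate newly arrived energy; the lightmap remembers the past.
    energy = energy + incident * dt

    # Explicit finite-difference Laplacian approximates heat diffusion
    # between neighboring texels (boundary texels are clamped for brevity).
    padded = np.pad(energy, 1, mode="edge")
    laplacian = (padded[:-2, 1:-1] + padded[2:, 1:-1] +
                 padded[1:-1, :-2] + padded[1:-1, 2:] - 4.0 * energy)
    return energy + alpha * dt * laplacian
```

In a full pipeline such an update would be restricted to lightmap regions flagged as affected by nearby dynamic heat sources, which is why the parameterization discussed next matters.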
Based on these challenges, we aim to solve the problem of generating a lightmap parameterization of a scene that associates 3D (XYZ) surfaces with unique 2D (UV) locations on a lightmap image, so that portions of the surface exposed to more thermal energy variation (closer to heat sources) occupy more lightmap area than portions of the surface that are far from the heat sources. Although the solution we discuss is based on thermal energy, this dynamic approach can be used whenever one wishes to control the photon density in lightmaps.
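As a rough illustration of what controlling photon density means, the hypothetical helper below distributes a fixed texel budget over mesh triangles, giving more lightmap area to triangles near heat sources. The inverse-distance weighting and all parameter names are assumptions made for this sketch; the actual area-targeting parameterization is developed in Chapter 4.

```python
import numpy as np

def target_texel_budget(tri_centroids, tri_areas, heat_sources,
                        total_texels, falloff=2.0):
    """Distribute a lightmap texel budget over triangles.

    tri_centroids -- N x 3 array of triangle centroids (XYZ)
    tri_areas     -- length-N array of triangle surface areas
    heat_sources  -- M x 3 array of heat-source positions (XYZ)
    total_texels  -- total number of lightmap texels to distribute
    falloff       -- exponent controlling how quickly importance decays
    """
    # Distance from each triangle to its nearest heat source.
    diff = tri_centroids[:, None, :] - heat_sources[None, :, :]
    nearest = np.linalg.norm(diff, axis=2).min(axis=1)

    # Importance grows as a surface patch gets closer to a heat source.
    importance = tri_areas / np.maximum(nearest, 1e-3) ** falloff

    # Each triangle's share of the texel budget; a UV parameterization
    # would then be driven toward these per-triangle target areas.
    return total_texels * importance / importance.sum()
```

The sketch only conveys the density-control goal; the thesis achieves it on the mesh itself by targeting per-triangle areas during UV parameterization rather than by post-hoc texel allocation.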