A Terrain Rendering Primer
Andrew Liounis and John Christian
LSIC Workshop on Lunar Mapping for Precision Landing, March 4, 2021

Outline
• Purpose
• Rendering Situations
• Rendering Types
• Optics Properties
• Geometric Camera Models
• Illumination Conditions
• General Considerations
• Conclusion

We want to equip you to make informed decisions about rendering software.
• Give a basic understanding of how the different rendering techniques work.
• Clarify how to determine what you need from a renderer.
• Point out commonly overlooked issues when choosing and using a renderer.
• NOT to recommend one rendering technique (or software) over another: different techniques excel in different areas.

We will use these terms in the presentation.
• Representative: able to generate images that realistically resemble what is expected to be "seen."
• Predictive: able to generate images that show what is expected to be "seen."

There are numerous situations in which we need to render synthetic images.
• Training human operators
  • Speed
  • Responsive
  • Representative
  • *Virtual reality
  (Image from https://spacecenter.org/how-nasa-uses-virtual-reality-to-train-astronauts/)
• Generating simulation data for training/testing algorithms
  • Representative
  • Truth scene
  • *Speed
  • *Responsiveness
• Generating predictive maps for navigation
  • Predictive
  • Scene control
  • Camera modelling
  • *Speed
  (Examples: Mars 2020 TRN map, OSIRIS-REx feature map. Mars image from https://astrogeology.usgs.gov/search/map/Mars/Mars2020/JEZ_ctx_B_soc_008_orthoMosaic_6m_Eqc_latTs0_lon0)

There are 3 primary categories for rendering synthetic images.
• Rasterization
  • Very fast, especially with GPU acceleration.
  • Produces decently representative images, especially from a human perspective.
  • Does not handle shadows well (though there are techniques that handle them better).
  • Can sometimes create artifacts with surfaces that intersect each other.
• Single Bounce Ray Tracing (ray casting)
  • Good at shadows (especially with collimated light).
  • Can produce predictive images, especially for atmosphere-less bodies.
  • Can be computationally expensive, leading to slow rendering speeds.
  • Occasionally creates rendering artifacts without the proper settings.
• Multi Bounce Ray Tracing (path tracing)
  • Good at shadows (including "soft" shadows).
  • Handles reflections.
  • Can handle bodies with atmospheres.
  • Very computationally expensive.
  • Can create rendering artifacts without proper settings.
  (Image rendered using GSFC Freespace)

In addition to geometry, we need to consider the optical system of the camera we are modelling.
• In a perfect system, all light coming from a single direction would be focused onto an infinitesimal point on a detector.
• Cameras are not perfect systems, and light from a single direction is spread out over an area on the detector, typically multiple pixels, leading to an effective loss of resolution.
• This blur is represented by the Modulation Transfer Function (MTF) in the frequency domain and by the Point Spread Function (PSF) in the spatial domain.
  (Image from https://www.edmundoptics.com/knowledge-center/application-notes/optics/introduction-to-modulation-transfer-function/)

We have 2 options for intensity calculations.
• Use a full model of the camera electronics: predicts the actual "DN" values the detector will record (the intensity of each pixel).
• Neglect the camera electronics: gives the relative intensity of each pixel (typically stretched to make use of the full dynamic range of the camera).
• The first option is generally only necessary for determining required exposure times and detector gain settings; the second is satisfactory for most other applications, since most algorithms normalize intensity gradients anyway.

When ray tracing (single or multi-bounce), it is beneficial to subsample each pixel.
• Pixels integrate all of the light within their individual fields of view; tracing rays is essentially a Riemann-sum approximation of that integral, so more samples give a more accurate approximation.
• Subsampling is particularly necessary in 2 cases: when the model resolution is much finer than the camera ground sample distance (GSD), and when doing multi-bounce stochastic ray tracing.
• Not using subsampling can lead to speckling artifacts, especially for rough surfaces.
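The per-pixel Riemann-sum idea above can be sketched in a few lines. This is a minimal illustration, not any particular renderer's implementation; `trace_ray` is a hypothetical stand-in for whatever routine returns the radiance along a single sub-ray.

```python
def render_pixel(trace_ray, px, py, n_sub=4):
    """Approximate the integral of radiance over one pixel's footprint
    by averaging an n_sub x n_sub grid of sub-ray samples (a Riemann sum).

    trace_ray is a hypothetical stand-in for the renderer's ray-casting
    routine: it takes a fractional pixel coordinate and returns radiance.
    """
    total = 0.0
    for i in range(n_sub):
        for j in range(n_sub):
            # Sample at the centers of a regular sub-grid inside the pixel.
            u = px + (i + 0.5) / n_sub
            v = py + (j + 0.5) / n_sub
            total += trace_ray(u, v)
    return total / n_sub**2  # mean radiance over the pixel footprint

# Toy radiance field: a sharp bright/dark edge crossing the pixel, as when
# model resolution >> camera GSD. A single center ray misses the structure
# entirely; 256 sub-rays approach the true 50% bright-area fraction.
edge = lambda u, v: 1.0 if u + v < 1.0 else 0.0
coarse = render_pixel(edge, 0.0, 0.0, n_sub=1)
fine = render_pixel(edge, 0.0, 0.0, n_sub=16)
```

The speckling artifacts mentioned above are exactly the `coarse` case: each pixel commits to a single, essentially random sample of a sub-pixel-scale surface.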
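Similarly, the PSF/MTF behavior described earlier is often folded into synthetic image generation by convolving the geometrically ideal rendered image with a PSF kernel. A minimal NumPy sketch, assuming a Gaussian-shaped PSF with an illustrative width (a real camera's PSF would come from an optical model or calibration, and need not be Gaussian):

```python
import numpy as np

def gaussian_psf_kernel(sigma_px, radius=None):
    """One-dimensional slice of a Gaussian PSF, normalized to unit sum."""
    if radius is None:
        radius = int(3 * sigma_px + 0.5)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-0.5 * (x / sigma_px) ** 2)
    return k / k.sum()

def apply_psf(ideal_image, sigma_px=1.2):
    """Blur an ideal (geometrically sharp) rendered image with a Gaussian
    PSF. Convolving with the PSF in the spatial domain is equivalent to
    multiplying by the MTF in the frequency domain. The Gaussian is
    separable, so we convolve rows, then columns."""
    k = gaussian_psf_kernel(sigma_px)
    rows = np.apply_along_axis(
        np.convolve, 1, ideal_image.astype(float), k, mode="same")
    return np.apply_along_axis(np.convolve, 0, rows, k, mode="same")

# A single bright pixel (a point source) spreads over its neighbors:
# the "effective loss of resolution" described above.
img = np.zeros((9, 9))
img[4, 4] = 1.0
blurred = apply_psf(img)
```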
Image taken from "By Qutorial - Own work, CC BY-SA 4.0, https://commons.wikimedia.org/w/index.php?curid=49995993"

We can model most camera systems using the simple pinhole camera model.
(Camera schematics modeled after: Christian, J.A., "A Tutorial on Horizon-Based Optical Navigation and Attitude Determination with Space Imaging Systems," IEEE Access, Vol. 9, 2021, pp. 19819-19853.)
• As an example, consider a simple Gauss lens, consisting of an aperture stop, the lenses, and a focal plane.
• Collimated light arriving along the boresight direction is focused to a point on the focal plane.
• Collimated light arriving from another direction is focused to a different point on the focal plane.
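In its simplest form, the pinhole model collapses the whole optical train above into a single perspective division: each incoming direction maps to one point on the focal plane. A minimal sketch; the focal length, principal point, and frame convention (+z along the boresight) are illustrative assumptions, not a specific camera's calibration.

```python
def pinhole_project(point_cam, f_px, cx, cy):
    """Project a 3-D point, expressed in the camera frame with +z along
    the boresight, onto the image plane of an ideal pinhole camera.

    f_px is the focal length in pixel units; (cx, cy) is the principal
    point. Returns fractional pixel coordinates (u, v).
    """
    x, y, z = point_cam
    if z <= 0:
        raise ValueError("point is behind the camera")
    # Perspective division: all points along one line of sight land on
    # the same focal-plane location, just as the lens sketches show.
    u = cx + f_px * x / z
    v = cy + f_px * y / z
    return u, v

# Light along the boresight maps to the principal point...
on_axis = pinhole_project((0.0, 0.0, 10.0), f_px=1000.0, cx=512.0, cy=512.0)
# ...while light from another direction maps to a different point.
off_axis = pinhole_project((0.1, 0.0, 10.0), f_px=1000.0, cx=512.0, cy=512.0)
```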