Acquisition of Point-Sampled Geometry and Appearance
Hanspeter Pfister, MERL ([email protected])
Wojciech Matusik, MIT
Addy Ngan, MIT
Paul Beardsley, MERL
Remo Ziegler, MERL
Leonard McMillan, MIT

The Goal: To Capture Reality
• Fully automated 3D model creation of real objects.
• Faithful representation of the appearance of these objects.

Image-Based 3D Photography
• An image-based 3D scanning system.
• Handles fuzzy, refractive, and transparent objects.
• Robust and automatic.
• Point-sampled geometry based on the visual hull.
• Objects can be rendered in novel environments.

Previous Work
• Active and passive 3D scanners
  • Work best for diffuse materials.
  • Fuzzy, transparent, and refractive objects are difficult.
• BRDF estimation, inverse rendering
• Image-based modeling and rendering
• Reflectance fields [Debevec et al. 00]
  • Light Stage system to capture reflectance fields.
  • Fixed viewpoint, no geometry.
• Environment matting [Zongker et al. 99, Chuang et al. 00]
  • Captures reflections and refractions.
  • Fixed viewpoint, no geometry.

Outline
• Overview
➤ System
• Geometry
• Reflectance
• Rendering
• Results

The System
[Figure: acquisition setup with a light array, cameras, multi-color monitors, and a rotating platform.]

Outline
• Overview
• System
➤ Geometry
• Reflectance
• Rendering
• Results

Acquisition
• For each viewpoint (6 cameras × 72 positions):
  • Alpha mattes
    • Use multiple backgrounds [Smith and Blinn 96].
  • Reflectance images
    • Pictures of the object under different lighting (4 lights × 11 positions).
  • Environment mattes
    • Use techniques similar to [Chuang et al. 2000].

Geometry – Opacity Hull
• Visual hull augmented with view-dependent opacity.

Approximate Geometry
• The approximate visual hull is augmented by radiance data to render concavities, reflections, and transparency.

Geometry Example
[Figure: example of acquired point-sampled geometry.]

Surface Light Fields
• A surface light field is a function that assigns a color to each ray originating on a surface [Wood et al. 2000].

Shading Algorithm
• A view-dependent strategy.
• Unstructured Lumigraph Rendering [Buehler et al., SIGGRAPH 2001].
• View-Dependent Texture Mapping [Debevec, EGRW 98].

Color Blending
• Blend colors based on the angle between the virtual camera and the stored colors.

Point-Based Rendering
• Point-based rendering using the LDC tree, visibility splatting, and view-dependent shading.

Geometry – Opacity Hull
• Store the opacity of each observation at each point on the visual hull [Matusik et al., SIGGRAPH 2002].

Geometry – Opacity Hull
• Assign a view-dependent opacity to each ray originating on a point of the visual hull (illustrated by the sketch below).
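The following is a minimal sketch of this view-dependent opacity lookup, not the paper's exact implementation: it assumes each opacity-hull point stores the alpha values observed by the acquisition cameras together with the corresponding viewing directions, and blends the observations closest to a new viewing ray with simple cosine weights. The function name and weighting scheme are illustrative assumptions.

```python
import numpy as np

def view_dependent_opacity(obs_dirs, obs_alphas, view_dir, k=4):
    """Blend the stored opacity observations of one opacity-hull point
    for a new viewing direction.

    obs_dirs   -- (N, 3) unit vectors toward the cameras that observed the point
    obs_alphas -- (N,)   alpha values captured from those cameras
    view_dir   -- (3,)   unit vector toward the new (virtual) camera
    """
    # Angular proximity between the new view and every stored observation.
    cosines = obs_dirs @ view_dir              # values in [-1, 1]
    nearest = np.argsort(-cosines)[:k]         # indices of the k closest views

    # Weight the k closest observations by alignment with the new view;
    # clamp so back-facing observations contribute nothing.
    w = np.clip(cosines[nearest], 0.0, None)
    if w.sum() == 0.0:
        return 0.0                             # no usable observation
    w /= w.sum()
    return float(w @ obs_alphas[nearest])

# Example: a point that looks nearly opaque from the front (+z) and mostly
# transparent from the side; a view close to +z yields a high alpha.
dirs = np.array([[0.0, 0.0, 1.0], [0.0, 1.0, 0.0], [1.0, 0.0, 0.0]])
alphas = np.array([0.95, 0.20, 0.15])
v = np.array([0.0, 0.3, 0.95])
v /= np.linalg.norm(v)
print(view_dependent_opacity(dirs, alphas, v))   # ~0.77
```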
Example
[Figure: photograph, surface light field, visual hull, and opacity hull of the same object; the per-point opacity function over (θ, φ) is shown with red = invisible, white = opaque, black = transparent.]

Results
• Point-based rendering using EWA splatting, A-buffer blending, and edge antialiasing.

Opacity Hull – Discussion
• View-dependent opacity vs. geometry trade-off.
  • Similar to the radiance vs. geometry trade-off.
• Sometimes acquiring the geometry is not possible (e.g., the resolution of the acquisition device is not adequate).
• Sometimes representing the true geometry would be very inefficient (e.g., hair, trees).
• The opacity hull stores the "macro" effect.

Point-Based Models
• No need to establish topology or connectivity.
• No need for a consistent surface parameterization for texture mapping.
• Represent organic models (feathers, trees) much more readily than polygon models.
• Easy to represent view-dependent opacity and radiance per surface point.

Outline
• Overview
• Previous Work
• Geometry
➤ Reflectance
• Rendering
• Results

Light Transport Model
• Assume illumination originates from infinity.
• The light arriving at a camera pixel can be described as
  C(x, y) = ∫_Ω W(ω) E(ω) dω
  where C(x, y) is the pixel value, E the environment, and W the reflectance field.

Surface Reflectance Fields
• A 6D function: W(P, ω_i, ω_r) = W(u_r, v_r; θ_i, φ_i; θ_r, φ_r).

Reflectance Functions
• For each viewpoint, a 4D function: W_xy(ω_i) = W(x, y; θ_i, φ_i).

Reflectance Field Acquisition
• We separate the hemisphere into a high-resolution part Ω_h and a low-resolution part Ω_l [Matusik et al., EGRW 2002]:
  C(x, y) = ∫_{Ω_h} W_h(ξ) T(ξ) dξ + ∫_{Ω_l} W_l(ω_i) L(ω_i) dω_i

Acquisition
• For each viewpoint (6 cameras × 72 positions):
  • Alpha mattes – use multiple backgrounds [Smith and Blinn 96].
  • Reflectance images – pictures of the object under different lighting (4 lights × 11 positions).
  • Environment mattes – use techniques similar to [Chuang et al. 2000].

Low-Resolution Reflectance Field
• C(x, y) = ∫_{Ω_h} W_h(ξ) T(ξ) dξ + ∫_{Ω_l} W_l(ω_i) L(ω_i) dω_i
• W_l is sampled by taking pictures with one light turned on at a time [Debevec et al. 00].
• The low-resolution term is approximated as
  ∫_{Ω_l} W_l(ω_i) L(ω_i) dω_i ≈ Σ_{i=1}^{n} W_i L_i   for n lights
  (see the relighting sketch below).

Compression
• Subdivide images into 8 × 8 pixel blocks.
• Keep blocks containing the object (avg. compression 1:7).
• PCA compression (avg. compression 1:10).
[Figure: PCA represents the N kept blocks with coefficients a0–a5.]

High-Resolution Reflectance Field
• C(x, y) = ∫_{Ω_h} W_h(ξ) T(ξ) dξ + ∫_{Ω_l} W_l(ω_i) L(ω_i) dω_i
• Use the techniques of environment matting [Chuang et al., SIGGRAPH 00].
• Approximate W_h by a sum of up to two Gaussians:
  • a reflective Gaussian G_1,
  • a refractive Gaussian G_2,
  with W_h(ξ) = a_1 G_1 + a_2 G_2.

Surface Reflectance Fields
• Work without accurate geometry.
• Surface normals are not necessary.
• Capture more than reflectance:
  • Inter-reflections
  • Subsurface scattering
  • Refraction
  • Dispersion
  • Non-uniform material variations
• A simplified version of the BSSRDF [Debevec et al., 00].
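As a concrete illustration of the low-resolution term above, the sketch below relights one acquired viewpoint by a weighted sum of its per-light reflectance images, C(x, y) ≈ Σ_i W_i(x, y) L_i. It is a minimal sketch under the assumption that the new environment has already been downsampled to one RGB intensity per acquisition light; the function name and array layout are illustrative.

```python
import numpy as np

def relight_viewpoint(reflectance_images, light_intensities):
    """Relight one acquired viewpoint under a new environment using the
    low-resolution approximation  C(x, y) ~ sum_i W_i(x, y) * L_i.

    reflectance_images -- (n, H, W, 3) one image per acquisition light (W_i)
    light_intensities  -- (n, 3) RGB intensity of the new environment,
                          downsampled into the n original light directions (L_i)
    """
    # Weighted sum over the n lights; each light's RGB intensity scales its
    # reflectance image channel-wise.
    return np.einsum('nhwc,nc->hwc', reflectance_images, light_intensities)

# Tiny example: two lights, a single pixel.
W = np.array([[[[0.2, 0.2, 0.2]]],        # reflectance image for light 1
              [[[0.5, 0.1, 0.0]]]])       # reflectance image for light 2
L = np.array([[1.0, 1.0, 1.0],            # light 1 at full intensity
              [0.5, 0.5, 0.5]])           # light 2 at half intensity
print(relight_viewpoint(W, L))            # -> [[[0.45 0.25 0.2]]]
```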
Outline
• Overview
• Previous Work
• Geometry
• Reflectance
➤ Rendering
• Results

Rendering
• Input: opacity hull, reflectance data, new environment.
• Create radiance images from the environment and the low-resolution reflectance field.
• Reparameterize the environment mattes.
• Interpolate the data to the new viewpoint.

1st Step: Relighting Ω_l
• Compute a radiance image for each viewpoint.
[Figure: the new illumination is downsampled, multiplied with the reflectance images, and summed; the sum is the radiance image of this viewpoint in this environment.]

2nd Step: Reproject Ω_h
• Project the environment mattes onto the new environment.
• The acquired environment mattes were parameterized on the plane T (the plasma display).
• We need to project the Gaussians onto the new environment map, producing new Gaussians.

3rd Step: Interpolation
• From the new viewpoint, for each surface point, find the four nearest acquired viewpoints.
  • Store a visibility vector per surface point.
• Interpolate using unstructured lumigraph interpolation [Buehler et al., SIGGRAPH 01] or view-dependent texture mapping [Debevec 96] (see the weighting sketch at the end of this section):
  • Opacity.
  • Contribution from the low-resolution reflectance field (in the form of radiance images).
  • Contribution from the high-resolution reflectance field.

3rd Step: Interpolation
• For the low-resolution reflectance field, we interpolate the RGB color from the radiance images.
• For the high-resolution reflectance field:
  • Interpolate the direction of reflection/refraction.
  • Interpolate the other parameters of the Gaussians.
  • Convolve with the environment.
[Figure: reflected Gaussians G̃1r, G̃2r and refracted Gaussians G̃1t, G̃2t interpolated between viewpoints V1 and V2 around the surface normal N.]

Outline
• Overview
• Previous Work
• Geometry
• Reflectance
• Rendering
➤ Results

Results
• Performance for 6 × 72 = 432 viewpoints; 337,824 images taken in total.
• Acquisition (47 hours)
  • Alpha mattes – 1 hour
  • Environment mattes – 18 hours
  • Reflectance images – 28 hours
• Processing
  • Opacity hull ~ 30 minutes
  • PCA compression ~ 20 hours (MATLAB, unoptimized)
  • Rendering ~ 5 minutes per frame
• Size
  • Opacity hull ~ 30–50 MB
  • Environment mattes ~ 0.5–2 GB
  • Reflectance images ~ raw 370 GB / compressed 2–4 GB
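The viewpoint interpolation in step 3 blends data from the four nearest acquired views. The sketch below shows a simplified, angle-based variant of unstructured-lumigraph-style weighting, not the exact formulation of Buehler et al.: the k-th nearest view receives zero weight, so the blend changes continuously as the virtual camera moves. All names here are illustrative; the same weights can be applied to opacity, radiance-image colors, and Gaussian parameters alike.

```python
import numpy as np

def blending_weights(view_dir, acquired_dirs, k=4):
    """Angle-based blending weights over the k nearest acquired viewpoints.

    view_dir      -- (3,) unit vector from the surface point to the new camera
    acquired_dirs -- (M, 3) unit vectors from the point to the acquired cameras
    Returns (indices, weights); weights sum to 1 and the k-th nearest view
    gets weight 0, so the blend varies continuously as the camera moves.
    """
    # Angular error between the new view and every acquired view.
    ang = np.arccos(np.clip(acquired_dirs @ view_dir, -1.0, 1.0))
    idx = np.argsort(ang)[:k]                  # the k nearest acquired views
    penalty = ang[idx]
    cutoff = penalty[-1] + 1e-8                # largest penalty among the k
    # Small penalty -> large weight; the cutoff term drives the k-th to zero.
    w = (1.0 - penalty / cutoff) / (penalty + 1e-8)
    w /= w.sum()
    return idx, w

# Example: six acquired views on a circle, virtual camera in between two.
angles = np.linspace(0.0, np.pi, 6)
dirs = np.stack([np.cos(angles), np.sin(angles), np.zeros(6)], axis=1)
idx, w = blending_weights(np.array([np.cos(0.3), np.sin(0.3), 0.0]), dirs)
print(idx, np.round(w, 3))   # the two nearest views dominate
```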