Virtual Cinematography: Relighting Through Computation

Paul Debevec, USC Institute for Creative Technologies

Recording how scenes transform incident illumination into radiant light is an active topic in computational photography. Such techniques are making it possible to create virtual images of a person or place from new viewpoints and in any form of illumination.

The first photograph (Figure 1) sits in an oxygen-free protective case at the Harry Ransom Center at the University of Texas at Austin. Its image is faint enough that discerning the roofs of Joseph Nicéphore Niépce's farmhouses requires some effort.

That same effort, however, reveals a clear and amazing truth: Through the technology of photography, a pattern of light crossing a French windowsill in 1826 is re-emitted thousands of miles away nearly two centuries later. Not surprisingly, this ability to provide a view into a different time and place has come to revolutionize how we communicate information, tell stories, and document history.

Figure 1. The first photograph, Joseph Nicéphore Niépce, 1826. Source: http://palimpsest.stanford.edu/byorg/abbey/an/an26/an26-3/an26-307.html.

DIGITAL IMAGING

In digital imaging terminology, we would think of the first photograph as a 2D array of pixel values, each representing the amount of light arriving at the camera from a particular angle. If we index the pixel values based on the horizontal (θ) and vertical (φ) components of this angle, we can express the photograph as a 2D function, P(θ, φ).

Modern photographic techniques have made it possible to capture considerably more information about the light in a scene. Color photography records another dimension of information: the amount of incident light for a range of wavelengths, λ. We can thus describe a color image as a 3D function, P(θ, φ, λ). Motion pictures record how the incident light changes with time t, adding another dimension to our function, which becomes P(θ, φ, λ, t).

Most of the photographic imagery we view today (cinema and television) records and reproduces light across all of these dimensions, providing an even more compelling experience of scenes from other times and places.
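To make the notation concrete, the following minimal sketch (in Python with NumPy; the array layout, field of view, and function name are illustrative assumptions rather than anything from the article) treats an ordinary digitized color motion picture as a sampled version of P(θ, φ, λ, t), with pixel rows and columns standing in for the viewing angles and the color channels standing in for a few wavelength samples.

```python
import numpy as np

# Frames indexed as (t, phi, theta, lambda): time, vertical angle (rows),
# horizontal angle (columns), and a few wavelength samples (here just R, G, B).
# The shape and field of view below are arbitrary illustrative choices.
frames = np.zeros((48, 240, 320, 3), dtype=np.float32)

def sample_P(theta_deg, phi_deg, band, t_index, fov_h_deg=60.0, fov_v_deg=45.0):
    """Nearest-neighbor lookup into the sampled image function P(theta, phi, lambda, t)."""
    n_t, n_phi, n_theta, _ = frames.shape
    # Map viewing angles (measured from the optical axis) to pixel indices.
    col = int(round((theta_deg / fov_h_deg + 0.5) * (n_theta - 1)))
    row = int(round((phi_deg / fov_v_deg + 0.5) * (n_phi - 1)))
    col = int(np.clip(col, 0, n_theta - 1))
    row = int(np.clip(row, 0, n_phi - 1))
    return frames[t_index, row, col, band]

# Example query: the green response two seconds in, 10 degrees right of center.
value = sample_P(theta_deg=10.0, phi_deg=0.0, band=1, t_index=47)
```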
Many innovations in image recording technology that are receiving increased interest today provide ways of capturing P(θ, φ, λ, t) with greater resolution, range, and fidelity. For example, high dynamic range (HDR) imaging techniques increase the range of values that are recorded within P, accurately capturing the millions-to-one range of brightness from a dimly lit interior to staring into the bright sun. Panoramic and omnidirectional photography [1] increase the range of θ and φ that are recorded, in some cases capturing the entire sphere of incident light. Large-format photography systems such as Graham Flint's Gigapxl Project dramatically increase angular resolution, achieving tens of thousands of independent samples across θ and φ. Multispectral and hyperspectral photography, respectively, increase the resolution and range recorded in λ, and high-speed photography increases the resolution in t.

As we consider what the future of digital photography might bring, we might ask what other dimensions of information about a scene we could consider capturing. In 1991, E.H. Adelson and J.R. Bergen [2] presented a mathematical description of the totality of light within a scene as the seven-dimensional plenoptic function P:

P = P(x, y, z, θ, φ, λ, t)

The last four dimensions of this function are familiar, representing the azimuth, inclination, wavelength, and time of the incident ray of light in question, as Figure 2 shows. The additional three dimensions (x, y, z) denote the 3D position in space at which the incident light is being sensed. In the case of the first photograph, (x, y, z) would be fixed at the optical center of Niépce's rudimentary camera.

Tantalizingly, this simple function contains within it every photograph that could ever have possibly been taken. If it were possible to query the plenoptic function with the appropriate values of x, y, z, θ, φ, λ, and t, it would be possible to construct brilliant color images (or movies) of any event in history.

Figure 2. Geometry of the plenoptic function and the reflectance field.

VIRTUAL CONTROL OF THE VIEWPOINT

Recording variation in the spatial dimensions of the plenoptic function allows scenes to be recorded three-dimensionally. Stereo photography captures two image samples in (x, y, z) corresponding to the positions of a left and right camera. Since this matches the form of the imagery that our binocular visual system senses, it can reproduce the sensation of a 3D scene for a particular point of view. QuickTime VR panoramas allow virtual control of the field of view by panning and zooming within a panoramic image, exploring different ranges of θ and φ but staying at the same viewpoint (x, y, z). In 1995, Leonard McMillan and Gary Bishop presented a system for interpolating between panoramic images taken from different viewpoints to create the appearance of virtual navigation through a scene [3].

Light field photography uses an array of cameras to record a scene from a planar 2D array of viewpoints distributed across x and y for a fixed z [4]. If the cameras are clear of obstructions, we can infer values of the plenoptic function at other z positions by following these rays forward or backward in the direction (θ, φ) to where the light was sensed at the x-y plane [5].
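The ray-following step can be made concrete with a short sketch. The Python/NumPy code below (the grid layout, camera spacing, field of view, and helper names are assumptions for illustration, not the article's implementation) estimates the plenoptic function at a point off the capture plane by tracing the requested ray back to the z = 0 camera plane and returning the nearest recorded camera's nearest pixel in that direction.

```python
import numpy as np

# Cameras sit on a regular grid in the z = 0 plane, all facing +z.
# lightfield[iy, ix, row, col] is the RGB sample recorded by the camera at
# grid cell (ix, iy) for the ray whose vertical/horizontal angles map to
# (row, col). All sizes and the field of view are illustrative assumptions.
N_Y, N_X, N_ROWS, N_COLS = 9, 9, 120, 160
CAM_SPACING = 0.1          # meters between neighboring cameras
FOV_H, FOV_V = 60.0, 45.0  # degrees
lightfield = np.zeros((N_Y, N_X, N_ROWS, N_COLS, 3), dtype=np.float32)

def query_plenoptic(px, py, pz, theta_deg, phi_deg):
    """Estimate P(x, y, z, theta, phi) off the capture plane by following the
    ray back to z = 0 and looking up the nearest camera and pixel."""
    # Moving a distance pz backward along the ray shifts x by pz*tan(theta)
    # and y by pz*tan(phi); that is where the ray crossed the camera plane.
    x0 = px - pz * np.tan(np.radians(theta_deg))
    y0 = py - pz * np.tan(np.radians(phi_deg))
    # Nearest camera on the grid (grid centered on the origin).
    ix = int(round(x0 / CAM_SPACING + (N_X - 1) / 2))
    iy = int(round(y0 / CAM_SPACING + (N_Y - 1) / 2))
    # Nearest pixel within that camera's field of view.
    col = int(round((theta_deg / FOV_H + 0.5) * (N_COLS - 1)))
    row = int(round((phi_deg / FOV_V + 0.5) * (N_ROWS - 1)))
    if not (0 <= ix < N_X and 0 <= iy < N_Y and 0 <= row < N_ROWS and 0 <= col < N_COLS):
        return None  # this ray was never sampled by the camera array
    return lightfield[iy, ix, row, col]

# Example: the radiance arriving 0.5 m in front of the array, looking 5 degrees right.
rgb = query_plenoptic(px=0.0, py=0.0, pz=0.5, theta_deg=5.0, phi_deg=0.0)
```

A denser camera array, or interpolation among the nearest few cameras instead of a nearest-neighbor lookup, reduces the error of this reconstruction; that trade-off between sampling density and interpolation quality is central to light field rendering.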
By recording such data, it becomes possible to virtually construct any desired viewpoint of a scene after it has been filmed. As such techniques mature, and if display technology makes similar advances [6], future observers may be able to enjoy 3D interactive views of sports matches, family gatherings, and tours of the great sites of the world.

This rich line of photographic advances, each recording a greater dimensionality of the light within a scene, has progressively brought a greater sense of "being there" in a scene, or that the photographed subject is in some way present within our own environment. Nonetheless, a photographic image, even a 3D, omnidirectional, high-resolution motion picture, shows a scene only the way it was at a particular time, the way things happened, in the illumination that happened to be there.

If it were somehow possible to record the scene itself, rather than just the light it happened to reflect, there would be a potential to create an even more complete and compelling record of people, places, and events. In this article, we focus on a particular part of this problem: Is there a way to record a subject or scene so that we can later produce images not just from every possible viewpoint, but in any possible illumination?

VIRTUAL CONTROL OF ILLUMINATION

Recording a scene so that its illumination can be created later has potential applications in a variety of disciplines. With virtual relighting, an architect could record a building to visualize it under a proposed nighttime lighting design, or capture a living room's response to light to see how its appearance would change if an apartment complex were built across the street. An archaeologist could compare digital records of artifacts as if they were sitting next to each other in the same illumination, a process difficult to perform with standard photographs.

Some of the most interesting virtual relighting applications are in the area of filmmaking. A small crew could capture a real-world location to use it as a virtual film set, and the production's cinematographer could light the set virtually. Then, actors filmed in a studio could be composited into the virtual set, their lighting virtually created to make them appear to reflect the light of the environment and further shaped by the cinematographer for dramatic effect.

Virtually changing the lighting in a scene is a more complex problem than virtually changing the viewpoint. Even recording the entire 7D plenoptic function for a scene reveals only how to construct images of the scene in its original illumination. If we consider a living room with three lamps, turning on any one of the lamps will give rise to a different plenoptic function with remarkably different highlights, shading, shadows, and reflections within the room.

Because light is additive, however, it is possible to simulate arbitrary lighting conditions in a scene, since we only need to capture the scene in a set of basis lighting conditions that span the space of the lighting conditions of interest. In 1995, Julie Dorsey and colleagues suggested this technique in the context of computer-rendered imagery to efficiently visualize potential theatrical lighting designs [8].

We should now ask which set of basis lighting conditions will allow us to simulate any possible pattern of illumination in the scene. Since we would like to place lights at any (x, y, z) position, we might suppose that our basis set of recorded plenoptic functions should consist of a set of images of the scene, each with an omnidirectional light source (such as a tiny lightbulb) at a different (x, y, z) position, densely covering the space. However, this lighting basis would not allow us to simulate how a directed spotlight would illuminate the scene, since no summation of omnidirectional light source images can produce a pattern of light aimed in a particular direction.

Thus, our basis must include variance with respect to lighting direction as well, and we should choose our basis light sources to be single rays of light, originating at (x, y, z) and emitting in direction (θ, φ).
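The additivity that makes a lighting basis work is easy to demonstrate. The sketch below (Python/NumPy; the number of basis conditions, the array shapes, and the function name are illustrative assumptions, and a single scalar weight per light is used for simplicity) relights a scene as a weighted sum of images captured under individual basis lighting conditions.

```python
import numpy as np

# basis_images[i] is an image of the scene lit by the i-th basis lighting
# condition alone (for example, one small light source switched on at a time).
# Pixel values must be linear (proportional to scene radiance) so that images
# add the same way light does; the shapes here are illustrative assumptions.
num_lights, height, width = 30, 480, 640
basis_images = np.zeros((num_lights, height, width, 3), dtype=np.float32)

def relight(weights):
    """Simulate a new lighting condition as a weighted sum of the basis images.

    Because light transport is linear, the image of the scene under any
    combination of the basis lights is the same combination of their
    individual images; weights[i] scales the intensity of the i-th light.
    """
    weights = np.asarray(weights, dtype=np.float32)      # shape (num_lights,)
    return np.tensordot(weights, basis_images, axes=1)   # shape (height, width, 3)

# Example: dim the first basis light to half intensity and turn the rest off.
weights = np.zeros(num_lights, dtype=np.float32)
weights[0] = 0.5
relit = relight(weights)
```

Colored lighting fits the same framework by giving each basis light a per-channel weight; the choice of basis discussed above (single rays of light rather than omnidirectional point sources) determines which illumination patterns such sums can reproduce.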
