THE NEW FACE OF HOLLYWOOD

SONY PICTURES IMAGEWORKS IS LEADING THE WAY IN CREATING THE WORLD’S MOST REALISTIC DIGITAL HUMANS—PUSHING THE LIMITS OF WHAT’S POSSIBLE IN MOVIES, GAMES, AND COMPUTER INTERFACES.

BY GREGORY T. HUANG
Technology Review, September 2004

[Photo caption: A computer-generated Peter Parker clings to a speeding train in Spider-Man 2.]

Atop an elevated train barreling through downtown, the masked man in the red and blue suit is in trouble. Not only is he fighting a lunatic scientist who’s trying to kill him with robotic tentacles, but he also needs to save the passengers on the train. It’s all in a day’s work for superhero Peter Parker, also known as Spider-Man—but it means months of work for an elite team of graphics gurus.

This action sequence from the megahit Spider-Man 2 has dazzled millions of moviegoers this summer. Look closely at the villain, Doctor Octopus (played by actor Alfred Molina), and you’ll see him grin maniacally and yell while ambulating along the top and sides of the train. Later in the scene, Spider-Man (Tobey Maguire) loses his mask as he braces himself against the front of the train, bringing it to a screeching halt. His frantic facial expressions appear convincingly natural.

With all the swift cross-cutting between images of superhero and villain, the audience probably does not suspect that the faces and figures appearing on the screen much of the time are not the real thing. Rather, they are digital concoctions created inside a computer at Sony Pictures Imageworks in Culver City, CA. “We’ve reached a point where we can make every single thing computer-generated. Everything from the shattered glass in the train to all the buildings, tracks, and people,” says Mark Sagar, a graphics supervisor on Spider-Man 2 and Imageworks’ resident expert on digital human faces. He points to the train sequence as it plays on his computer screen. “Watch what the camera does. Here it’s going fast along the train, then underneath the performers, then in front of them, then a wide shot. How could you do that with a real camera?” But the digital faces on the people are the last, crucial piece of this puzzle. In the past, splicing footage of real actors into a digital scene required real cameras, difficult stunt work, and tweaking to get the look of the real and digital images to match; the ability to do computer-generated everything, including human faces, opens a wealth of creative possibilities.

Photorealistic digital faces—ones that can pass for real in still pictures or on the big screen—are among the last frontiers of computer graphics. Until recently, digital faces have looked fake when examined closely and have therefore been relegated to quick cuts and background shots. The problem is that we’re extraordinarily sensitive to how human faces should look; it’s much easier to fool people with a computer-generated T. rex than with a digital human. But advances in rendering skin, lighting digital scenes, and analyzing footage of real actors for reference are now allowing artists and programmers to control the texture and movement of every tiny patch of pixels in a computerized face. The Sony team hopes that audiences will be unable to tell the difference between Tobey Maguire and his digital double—perhaps the first time such verisimilitude has been achieved.

The stakes are huge. Digital effects are a billion-dollar business and growing fast; these days, a typical blockbuster film’s budget can be $150 million, with half of that going to effects companies. Indeed, Spider-Man 2 is just one example of Hollywood’s increasing use of cutting-edge graphics research to create better digital actors, from stunt doubles of Neo and the multitude of Agent Smiths in the Matrix films to Gollum in the Lord of the Rings series. It’s reached the point where the industry jokes about replacing actors with computers (the premise of the 2002 film S1m0ne). And Sony Pictures Imageworks, founded in 1992 and with more than 40 feature films to its credit, is in the vanguard of effects houses that vie for big studios’ business (see “Making Faces,” p. 73).

But the real benefit of digital actors isn’t replacing live ones: it’s delivering scenes that take viewers to places no real actor, or camera setup, could go. “It’s giving directors more flexibility, and it allows them to do actions they can’t do with real stunt people,” says Scott Stokdyk, the visual-effects supervisor at Imageworks in charge of the Spider-Man series. “In the past, directors and editors have basically massaged the cut around different quick actions and camera angles to convey a story,” Stokdyk says. “Now they don’t have those kinds of limits.” Thus liberated, directors can follow synthetic actors as they swoop around skyscrapers and dodge bullets in sweeping slow motion. What’s more, actors can be digitally aged, or de-aged, without having to spend hours in makeup. Long-deceased movie stars could even be digitally resurrected.

And movies are just the beginning. Techniques for creating digital humans are pushing the broader frontier of computer graphics and interfaces. These efforts could enable strikingly realistic medical training simulations, lifelike avatars for e-mail and Internet chat rooms, and soon, much more compelling characters in games and interactive films. The technology, Sagar says, “is absolutely ready for prime time.”

VIRTUAL VISAGE

Mark Sagar has always been torn between art and science. After college, he spent three years traveling the world, sketching portraits for a living. But the tug of technology made him return to graduate school in his native New Zealand to study engineering. “I never thought I’d spend years of my life studying the human face,” he admits, sitting in his office at Imageworks, surrounded by books and papers on visual perception.

Hearing Sagar describe the human face as “a multichannel signaling device” suggests that the science and engineering side of him has won out. Understanding the science behind faces, he says, enables him to make a digital character’s message come through more effectively on the screen. Expressions like downcast eyes, a furrowed brow, or a curled lip signify a person’s emotional state and give clues to his or her intent.

[Photo caption: Movies on his mind: Mark Sagar is resident expert on digital human faces at Sony Pictures Imageworks. Photo by Joe Toreno.]

Sagar’s path to Hollywood opened almost by accident. In the mid-1990s, as a graduate student at the University of Auckland and as a postdoctoral fellow at MIT, he developed computer simulations of the human eye and face that could help doctors-in-training learn surgical techniques. His simulations looked so real that a team of dot-com entrepreneurs convinced him to cofound a graphics startup called LifeFX in Newton, MA. Its mission: commercialize software that everyone from filmmakers to Web businesses and e-mail providers could use to produce photorealistic likenesses of people. Sagar soon became a leading authority on digital faces for entertainment. In 1999, he came to Los Angeles to work on computer-generated face animations for films, including one of actor Jim Carrey.

[Pull quote: DIGITAL ACTORS CAN NOW ZOOM AROUND SKYSCRAPERS, OVER TRAINS, AND UNDERWATER, EMOTING ALL THE WHILE.]

Paul Debevec, a graphics researcher who made his name creating virtual environments and advancing digital lighting techniques, saw Sagar’s films at a conference and was intrigued: he had never seen faux faces that looked so convincing up close. “That was the moment that made me cross the threshold of truly believing that a photoreal computer-graphics face would happen in the next five years,” says Debevec, who is now at the Institute for Creative Technologies at the University of Southern California (see “Hollywood’s Master of Light,” TR March 2004).

The two scientists struck up a collaboration, using Debevec’s lighting techniques to render Sagar’s digital faces—a combination that quickly catapulted them to the forefront of the field. It turns out that if you’re trying to simulate a face, getting the lighting right is a big deal. Unlike previous computer simulations that looked odd in different contexts and had to be adjusted by trial and error, Sagar and Debevec’s faces could be tailored to match the lighting in any scene. That’s because they were built using a rich database of real faces photographed from different angles and illuminated by many different combinations of light. When LifeFX folded in 2002, Imageworks snatched up Sagar specifically for his expertise in faces.

He immediately began working on the first feature-film test of these techniques: Spider-Man 2. The action scenes in the film required detailed and expressive simulations of the faces of well-known actors—a particularly tough problem, says Sagar. Not only are audiences quick to reject ersatz human faces in general, but they are particularly sensitive to faces they recognize; any discrepancy between digital and real could be perceived as fake. To make the simulations work, the researchers needed lots of reference footage of the real actors under different lighting conditions. So Maguire and Molina each spent a day in Debevec’s lab. Supervised by research programmer Tim Hawkins, they sat in a special apparatus called a “light stage” while four still cameras captured hundreds of images of their heads and faces making a variety of expressions and illuminated by strobes from every possible angle.

[Photo caption, “Face Value”: For this shot, Doc Ock’s hair, eyes, skin, and subtly changing expression were rendered on a computer, using reference photos of ...]
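The light-stage captures described above are the raw material for image-based relighting, which works because light adds linearly: a face photographed once per strobe direction can be re-lit for any new environment by weighting each photo by that environment's brightness in the corresponding direction and summing. Below is a minimal Python/NumPy sketch of that weighted sum; the relight function, array shapes, and sample sizes are illustrative assumptions, not Imageworks' actual pipeline.

    import numpy as np

    def relight(basis_images, light_colors):
        """Combine one-light-at-a-time photos into a newly lit image.

        basis_images: (N, H, W, 3) array; image n shows the face lit by
            strobe n alone.
        light_colors: (N, 3) array; RGB intensity of the target lighting
            environment sampled in strobe n's direction.
        Returns an (H, W, 3) image of the face under the new lighting.
        """
        # Linearity of light: the relit image is a weighted sum of the
        # basis photos, weighted per color channel by the environment.
        return np.einsum("nc,nhwc->hwc", light_colors, basis_images)

    # Toy usage with random stand-in data (only the shapes are meaningful).
    rng = np.random.default_rng(0)
    basis = rng.random((200, 64, 64, 3))  # 200 strobe directions (illustrative)
    env = rng.random((200, 3))            # one RGB weight per strobe direction
    print(relight(basis, env).shape)      # (64, 64, 3)

The same sum is what lets the faces "be tailored to match the lighting in any scene": swapping in a different set of environment weights re-lights the captured face without photographing the actor again.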