Digital Impact
History of film
Pre-visit Activity 6: 'The move to digital'

Read the extract below from 'Film's not dead, damn it!', a 2003 article by Stephanie Zacharek. Fill out the graphic organiser on the benefits and disadvantages of digital technology for the film industry. Add your own ideas as well: http://www.salon.com/2003/07/03/cinematographers/

Meanwhile, film — the medium in which cinematographers have been working for some 100 years, a medium that in its relatively short history has given most of us more joy and pleasure than we can possibly measure — is dead. Or at least, people like George Lucas would have us think so.

Every now and then, a major news outlet will run a feature sounding a tinny but trumped-up death knell for film as we know it. Last summer, Los Angeles Times staff writers P.J. Huffstutter and Jon Healey jumped on the bandwagon, detailing the way Lucas had tried to convince his colleagues of the supremacy of digital technology by holding that powwow in his private screening room. He showed them two identical clips from "Monsters, Inc.," one completely electronic (in other words, stored on digital tape and run through a digital projector), the other on a reel of film that had already done four weeks in a local multiplex. The electronic clip, the story noted, "looked less like a motion picture and more like an open window onto a real world." Compare that with the jiggly, scratched-up image that limped onto the screen via the poor, pathetic step cousin known as film. Lucas had gathered his colleagues, ostensibly, to issue a warning: It was time to leave film behind, or get left in the dust.
Lucas, after all, had broken some ground of his own with "Star Wars: Episode II — Attack of the Clones," which was shot entirely with high-definition digital cameras — that is, a new breed of cameras that record images on videotape instead of 35-millimeter film, but with crisper detail and a wider range of color than video cameras have traditionally been able to capture. "Attack of the Clones" was also edited with digital equipment and, in the relatively few theaters equipped to do so, projected digitally.

So because he'd been able to make a stiff, crummy-looking, overblown faux-epic on a new plaything, Lucas felt completely justified in foretelling the death of film. The L.A. Times article played right into his phony argument, in language that sounds borrowed from that most filmic of news sources, a World War II newsreel: "Lucas' blunt message stands at the center of a schism in Hollywood over the fate of film in the film business. New high-definition video cameras and digital editing equipment challenge the longtime supremacy of film. They are cheaper and more flexible. But they also frighten directors and cinematographers who understand every nuance of film. A creative misstep can tarnish a career, so many of those established in the film industry blanch at the thought of showing their inexperience with the latest technology. A colossal mistake, seen by millions of fans, might reveal that they are passé storytellers — easily replaced with younger, cheaper and more tech-savvy rivals."

Aside from a quote or two from cinematographers Roger Deakins ("With digital, it's all very businesslike. We're not businessmen. We're artists and magicians") and Emmanuel Lubezki, who shot portions of Michael Mann's "Ali" using a high-definition camera ("This is different from film. Not better or worse but different"), cinematographers were woefully underrepresented in the piece.
Considering these would be the guys who'd understand better than anyone the potential advantages, or lack thereof, of digital video, it undoubtedly seemed more convenient to not even bother to ask them.

One of the chief problems with the Times article — and with Lucas' argument in general — is that it makes no distinction between the various uses of digital technology. As cinematographer Steven Poster explains it, "One of the things journalists and the public are confused about is that when you use the term 'digital cinema,' you lump it into one kind of thing, but it's really three things: It's image acquisition, it's postproduction and it's exhibition."

Digital applications are currently most widely used in postproduction, the steps taken at the end of the moviemaking process before the definitive print — called the answer print — is struck. That's the stage at which cinematographers color-correct the film, which generally means sitting down with a lab technician and making sure every frame looks the way it's supposed to. "Stuart Little 2," for example, included lots of special effects that had to be added during postproduction. The entire film — even those portions of it that didn't feature special effects — was digitized, and Poster used some new digital tools to complete the color correction before transferring the whole thing back to film. "We edit digitally, we do visual effects digitally, and now we're starting to finish the film digitally," Poster says. "The tools are tremendous, and it's just developing into something that's going to become ubiquitous within the next year or two. Finishing a film digitally will be the norm, not the exception." That's a case of the technology being used to make the process more efficient, but it also, of course, works in the service of maintaining the visual integrity of a film.

Poster is less enthusiastic about digital technology as it has so far been applied in terms of exhibition — that is, projection.
As moviegoers, we've all seen our share of dingy prints at the multiplex: By the time a picture has been shown five or six times a day over a period of several weeks, any print is going to show some wear and tear. No D.P. likes to see that happen to his or her movie. But there are still too many variables involved in digital projection to make it an immediately viable solution, Poster says, no matter how "clean" Lucas' digitized "Monsters" may have looked.

For one thing, there's no worldwide standard for digital exhibition of movies. "Film is a worldwide standard," Poster explains. "You can send a 35-millimeter film to Bangladesh and get it shown." But various standards are competing in the digital realm, with no single version in a dominant position.

Also, the cost of equipping a theater to project movies digitally is still prohibitive. Poster puts it at around $150,000 per screen, and because the technology is changing so rapidly, the equipment could become obsolete in as few as five years. (Whereas a regular old motion picture projector costs around $30,000 and might last 20 years.) And how will people be trained to maintain digital-projection equipment and play digital movies so they look as good as they should, when most of the big movie chains have done away with most of their union projectionists? "It's a much more complex technology than we're ready to deploy," Poster says.

On a more fundamental level, Poster also says we don't really know how images shown on film, as opposed to those captured or projected digitally, affect audiences on a subconscious level. He wonders if maybe there isn't "a perceptual quality to motion pictures that exists maybe because of the flaws of motion picture film." The very slight "jiggle and weave" of film, as opposed to the much-touted steadiness of digital images, may have something to do with why we respond to movies as we do. "There's the granularity of film, which changes on every frame.
There are all these perceptual components, which create a hot medium for the audience. It engages the audience in a way."

The point isn't that Poster and his colleagues are resistant to digital technology. They want to make sure any new technology they adopt is better than what they've already got, in subtle ways as well as obvious ones. Today's cinematographers see what's coming in terms of technology and equipment. When it's good enough for them to use, they say they'll be ready. We're still in the infancy of "high-definition technology" in the movie business, Poster explains. "But high-definition technology, which has been said to be the death of film, the be-all and end-all, is in a rudimentary form that is rapidly changing. So this technology that was supposed to replace film is, within the next year, going to be the old technology."

It's crucial to note, though, that the term "digital technology" doesn't mean much all by itself. Cinematographer Wally Pfister's credits include "Memento" and "Laurel Canyon," but because he began his career as a news cameraman in the early '80s, he knew how to shoot on videotape long before he ever shot a frame of 35-millimeter film. As much as he loves working with film, he's convinced that within 15 to 20 years, electronic media will replace it. But for now, he says the quality of digital images is nowhere near that of images recorded on film, in terms of resolution, richness or subtlety. Pfister believes that large electronics corporations are using the term "digital" to sell the idea of something revolutionary and hot, even though this "new" technology, at least at this point, is no improvement on the old one. Sony and Panasonic both manufacture high-definition cameras, and have a stake in getting their products used and accepted (not to mention plugged by Lucas), whether they produce satisfactory results or not.