RECESS  |  CULTURE

An ode to 35mm film

Quentin Tarantino (above) is one of a few active directors who remain loyal to 35mm film despite the rise of digital production.

My friends roll their eyes in mock disdain whenever I ask them if they’d like to watch a “film” with me. They plead with me—just call them “movies” for once, you pretentious nightmare—but I’m unyielding. Films earned their moniker from the very 35mm film stock that breathed life into moving image, and it’s only proper to recognize the much-fought-for medium with its appropriate name. As the millennium inches forward and we gravitate away from conventional modes of filmmaking, film cinematography’s form is quickly vanishing, threatening to become obsolete with the click of every new digital camera’s shutter. As I see it, films should be just that—film. So, here I sing the praises of 35mm film and plead its case for continued longevity.

In Hollywood, celluloid is all but dead—you’d be hard-pressed to find a major studio that still distributes film prints on a wide scale, and most movie theaters are outfitted with high-resolution digital projectors. That, however, was not always the case: cinema, from its conception, has been intertwined with film—light-sensitive celluloid strips measuring 35 millimeters wide, to be exact. It was the very creation of film that gave rise to filmmaking and the moving image. Plausibly, no relationship so intimate had previously existed within art. Film, then, became the basis of cinema for the duration of the 1900s, moving from monochrome to color, expanding to include IMAX and 70mm options as the years pressed on.

The collapse of film’s reign came around with the 2000s, and its downfall was twofold: filmmakers began to utilize digital cameras and movie theaters began to purchase digital projectors. Digital cinematography’s pioneers were few but forceful—most notably, George Lucas paved the way with 2002’s “Attack of the Clones,” which was the first major studio film to be shot on digital, rather than film, cameras. The trend caught on, much to the dismay of purists and film theorists, and cinema’s landscape has changed irrevocably since. It’s not hard to understand why, either—digital cameras are much cheaper than film cameras, much easier to handle and provide instant gratification (within seconds, you can view everything you’ve just filmed).

Perhaps the average filmgoer can’t tell the difference between a movie shot on film and a movie shot digitally—this was the argument made by the likes of Lucas—but the medium provides an unparalleled experience in the realms of both filmmaking and film-viewing. Film demands you be deliberate, thoughtful and light-handed—each frame can only be exposed once, and the slightest imbalance will ruin a shot. Thus, if you want to be as economical as possible, each take requires immense attention to detail beforehand. For some filmmakers, this is less of a constraint than it seems; Stanley Kubrick’s shooting ratio for “The Shining” was reportedly 102:1, meaning only about one of every 102 feet of film shot made it into the finished movie.

But for others, this constraint encourages purpose and intention when directing and composing a scene. It wouldn’t be until the next day, as the filmmaker viewed the dailies, that they’d see the work they had created. The allure of digital cinematography is that you can film a scene, play it back instantly and make the necessary adjustments—but that allure invites heavy-handedness and rapid-fire changes, blurring the filmmaker’s original intent. I’ll never forget the anxiety I felt as I shot 16mm film for a class project, the Bolex camera heavy in my hands, my storyboard meticulously pored over. I had one chance to expose the 100 feet of film, and every frame mattered. When I viewed the film after it had been developed, I was struck by how difficult and how incredibly rewarding the experience had been.

Then there’s the inexhaustible loveliness of film and celluloid itself. Film negatives, stored at the right temperature, can be preserved for centuries; their images can be reproduced innumerable times. They will never be deleted or hacked or become non-transferable—the largest disadvantage of digital filmmaking. (This is why many studios opt to print digitally shot movies on film, an act of ultimate preservation.) Film provides rich colors and tones and grain that can never be fully recreated digitally. It can be manipulated in ways that hard drives cannot—every time 35mm film is run through a projector it collects blemishes and tears, a singular history carved into its very cells, never the same film shown twice. That is, for lack of a better word, beautiful.

In 2012, Kodak, the largest manufacturer of 35mm film, filed for bankruptcy; in 2013, Paramount became the first major studio to distribute movies to theaters solely in digital form, permanently ending its film-print distribution. And while many filmmakers have pushed back against the ubiquity of digital filmmaking—Quentin Tarantino, Martin Scorsese, Paul Thomas Anderson and Christopher Nolan are film’s most visible proponents—the death of 35mm film seems nigh. It’s a shame, in the most earnest usage of the word, for I fully agree with Tarantino, who derided digital filmmaking and projection as mere “TV in public.” As technology continues to evolve and the universality of digital cameras expands, film is a necessary reminder that some old-fashioned methods aren’t as obsolete as they’re believed to be.
