
Sundance 2018: 'The Cleaners' looks at the dark corners of social media


Moritz Riesewieck and Hans Block's documentary "The Cleaners" follows the content moderators responsible for viewing and removing images flagged as violating the community standards of websites like Facebook, Twitter and YouTube.

Social media is perhaps the most pervasive technology in our world today, especially among college-aged people. Nearly everyone I know has some sort of social media profile: Facebook for keeping up with campus events; Instagram for showcasing your best, most attractive achievements; Snapchat for communicating with your real friends; maybe Tinder for passing the time. We spend so much of our days looking at our phones and computer screens, engaged in a virtual world composed primarily of images — images that we take for granted as benign and consumable.

We go online and expect that we’ll see 10-second makeup tutorials and brownie recipes, that we won’t see images of violence and destruction. Of course, we know that sensitive images violate the community standards of most of the popular social media sites, but surely people still upload them. What happens to these images when they are posted? Who sees them? How are they removed? Who controls what we see online, and to what extent does this censorship protect or control us?

Hans Block and Moritz Riesewieck grapple with these questions in their debut documentary, “The Cleaners.” The film, which premiered at the Sundance Film Festival on Jan. 19, follows several Cleaners, or “Content Moderators”: people employed by third-party companies to remove images and videos that have been flagged as violating the community standards of social media sites like Facebook, Twitter and YouTube. As Cleaners, they sift through approximately 25,000 images and videos each day, deciding which ones do, in fact, violate a site’s terms and which can be ignored. Each Cleaner is assigned a topic, ranging from terrorism to self-harm and suicide to sexual content. They are told that “cleaning up the dirt” and protecting the user should be their primary goals, but many find that the job takes a psychological toll.

One Cleaner said, “I’m a different person from what I was before. It’s like a virus slowly penetrating my brain.”

Another likened being a Cleaner to slavery — these Content Moderators sacrifice their own well-being so that social media empires can continue to thrive.

Day in and day out, the Cleaners are subjected to images more horrifying than most of us can imagine. One Cleaner estimates that, in her role as a Terrorism Content Moderator, she has seen hundreds of beheadings. She has become so familiar with these images and videos that she can identify the instrument used in an attack: a clean edge on the skin of the severed head indicates a sharp blade; more jagged edges, a blade as dull as a butter knife.

Block and Riesewieck shed light on an unethical practice in which these psychologically damaging positions are outsourced to people in the developing world. Because the Cleaning companies are more loosely regulated in these countries, they often offer no psychological support to their employees. Bound by non-disclosure agreements that prevent them from speaking about what they see, Cleaners are left to process their trauma on their own.

The political implications that arise when companies control our access to information compound the unethical treatment of Content Moderators. “The Cleaners” focuses briefly on Illma Gore and her nude painting of Donald Trump, an image that went viral in February 2016. Gore’s piece, a clearly politically motivated one, was swiftly removed from all social media sites after being flagged as violating community standards. Which standards did it violate? The image was removed for nudity (which Facebook permits in fine art) and for defaming Donald Trump’s character. Gore was left with two options: respond with her name and address to an anonymous copyright filing or delete her Facebook account.

In other instances, social media companies and their Cleaners effectively assist foreign governments in keeping information out of their countries. Rather than have their sites blocked in a country entirely, companies like YouTube and Twitter block users in countries like Turkey from viewing content their governments have deemed unfit.

One opponent of content moderation said, “We might lose democracy because we’re willing to give it up.” 

“The Cleaners” provides such a comprehensive, nuanced picture of content moderation and the modern person’s relationship to social media that there is too much to unpack in an 800-word article. The relevance and significance of Block and Riesewieck’s film lie in the questions it asks of social media companies, of Cleaning companies and of our own social media habits: How much control are we willing to surrender in the name of safety? How do we understand and differentiate between the documentation of civil rights violations and reverence for them? What is the value of being disturbed by an image? “The Cleaners” asks us to consider how we interact with social media as a technology, biased in its very design.
