A woman sits in a dark room with her face half hidden. Staring blankly at a computer screen, she says: “I’ve seen hundreds of beheadings”.
In an effort to elaborate, she starts pointing to a blurred image on the screen. There are “good” and “bad” beheadings, she explains.
“Sometimes they are lucky that it’s just a very sharp blade that’s being used. The worst scenario is the little knife, similar to a kitchen knife, which is not that sharp. Sometimes it will last for one minute before the head is cut.”
The woman isn’t identified but we’re told that she is one of the tens of thousands of Filipino workers whose job it is to filter out vile content from social media platforms like Facebook and YouTube.
The scene is not from a dystopian Hollywood film but from a documentary called The Cleaners, released last year, which gives a chilling insight into the previously hidden world of “digital sweatshops” — data hubs in which so-called content moderators sit at banks of computers and remove objectionable posts from the internet.
The documentary, by German filmmakers Moritz Riesewieck and Hans Block, is part of a growing movement to lift the lid on the gritty work of content moderation, in which low-paid workers are forced to watch and censor the worst content on the internet. It follows several anonymous workers who say they are subjected to image after image of confronting content, including murder, suicide, beheadings, child sexual abuse and terrorism, with as little as five seconds to determine whether an image or video breaches a platform’s guidelines.
Social media moderators are just one component of the booming outsourcing industry in the Philippines. More than 1.3 million people go to work every day moderating forums, writing captions and entering data for some of the world’s biggest companies. They do so in the unglamorous offices of third-party contractors in downtown Manila, far from the lush, branded campuses of tech companies in Silicon Valley.
INQ spoke to one Filipino worker who has worked as a social media moderator for various foreign companies, including an Australian one. She did not want to identify herself for fear of losing her job. While she counts herself as one of the lucky ones to be moderating mild content, she says she is angry her country has become a dumping ground for the world’s digital waste.
“It’s just not right,” she told INQ. “My fellow countrymen are viewing such gruelling content every single day to make these platforms safer for other people.”
Labour rights activists in the Philippines told INQ they are increasingly concerned about reports of extreme working conditions for social media moderators.
“These are images that are full of violence, full of a lot of hatred and psychologically it does something to a person. Their mental health is affected,” says Lisa Garcia, executive director of advocacy group Foundation for Media Alternatives.
They say the industry is ripe for exploitation given the secrecy in which third-party contractors operate. Workers are often made to sign confidentiality agreements that prevent them from discussing the work they do for prestigious Silicon Valley clients, whose own in-house employees are far more handsomely compensated.
“You’re not supposed to tell people you’re working in this job,” Garcia said. “These big tech companies are giving such big salaries to their in-house moderators, but those working for third party contractors are not getting that much.”
Garcia says US and Australian companies need to take responsibility for their contracted workforce in the Philippines. “Of course these companies are here to make money. But since they are making plenty of money already they must also ensure the well-being of the people working here. The issue of mental health is very important, and that’s something they need to invest in,” she says.
It’s no surprise that companies like to outsource this work. What has been dubbed “the worst job in technology” by The Wall Street Journal has already attracted lawsuits in the US, where a small number of contractors operate. One lawsuit alleges Facebook violated California law by failing to provide thousands of content moderators with a safe workplace. It claims the moderators were exposed to murders, suicides and beheadings that were livestreamed on Facebook, and have suffered psychological trauma and symptoms of post-traumatic stress disorder as a result.
Another anonymous Facebook content moderator told The Guardian: “You’d go into work at 9am every morning, turn on your computer and watch someone have their head cut off. Every day, every minute, that’s what you see. Heads being cut off.”
In June, The Verge released a series of interviews with people who had worked for Cognizant in the US. Cognizant is one of the biggest third-party contractors for Facebook. In one interview, moderator Shawn Speagle said he had been traumatised by the first video he saw in the job — a group of teenagers smashing an iguana on the ground. “They beat the living shit out of this thing,” he told The Verge. “The iguana was screaming and crying. And they didn’t stop until the thing was a bloody pulp.”
Mark Zirnsak, a senior social justice advocate at the Uniting Church Synod of Victoria and Tasmania, says tech companies are trying to moderate content “on the cheap”, and are “exploiting and harming the content managers in the process”.
He also holds grave fears that evidence of “serious human rights violations, such as the rape, torture and sexual abuse of children” was potentially being deleted from social media platforms without being reported to the police.
As it stands now, the law gives social media platforms, and their third-party contractors, the freedom to choose what sort of content is visible to their billions of users and what sort should be hidden. This has triggered an inevitable debate over whether the platforms have too much say over what is removed from view.
Nicolas Suzor, an associate professor of law at Queensland University of Technology and the deputy chair of activist group Digital Rights Watch, says tech companies have an incredible degree of power over the information we see.
“It’s a big, messy, complex system, one that relies on tens of thousands of moderators around the world trying their best to make the right decision,” he said.
“Platforms have ultimate discretion over what to post. They have clauses that say they can remove any post for any reason. There are really no legal limits here constraining the ability of these companies to moderate content.”
One area in which free speech has been a particular focus relates to the rights of social media companies to remove terrorism- and hate speech-related content. Workers contracted by Facebook claim they are required to remove content relating to a list of 37 terrorist organisations supplied to them by the US Department of Homeland Security, raising further questions about just whose version of the truth is being upheld.
Zirnsak says there is a need for governments globally to agree on regulation of multinational social media corporations. “Otherwise these corporations get to decide what we can and cannot see and which serious crimes that happen online get investigated.”
In the wake of the Christchurch massacre, in which a gunman live-streamed the murder of 50 people, governments have put pressure on social media companies to act swiftly to remove dangerous and violent content. But Suzor says this risks pushing them to delete more than they should.
“Facebook removed that particular video in 37 minutes … it would be really problematic to get those decisions made much faster than that,” he said.
Sorry, Crikey, but journalism fail for this article. The desire – or business strategy – to be ‘edgy’ doesn’t justify click-bait or tabloid techniques. It’s a strategic mistake too. You’re creating cognitive dissonance in your readers, trying to paint the platforms many of us use to keep in touch on a daily basis with Aunt Jemima or the grandkids as tools of unspeakable monstrosity. The big social media platforms are easy targets for reporters but I expect a slightly more nuanced analysis from Crikey than I would get from a third-rate Murdoch rag.
I thought the article was about the treatment of third world workers who are forced to deal with the problem with poor wages, no danger money and no policy for spreading the risks and guarantees to help people who get sick as a result of the job. That is the problem as I saw it, and something needs to be done to force media companies to deal better with the workers who clean up the mess created by giving the possibility of publication to hate content.
I, for one, had no idea that such sweatshops existed for this purpose. That they do gives pause for thought, as nobody should be subjected to true awfulness on this scale and intensity even with full support, never mind being left entirely to their own midnight horrors. I met Jim Gamble when he was chief executive of CEOP and know the care that was taken for those working in this most difficult of areas, that of Child Exploitation and Online Protection. There has to be a means of identifying the jurisdictions this material originates from and a follow-through of referring it to the relevant law authorities. I know that what is lawful varies with jurisdiction but would hope that any Australians involved might then have their collar felt. As things stand, the only penalties being paid are by those poor souls hunched over screens day in, day out.
Any journalism that tells you things you didn’t know, and should, works for me.
I think INQ is kicking goals big time so far. Long form, strategic, joined-up, and untethered from the 24/7 business-as-usual pap of the reactive meeja. Appreciating it a lot.
Very much so, Jack. Writing styles differ, as you would expect, as do the personal interests of the reader. You can just hope that the overall financials are good enough to continue to support this worthwhile exercise in informing.
I naively thought that content was censored by fancy algorithms, not Filipino workers. I guess they are being paid to do it, albeit not much, and they can leave if they want. Nevertheless they should have support mechanisms around them as well. Must try and catch that doco.