
Social media moderators’ lives are getting worse. Big Tech needs to take responsibility
Content moderation is vital work. The people tasked with reviewing and removing posts on the world’s social networks help shield billions of users from some of the darkest images imaginable. The reason you don’t see murder, violence and abuse on your timeline is that someone else saw it instead and took it down. Even in the best possible conditions, their job is hugely difficult.
But the conditions are very often terrible. Paid little, given gruelling targets and offered scant psychological support, content moderators are the forgotten foot soldiers on the front line of social media.
They are also kept at arm’s length from the companies they ultimately serve. Rather than hire these workers directly, social media giants like Meta and TikTok generally use intermediaries: firms paid millions of dollars to supply the tech industry with cheap outsourced labour.
Two years ago, one of these contractors, a smaller firm called Sama based in Kenya, was exposed by Time for employing Facebook moderators who reported low pay, trauma, intimidation and union-busting. When the moderators were laid off following the revelations, they launched legal action against Meta. Former Sama workers on the Meta contract have since been diagnosed with work-related PTSD.
Sama has called the allegations “disappointing and inaccurate”. It said it values its employees and invests heavily in training, personal development, wellness programmes and competitive salaries, while Meta said it requires its partners to provide industry-leading pay, benefits and support.
Shortly afterwards, we teamed up with Time to expose similar conditions faced by moderators in Colombia employed by the huge outsourcing firm Teleperformance. Tasked with keeping TikTok free of extreme violence, sexual abuse and other horrific content, they told us of $10-a-day pay, punitive salary deductions and extensive workplace surveillance. They also said Teleperformance had tried to stop them unionising. Neither TikTok nor Teleperformance responded to the specific allegations, but both issued statements saying they were committed to the wellbeing of their employees.
When the conditions of moderators are exposed, their employers and the tech firms tend to make these sorts of statements, promising that their workers will be looked after and protected.
Today, TBIJ and the Guardian have revealed that those promises are worth very little. After its Kenya operation resulted in lawsuits, Meta cancelled its deal with Sama and hired a new contractor, but kept the new company and location a closely guarded secret. We’ve found out that the company in question is Teleperformance and the new operation is in Ghana – where conditions are said to be even worse. The allegations are by now familiar: low pay, punishing targets, psychological trauma, inadequate support, workplace surveillance, union-busting.
Teleperformance told us that it provides moderators with a wellbeing programme and that its moderators in Ghana receive base pay of more than double the minimum wage. Meta said its contracts with outsourcing companies contain expectations including counselling and support, as well as an obligation to pay moderators above the local industry standard. It said it respects the right of outsourced employees to organise.
Yet lawyers who visited the workers in Ghana told us these were the worst conditions they had seen in six years of working with moderators around the world. It is dispiriting to see things continuing to get worse for those tasked with protecting social media users from harm. It is also deeply worrying that Meta tried to keep the details secret: moderators told us they were forbidden from telling even their families what they were doing or who they worked for.
Without these workers, few people would use the social media platforms that so many of us spend time on. That makes moderators an integral part of the business that drives huge profits for the likes of Meta and TikTok.
They deserve to be supported, protected and paid decently for their work, and to be able to speak out when they aren’t. It’s up to the companies that employ them to make that happen, and up to the people who use social media platforms to make their voices heard when they don’t.
Reporter: Jasper Jackson
Deputy editor: Katie Mark
Editor: Franz Wild
Fact checker: Ero Partsakoulaki
Production editor: Alex Hess
TBIJ has a number of funders, a full list of which can be found here. None of our funders have any influence over editorial decisions or output.