
My two years reporting on Big Tech’s hidden scandal
Exhausted, traumatised and underpaid, content moderators are the invisible victims of a labour system in need of overhaul
Isis members using a severed head as a football, someone wearing a flayed face like a mask, a person being covered in petrol and set alight.
These are just a handful of the videos I’ve heard content moderators describe in my role as Big Tech reporter at TBIJ. The images conjured up by the interviews have been difficult for me to forget – let alone for the workers who had to see them first-hand.
Content moderators are the invisible cogs in the vast machinery of Big Tech. There are tens of thousands of them, based all over the world, serving some of the best-known tech platforms: Facebook, Instagram, TikTok, Tinder. Their job is to look at the images, videos and text posted online and decide whether they should be deleted or restricted. Much of what they see is mundane. But some of it leaves lasting scars.
While some are employed by the tech firms directly, many more are employed through faceless outsourcing companies – an arm’s-length set-up that enables exploitation while hindering accountability.
Over the past two and a half years, our team has spoken to around 100 moderators and other safety workers based in 12 countries. And though each person has their own story to tell, certain common themes emerge: the demands are high, the wages are low, the work is precarious.
The theme that comes up most often is the impact the work has on their mental health. Workers have described symptoms of depression, PTSD and anxiety, and told us of disturbed sleep, tremors and heightened sensitivity to noise. In multiple cases, moderators told us they had attempted to take their own lives.
Others have described more subtle changes. I’ve been struck by the number who spoke of a feeling of numbness or desensitisation to disturbing images after the initial horror wore off. One of the more surprising findings from my interviews was the number of moderators who’d taken up smoking.
Employers always provide some form of mental health support to moderators, but the quality of this varies. Some workers – generally those directly employed in the UK and the US – have access to high-quality mental health support whenever they need it. For others, it’s not so simple. TikTok moderators at Telus in Turkey said they had to use their designated “wellness breaks” to see the in-house psychologist, while several workers at TikTok contractor Teleperformance in Colombia described the support as inadequate.
One Honduras-based moderator working on the Grindr team at its contractor PartnerHero described the support available to him as worthless: “You call them, and they go through … the same questions like ‘How do you feel?’, ‘Can you just disconnect yourself for a moment?’, ‘Take a deep breath’, ‘Can you ask for a day off?’, ‘Can you talk to your supervisor?’ That’s it.”
(Both Teleperformance and PartnerHero told us they were committed to their employees’ welfare. Telus said it has “a comprehensive well-being framework” and offers unlimited wellness breaks.)
On top of this is the stress that comes from the insecurity of the job – a function of a system where outsourcing companies around the world are competing for the tech giants’ lucrative contracts. It’s a global supply of labour that leaves workers disempowered and dispensable.
“It’s very easy to move these jobs from one place to another and that creates a race to the bottom for wages, working conditions, protections,” Mark Graham, director of labour research unit Fairwork, told my colleague Claire recently.
Moderators’ jobs are also at risk from tech itself. When we did our first moderation story in 2022, an expert told us that the cost of the computing required to use AI to moderate videos was prohibitively high. Two years later, TikTok told us that “80% of violative content [is] now removed by automated technologies”. In November last year, TBIJ reported that more than 100 of TikTok’s UK-based safety jobs were at risk, and in February its global safety teams were hit by redundancies, according to Reuters.
While moderators’ pay varies dramatically depending on their location, they tend to be among the lowest-paid workers at a tech company. Those who are outsourced will be paid even less. Outsourced Meta moderators in Ghana, for instance, earn a base wage of just £65 a month. (Meta told us its outsourcers are obliged to pay moderators fairly.)
Meta made $62bn in net profit last year. TikTok’s parent company made $33bn. Their contractor Teleperformance made $589m. These companies can afford to pay their moderators more – they are simply choosing not to.
Virtually all moderators’ pay depends partly on their ability to meet performance targets, with bonuses that can make up as much as a quarter of their total salary.
“You have to just work like a computer,” one Colombian TikTok moderator told us. “You pick the policies, no more. Don’t say anything, don’t go to bed, don’t go to the restroom, don’t make a coffee, nothing.”
I’ve heard a range of opinions about how reasonable these targets are, and this varies significantly by company and team. Many felt the targets were attainable; others felt they were set at an unsustainable level. I’ve spoken to several moderators who were fired for not meeting their targets – including one Honduras-based worker who said their role was terminated in the midst of a severe mental health crisis.
We’ve reported on various allegations of moderators losing their jobs after trying to improve their conditions. Two years ago, Facebook moderators in Kenya reported low pay, trauma, intimidation and union-busting. When they were laid off following the revelations, they launched legal action against Meta.
Those lawsuits are among numerous attempts by or on behalf of moderators to hold their employers accountable. But while some of these efforts have made meaningful progress – such as the formation of a global trade-union alliance for moderators – the same problems persist across the industry. The tech giants and their contractors have largely avoided accountability.
The reasons for this are systemic. The industry is shielded by a culture of secrecy, often enforced through NDAs that promise harsh penalties for workers who speak out. The companies have vast resources to fight legal cases brought against them. And if they find that a country is becoming an unfavourable place to do business, they can simply move the jobs elsewhere. (After the lawsuits were brought in Kenya, Meta relocated its African moderators to Ghana – where they face even worse conditions than before.)
What could be done to improve the lives of content moderators? There should be consistent access to mental health support, not just during break times. Targets should be clearly explained, and moderators should be able to query them if they feel they are unreasonable. Moderators should be able to raise concerns about their working conditions and form unions without fear of retaliation. And since researchers at Cornell found that “only a few very recent scientific studies have set out to measure or improve the wellbeing of content reviewers”, robust information about the prevalence of mental health issues among moderators should be gathered.
Perhaps most fundamentally, they should get a pay rise. The work they do is valuable and meaningful, but its importance isn’t recognised. Without them, tech companies would expose their users to serious harm, face legal repercussions and risk losing valuable advertisers.
Ultimately, moderators are the victims of a labour supply chain in need of overhaul. But the companies employing them should not escape blame. They can do better – and they should.
In the UK, Samaritans can be contacted 24 hours a day, 365 days a year on freephone 116 123 or by email at jo@samaritans.org. Other international helplines can be found at befrienders.org.
Reporter: Niamh McIntyre
Big Tech editor: Jasper Jackson
Deputy editor: Katie Mark
Editor: Franz Wild
Production editor: Alex Hess
TBIJ has a number of funders, a full list of which can be found here. None of our funders have any influence over editorial decisions or output.