27.04.25 Big Tech

Suicide attempts, sackings and a vow of silence: Meta’s new moderators face worst conditions yet

Hit by workers’ rights lawsuits in Kenya, the tech giant has moved its outsourcing to a top-secret new site – where life is grimmer still

Content warning: This story contains references to violence, suicide, child abuse and self-harm.

A suicide attempt, depression, substance abuse, insomnia, surveillance, threats. These are just some of the experiences reported by the low-paid moderators tasked with sifting through Facebook and Instagram’s most disturbing images.

The tech giant Meta, which owns both platforms, has kept the whereabouts of this operation a closely guarded secret since moving it from Kenya, where the company is facing lawsuits over working conditions and human rights. For months, it has also refused to name the company that won the lucrative contract to provide the content moderators who deal with disturbing content on its platforms. Such secrecy works to shield tech companies like Meta from accountability.

Now, the Bureau of Investigative Journalism (TBIJ) and the Guardian can reveal that Meta has moved this business to a new operation in the Ghanaian capital of Accra – where working conditions are said to be worse in almost every way. The company employing the moderators is Teleperformance, a French multinational with a history of controversy around workers’ rights.

Based in an anonymous office building, the 150 or so moderators spend their days reviewing content flagged on Facebook, Instagram and Messenger, including videos of extreme violence and child abuse. They say they are forced to work at a gruelling pace in order to meet a series of opaque targets that dictate whether or not they are able to scrape by in Accra.

Employees say they have developed mental illnesses and at least one was fired after advocating for better conditions. Many have come from abroad and some told lawyers that they had not spoken out for fear of being forced to return to conflict zones. They say they are instructed to tell no one, not even their families, that they are moderating Meta’s content.

Accounts heard by TBIJ were corroborated by lawyers from Foxglove Legal, a non-profit that is supporting the cases brought in Kenya. Foxglove visited the Accra-based moderators in March and is investigating their allegations.

“After the atrocious treatment of Facebook content moderators we exposed in Kenya, I thought it couldn’t get any worse,” said Martha Dark, Foxglove’s co-executive director. “I was wrong. These are the worst conditions I have seen in six years of working with social media content moderators around the world.”

Teleperformance told TBIJ: “We have robust people management systems and workplace practices, including a robust wellbeing program staffed by fully licensed psychologists to support our content moderators throughout their content moderation journey”. It said that “during the interview process, within the employee contract and through employee training and resilience testing, [it is] fully transparent with its prospective moderators regarding the content they might see during their work to keep the internet safe”.

Meta told TBIJ that it takes the support of content reviewers seriously and that its contracts with outsourcing companies contain expectations including counselling, training and other support.

Mark Graham, professor at the Oxford Internet Institute and director of the Fairwork project, which evaluates tech companies’ work standards, explained that an opaque supply chain means the company at the top – Meta – cannot be held accountable for harm further down. In the garment industry, by contrast, supply chain transparency is now expected.

“It’s very easy to move these jobs from one place to another and that creates a race to the bottom for wages, working conditions, protections,” Graham said. “Workers are extremely replaceable”.

‘I’m not the person I was before’

Almost three months after Solomon* tried to end his own life, his job as a content moderator was terminated and Teleperformance booked him on a flight back to east Africa, where his hometown is in the midst of an insurgency.

“They told my friends, ‘Because [he] attempted suicide, [he] cannot continue with Meta,’” Solomon told TBIJ. One of Solomon’s colleagues said they believed his mental health crisis was kept quiet because Teleperformance managers were worried the company would lose its contract with Meta.

Solomon’s suicide attempt was the culmination of a mental health crisis that he believes was brought on by his work. He said he had been in a precarious mental state since starting the job almost a year earlier. Then, in November 2024, he learned that his best friend had been killed back home – a loss that sent him spiralling into depression.

Grief and anger led him to smash a window in the crowded living quarters he shared with colleagues. When Teleperformance found out, he said, an HR representative threatened to fire him before backing down and placing him on mandatory leave. He claims the company did not provide him with any meaningful support during that time.

In the days that followed, Solomon was left alone in his apartment as horrific images from work looped through his mind. Two videos in particular stayed with him. In one, a man is skinned alive. In another, a screaming woman is beheaded in front of the camera.

Unable to cope, Solomon started to self-harm. He stopped eating, and began to smoke and drink compulsively. “I was not the kind of person I was before,” he told TBIJ.

Late last December, he attempted suicide. When he was discharged from a psychiatric hospital, he learned that Teleperformance had booked him a flight back to east Africa. He said managers told him he needed “family support” – which he sees as the company rejecting its responsibility for workers’ wellbeing. After he decided not to take the flight, he was offered a non-moderation job, which he declined because it was lower paid.

Teleperformance told TBIJ that Solomon had no reported depression or psychological concerns prior to his episode, and that he has declined offers of psychological support before and after his employment. It said that upon learning of his depression it conducted a psychological assessment and determined that he was no longer well enough to continue providing moderation services.

It said he was offered a different role, which he declined, insisting that he wanted to continue in this moderation role. When he left, Teleperformance said, he was paid compensation in accordance with his contract. The company questioned Solomon’s motives for talking to the media.

Solomon feels like he was treated like a disposable object – a “water bottle” that tech companies drained and then threw away when it was no longer of use.

In mid-March, he returned home, where he says his safety is at risk, on a flight booked for him by Teleperformance; the company said it arranged the travel so that his family could provide him with support. The wages he earned in Ghana were so low that he cannot afford to live anywhere else.

“This happened because of the nature of the job,” Solomon said, of his suicide attempt. “But I am determined to fight for my rights.”

The building in Accra where the moderators work. Photograph: Foxglove Legal

From bad to worse

Solomon’s experience is part of a far bigger story about the treatment of content moderators from poorer countries by some of the world’s richest tech companies.

This system is facilitated by outsourcing firms that employ the workers at arm’s length from the tech giants. It was one of these companies, Sama, that oversaw the Kenya operation before Meta terminated that contract in 2023. The contract was reportedly taken over by Majorel, a Luxembourg-based outsourcer that was bought by Teleperformance the same year. But Meta never confirmed the contract publicly, nor did it respond to inquiries about where moderation was taking place in Africa.

Teleperformance employees working on Meta content in Accra appear to have been recruited primarily to moderate content in languages from east Africa, on the other side of the continent. The vast majority are not Ghanaian citizens and are on temporary work permits.

Their wage structure is “uniquely cruel”, said Michaela Chen, a researcher and advocate with Foxglove.

Contracts seen by TBIJ show that the base wage starts at about 1,300 cedis per month – just over $80 – which is less than a fifth of the estimated average cost of living in Accra. Teleperformance told TBIJ that its moderators in Ghana receive base pay that is more than double the country’s minimum wage, which is about 20 cedis, or $1.30, a day.

This is supplemented by a system of “bonuses”, the upper range of which would amount to around 4,900 cedis a month, according to contracts seen by TBIJ. That works out at around $315, still significantly less than the estimated average cost of living in Accra.

Teleperformance disputed the figures, claiming that its foreign content moderators earned roughly 16 times the national minimum wage, and local moderators 10 times, once a range of additional payments were taken into account, and that these payments “are automatically paid to the moderator and are not performance-based”. It also said that half of its workers achieve their targeted performance bonuses. The company did not provide evidence to support its claims, despite being asked to do so.

Performance bonuses are gained by hitting targets, some of which moderators say are set by an automated system that assigns posts for review. If moderators miss these targets, even by a small amount, they lose significant amounts of their performance bonus.

“In order to hit their targets, moderators sometimes have less than a minute – even as little as 20 seconds – to decide if a piece of content needs to be taken down,” Chen explained. “When dealing with distressing and complex content, that is an impossible task.”

Moderators also said that they were paid less if they worked extra shifts, sometimes as little as half of their normal rate.

Abel, a content moderator also from east Africa who had previously worked for international companies, said he told his managers that overtime pay should be higher than normal.

“They had the audacity to say, ‘This is voluntary,’” he told TBIJ. “I told them, ‘I’m not here to [do] charity for a tech billionaire company.’”

Chen said that such low basic wages force workers into what is effectively mandatory overtime to obtain small premiums that keep them above the breadline.

“It’s nothing less than forced exploitation,” she said.

Teleperformance disputed the claims about overtime pay and told TBIJ that “moderators always make more than their daily rate when working overtime, not less”.

Meta told TBIJ that the outsourcing companies it works with are contractually obliged to pay moderators above the industry standard in the countries they operate.

‘You don’t work for Meta’

When Abel was first hired as a Ghana-based moderator by Majorel last year, the situation in his home country was deteriorating and his business had gone under. At his interview, he was told he would be addressing online bullying and harassment – work he felt prepared to handle. After arriving, however, he was trained exclusively on Meta’s policies.

When he logged on at the start of each day, he was greeted with a message that read, “Thank you for helping to protect Meta’s platforms”. But he said he was also under strict instructions to keep it secret.

“[They said] don’t tell anyone that you work for Meta, and don’t tell what the nature of your work is,” Abel told TBIJ. He said his managers would encourage employees to try and forget what they did at the end of each shift. “I told them, ‘We are not robots that you can switch on and switch off.’”

Foxglove corroborated these claims. “Workers spend all day working on Meta’s platforms, yet they are told they do not work for Meta, and are forbidden from telling anyone they do,” said Dark.

After he completed his training, Abel spent 10 months trawling through horrific content. As with Solomon, the work took a toll on his mental health.

“Things that used to be weird to me became normal,” he said. “I used to be spooked when I saw blood. Now, I’ve become numb. It altered my character.”

Abel now experiences nightmares and intrusive thoughts, and says he struggles with relationships. The harms he and others describe are similar to those detailed by moderators who brought cases in Kenya.

“It is not only me,” he explained. “[It’s] everyone working on that floor.”

‘Advanced union-busting’

Abel was fired in late January. While working as a moderator, he had been a vocal advocate for better working conditions and spoke up for others, like Solomon, who could not advocate for themselves. Moderators who spoke to TBIJ and Foxglove said Abel’s firing left them even more afraid to speak up.

Employees in Ghana are also aware that those who fought back in Kenya suffered serious consequences. Claimants say they have been blacklisted by other digital outsourcing companies and are now struggling to find work. Lawyers who represented them say there is good reason to believe them.

“It’s an advanced form of union busting,” said Mercy Mutemi, a lawyer working with Foxglove on the Kenya cases. “It’s a way of rooting out dissenting voices […] and making sure you never even think about unionising.”

Meta told TBIJ it respects the right of outsourced employees to organise.

Abel’s termination letter, reviewed by TBIJ, cites unauthorised absences and the incorrect use of “idle codes” when he was away from his desk. Abel said this was when he was taking care of Solomon, who was afraid to stay home alone after his suicide attempt.

Moderators in Ghana are assigned living quarters away from friends and family, bunked two to a room in a building where, they say, electricity and water shortages are frequent.

Teleperformance told TBIJ that its moderators have the option of finding private housing, for which they would receive a subsidy.

A moving target

Solomon was diagnosed with depression and suicidal ideation in December. He said he was given two months’ severance pay – around $170 – but when he asked for compensation for the harm and for additional long-term psychiatric care, Teleperformance refused. He has now been sent back home, where he is worried for his safety.

Teleperformance told TBIJ that Solomon had declined its offers of psychological support both while he was in Ghana and since returning home.

“Meta’s content moderation model made this tragedy not just possible, but inevitable,” said Foxglove’s Michaela Chen. “Worst of all, Solomon is not alone. But no others have yet been able to speak publicly.”

“Meta saw the game was up in Kenya,” said Martha Dark. “Now they are hoping to get away with causing harm to hundreds of other workers in Ghana. They are displaying a complete disregard for the humanity of [their] key safety workers, upon whom all [their] profits rely – content moderators.”

Mark Graham described the contradiction whereby Meta and the other tech giants at the top of labour supply chains are best placed to set regulatory standards, but are unlikely to regulate themselves.

Cases against Meta and its contractors may yet deliver justice to workers in Kenya and Ghana, but they won’t necessarily improve conditions elsewhere. Part of the issue is that content moderation is highly mobile and African governments see digital outsourcing as a pathway to development. “We’re not able to get ahead of the problem,” Mutemi said.

As the Trump administration pushes for more tech deregulation, Mutemi argues that African countries also need to band together to set minimum labour standards.

“Efforts to fight this have to be at the continental level,” she said, “[so] you can’t get away with it anywhere”.

* Names have been changed to protect workers’ safety

Samaritans can be contacted 24 hours a day, 365 days a year on freephone 116 123 or by email at jo@samaritans.org

Reporters: Claire Wilmot and Rachel Hall
Tech editor: Jasper Jackson
Deputy editor: Katie Mark
Editor: Franz Wild
Production editor: Alex Hess
Fact checker: Ero Partsakoulaki
Illustration: Anson Chan

TBIJ has a number of funders, a full list of which can be found here. None of our funders have any influence over editorial decisions or output.