
Toxicity and trauma: a day in the life of a dating app moderator

Content warning: This story contains references to sexual abuse.

Gael*, who was a freelance moderator for Bumble until earlier this year, would usually work a night shift. From his home in Brazil, he’d clock on at 9pm or 10pm and work for up to six hours, finishing in the early hours of the morning.

Once he had logged on to Bumble’s system, he’d begin working his way through a queue of reports in multiple languages.

There were various ways users could violate Bumble’s rules. “So we have underage [users], abusive/offensive [content], which can come in various forms. Then you have a lot of sexual harassment – it’s one of the main issues in the platform,” he said.

“Discrimination … toxic male behaviour as well. And then commercial, so profiles that try to make money somehow, either through scams or legit businesses.”

After reviewing a report, moderators decide whether Bumble’s rules have been broken and whether the user should be banned from the platform. “If somebody sends a [nonconsensual] nude picture to somebody, that’s an immediate block,” Gael explained. If a user says something racist, “that’s a straight block”.

He went on: “Then you have more hardcore situations, such as paedophilia, for example. Things like that have a higher category of block because it’s situations that need to be investigated as well.”

In these instances, Gael would escalate the ticket to a more senior internal team that dealt with illegal behaviour.

Some of the abuse reports were submitted by Bumble users, while others were generated automatically. “The machine-learning software detects certain words or sentences or images, [and] that triggers the system,” he said. “And we needed to double check if the information is actually valid.”

Gael said he was asked to get through 100 of these reports per hour, or one every 36 seconds, though moderators were given different targets depending on their task.

Gael generally managed to meet his target, but found it draining. “It’s really brainwashing for the agent, because at the end of five hours, you went through … 600 reports of abusive behaviour.”

Some of the most serious cases he dealt with had a lasting impact.

“I had two cases that really troubled me,” he said. “One was [images of] intercourse between somebody who appeared older with a very young girl. And the other case was similar, very explicit nude photos of a young boy.”

While Bumble’s staff members can access mental health support, these benefits were not available to freelancers like him.

Gael felt he had not received adequate training to deal with such distressing images. “It’s like they expect for you to be smart and agile enough to handle these types of situations,” he said.

“It’s messed up because the platform itself is created towards safety for users, but it doesn’t really think about the safety for their agents.”

Bumble told TBIJ that freelance content moderators are given control over the tasks they choose to complete and that their daily workload is determined by moderators themselves. The company said it has doubled its investment in safety since 2021.

Bumble also told TBIJ that moderators are provided with clear and consistent enforcement guidelines and reporting requirements for different types of suspected abuse. It said it provides “innovative benefits to front-line content moderators whose work helps keep the platform as safe as possible”.

Gael was paid €10 per hour – a very good rate, he said, when converted into Brazilian reals.

“Although it paid really, really well, it was affecting my life a lot,” he said. “One or two weeks after quitting and changing to the job that I’m doing now, I was a lot lighter.

“It’s a tough job, it was an experience – but I don’t wish to do it again.”

* Name has been changed


Reporter: Niamh McIntyre
Tech editor: Jasper Jackson
Deputy editor: Chrissie Giles
Editor: Franz Wild
Production editor: Alex Hess
Fact checker: Billie Gay Jackson
Illustration: Anson Chan

Our reporting on Big Tech is funded by Open Society Foundations. None of our funders have any influence over our editorial decisions or output.