TikTok’s Content Moderators Are Having Nightmares About Their Work

SvD Näringsliv

This analysis was first published in SvD Näringsliv, in Swedish, on October 25th, 2022. This piece was translated from Swedish by Claude. Some phrasing may differ from a human translation.

TikTok’s algorithm has been praised as the best in the world at selecting content. But the hardest decisions are made by people who watch thousands of disturbing clips every day — for a wage of around one hundred kronor a day.

Luis, a 28-year-old Colombian student using a pseudonym, spoke to The Bureau of Investigative Journalism (TBIJ) about the kind of content he has to watch as part of his job. Luis is a content moderator for TikTok, reviewing material uploaded to the platform. Several people interviewed by TBIJ describe having developed nightmares and psychological problems from the work. To hit the targets set, moderators must watch between 900 and 1,000 videos per day. Pay for this is around 2,800 kronor a month, barely above the Colombian minimum wage.

Much of TikTok’s success is attributed to its algorithms — the software that determines which content is shown to each individual user. It is personalised, so each person gets the content the algorithm thinks they will watch. But that should not be confused with content they actually want. There are countless examples of users being served material they dislike, yet end up watching anyway — something SvD documented in an investigation last year, which focused on eating disorders. That is far from the only example.

With sophisticated algorithms, you might expect this kind of offensive material to be filtered out automatically. But the internal systems designed to do that are far from adequate. Human moderators are needed to fill the gap, manually reviewing disturbing videos.

There is another reason the system works this way: it is cheaper. Roy Carthy, head of marketing at moderation company L1ght, told TBIJ that competing with those low wages simply isn’t possible. For TikTok, solving this problem technically is not profitable.

The situation is familiar. A few years ago, news site The Verge reported how Facebook moderators were developing symptoms resembling PTSD, post-traumatic stress disorder, without being entitled to any support from the companies purchasing their services. Both Facebook and TikTok use subcontractors in low-wage countries for this work, increasing the distance and reducing visibility into exactly how it is conducted and what is done to support the workers.

Another worker, named Alvaro in the TBIJ report, says "you have to work like a computer. Say nothing, don't lie down, don't go to the bathroom, don't make a cup of coffee, nothing." He describes the home environment many moderators work in, where TikTok's subcontractor also monitors them via video camera. Bonuses are paid based on the number of videos watched. Alvaro received a written warning after watching only 700 videos in one shift.

Once again, we are confronted with the priorities of large tech companies. That people upload inappropriate or outright illegal content is not the platform’s fault — individual users are responsible for that. But when platforms create the infrastructure for it, you can be certain of what will follow. No one can reasonably claim to be surprised by the outcome.

As a user, you can sometimes marvel at how good free services are. How can all this content — entertainment, information, culture — be delivered to your phone at no cost? But there is almost always a price. It is just that someone else is the one paying it.

The Author

Björn Jeffery is a Swedish technology columnist, advisor, and independent analyst based in Malmö, Sweden. He is the technology columnist for Svenska Dagbladet and co-hosts a podcast for the newspaper. He was previously CEO and co-founder of Toca Boca, the kids’ media company that grew to over one billion downloads. Through his advisory practice, Outer Sunset AB, he works with companies on digital strategy, consumer culture, governance, growth, and international expansion.