Creators shock, YouTube ducks: the case of Christofer ‘Chrippa’ Berg

Originally published in Svenska Dagbladet (SvD Näringsliv) by Björn Jeffery, June 22, 2025

Want to succeed on YouTube? It can pay to shock and be controversial. But a recent example shows how easily people can end up caught in the platform’s grey zones.

Every minute of every day, around 500 hours of video are uploaded to YouTube. The Google-owned video service has become the Western world’s second-largest search engine. Who is actually watching all this material?

That question is difficult to answer. The differences between superstars who attract millions of views and home filmmakers with single-digit viewer counts are enormous.

What is easy to answer, however, is who is not watching all the video being uploaded — and that is YouTube itself. So who keeps track of what gets uploaded to the platform? Nobody. And that can have major consequences.

YouTube itself likes to talk about its automated systems that scan material for unsuitable content. And large numbers of videos are removed with the support of these systems — around 9.4 million video clips disappeared this way in the last quarter of 2024. But if the machine-learning systems are like a net, it is not a particularly fine-meshed one. A great deal of material that breaks YouTube’s rules gets published anyway.

There are many examples of this. In the documentary podcast Badfluence from SvD and Podme, 16 examples containing potential violations are sent to YouTube. The creators behind the videos are major figures on Swedish YouTube — Pontus “Anjo” Björlund, Alexander Rask and Christofer “Chrippa” Berg. YouTube is given just over a week to analyse the 16 videos. Shortly afterwards, one of them disappears, and another appears to have been edited. The accompanying comment from YouTube points to the company’s guidelines for what is and is not permitted on the platform. But the review only took place after SvD and Podme had shared the links.

YouTube’s policy most closely resembles a kind of public insurance policy: a document one can point to in order to justify removing or changing material when necessary. Because a policy is easy to write but difficult to uphold.

What emerges are two parallel worlds on YouTube — a set of rules that dictates what you are allowed to do, and millions of videos that have neither been filtered out nor reported by any viewer. In many cases, the two have very little to do with each other.

It goes without saying that the challenge for YouTube of keeping track of all this material is enormous. 500 hours of video per minute amounts to around 720,000 hours of new material to review — every day. How could that even be done? That it requires some form of automation is obvious. And that there will be shortcomings in these systems is equally so.

At the same time, the problem they are trying to manage is entirely of their own making. There are reasons why many other platforms do not allow people to freely upload whatever material they like to their services. Doing so quickly becomes a question of responsibility. Your platform, your responsibility, right?

That would be one way to see it, at least. In practice, YouTube has grown up in what is almost lawless territory, where regulation of tech companies has essentially not existed, particularly not in the service’s home country, the US. Within the EU, new legislative packages have been introduced, and in October 2024 YouTube received a formal inquiry about how content is recommended on the platform. A closely related area, but not an identical one. It is also worth noting that YouTube has existed for 20 years, and society has not progressed further than cautiously beginning to ask questions. That leaves something to be desired.

YouTube’s incentives to change the situation are few. While they do not want material that is directly illegal on their service, the difficulties arise in the grey zones. In the Badfluence podcast one hears about a ruthless world where creators slander each other. The accusations that fly concern everything from various crimes to infidelity. Secretly recorded phone calls, text message conversations and censored nude images are shared. Inappropriate? Yes. Popular? Also yes. The algorithms that determine what viewers are shown favour what is controversial. And the more people who watch a particular clip, the more people may have it recommended to them.

Here we find the underlying problem. The more video views, the more money both YouTube and the creator earn. Being controversial therefore pays. The platform and the creators feed off and depend on each other. In such a scenario, it is easy to understand why the grey zones have been given lower priority.

Unfortunately, it is precisely in these grey zones that people get hurt. Private information is shared, people are violated, lies are spread. YouTube does not encourage creators to do this. But they do not do particularly much to stop them either.

The Author

Björn Jeffery is a Swedish technology columnist, advisor, and independent analyst based in Malmö, Sweden. He is the technology columnist for Svenska Dagbladet and co-hosts a podcast for the newspaper. He was previously CEO and co-founder of Toca Boca, the kids’ media company that grew to over one billion downloads. Through his advisory practice, Outer Sunset AB, he works with companies on digital strategy, consumer culture, governance, growth, and international expansion.