This analysis was first published in SvD Näringsliv, in Swedish, on October 18th, 2023. This piece was translated from Swedish by Claude. Some phrasing may differ from a human translation.
Major news events drive people to social media. But the channels are flooded with disinformation, spread further by the tech giants’ own algorithms.
Working in media means constantly reading obituaries for your own industry. A few have come from this pen over the years too. The challenger in recent years has always been the same: social media.
The appeal is obvious. When major news events occur in the world, both eyewitness accounts and reporting from journalists on the ground are shared on social platforms. The speed and authenticity of active social media users are said to challenge the role of traditional news media.
And it has undeniably become a challenge — for both media companies and society. But not for the reason that was first assumed.
The report “Swedes and the Internet 2023” shows clear differences in how and where people of different generations consume news. Among those born after 2000, 56 percent say they get news from social media — 16 percentage points ahead of the next largest medium, television, at 40 percent. The corresponding figures for those born before the turn of the millennium look markedly different: there, television, radio, and websites are all individually larger than social media.
Beyond the choice of channel, one should also look at the type of news and information being conveyed there. In complex conflicts such as the ongoing one in Israel and Palestine, engagement on social media platforms like X and TikTok was enormous. A large part of the content turned out to be false — everything from video game footage to fireworks from other occasions was presented as part of the fighting. Drawing on the new Digital Services Act (DSA), the EU required X to share content data and the platform was subsequently forced to remove thousands of posts.
Tech companies owning services like Facebook, TikTok, and Instagram have long encouraged news consumption on their platforms.
This is how it sounded in 2017, for example, when Facebook hired journalist and then-CNN anchor Campbell Brown to lead their news team:
“Right now we’re seeing a massive transformation taking place in the news industry — both in how people consume news and how reporters distribute news. Facebook is an important part of this transformation.”
Facebook did indeed become an important part of how news spread in the months that followed — but perhaps not in the way Brown had intended. The following year, 2018, the Cambridge Analytica scandal exploded, and Facebook CEO Mark Zuckerberg was called to testify before Congress.
Two weeks ago, Campbell Brown left Facebook (now Meta, which also owns Instagram and WhatsApp). X has also let go of staff working on similar issues.
What will happen to Meta’s investment in news and fact-checking is therefore unclear. But the timing could hardly have been worse. The fighting between Israel and Hamas is ongoing, as is the war in Ukraine. And within a year, there is a US presidential election. All are major news events with impact far beyond their immediate surroundings.
In Sweden, a parallel form of information distribution via social media is also playing out. SvD’s Emil Arvidson recently reported on how parts of a criminal network livestreamed themselves on a Sunday evening. Around 17,000 user accounts watched as both weapons and gang members were put on display. A live broadcast is just one button press away — whether you are a local politician or a criminal gang.
That individuals are responsible for the content is central to the entire concept of social media. There is something both appealing and democratic about it. Being able to express oneself publicly is now easily accessible to everyone.
The problem, however, is not this form of digital freedom of expression — it is how the platforms amplify these voices to others. When actors consistently spread disinformation through influence campaigns, they do not only reach those immediately around them; they are amplified by the services’ algorithms. News, information, or outright propaganda that engages people gets increased distribution — regardless of whether it is true. When all content and all senders are treated the same way, a culture of discourse is built in which the loudest, most controversial, and sometimes most reprehensible content is often what spreads furthest.
This is worth bearing in mind as more and more people go directly to these channels to form a view of the world. The companies behind social media care about the number of viewers, not the credibility of the source. In this way, the medium shapes its own content.
Given what that content has looked like in recent weeks, it is clear that the tech giants need to take greater societal responsibility. Not everyone needs to get their news from the same place. But it should be possible to tell who the sender is — and, as far as possible, to trust that what is being said is true.