This analysis was first published in SvD Näringsliv, in Swedish, on March 25th, 2026. This piece was translated from Swedish by Claude. Some phrasing may differ from a human translation.
Meta and YouTube caused a young girl to suffer anxiety and depression. An American jury has now established this in a landmark case. The verdict opens the door to an avalanche of similar cases — and could tear down the tech giants’ unique legal protection.
“Section 230 of the Communications Decency Act of 1996.”
This brief piece of American legislation, whose key provision runs just 26 words, is what has legally protected social media companies all these years.
Under it, responsibility for uploaded content (images, text, videos) belongs to the person who uploaded it, not the platform. Services like Instagram, YouTube, and TikTok have been regarded as neutral platforms.
But that came to an end on Wednesday evening, when an American jury, relying on other legal grounds, found that the companies could nonetheless be held liable for harm.
The method is legally creative. Both Donald Trump and his predecessors have said they want to repeal Section 230. None has managed to do so, and a completely different approach was therefore needed to test what responsibility social media companies bear for young people's wellbeing.
This pilot case advanced the thesis that it is the design of the services that has harmed young people, not the content: the recommendation algorithms, for example, and the ability to scroll endlessly. In this way, the plaintiffs' lawyers were able to circumvent the deadlock that has existed when it comes to holding social media companies accountable.
The main figure in the case is a young woman, now aged 20. Of the many thousands of cases making similar claims, hers was considered among the strongest, and she was therefore a good candidate to go first. What follows is an avalanche of further cases in which predominantly young people will attempt to obtain redress for harm they believe they have suffered through their use of social media.
The damages of 3 million dollars (around 28 million kronor) are a rounding error for Meta and YouTube, the companies found liable. Going forward, there may be a larger class action, or a near-endless number of smaller cases brought by individuals making similar claims. But above all, the symbolic value of the loss is enormous. Many have compared it to the cases brought against tobacco companies in the 1990s. Those proved expensive for the companies, but they also marked a turning point in public perception: tobacco stopped being accepted in the same way after that.
It is entirely possible that the same shift in attitudes will happen here.
Parents in particular have long been worried about the impact they perceive social media to have had on their children. Research on this has been slow and insufficient, and some of the strongest evidence of harm has, ironically, come from inside companies like Meta. Internal documents and studies have shown how certain features could be addictive for young users. And yet nothing definitive has happened — beyond individual bans for young people in certain countries, such as in Australia.
After Wednesday's verdict, legislative proposals of this kind are likely to accelerate sharply around the world. The American ruling does not technically bind other countries, but in practice it is likely to influence them anyway. What was once an open question is now a decided case in which social media companies have been found to have harmed a young woman. Which politicians will be able to let this continue unimpeded in their own countries?
Companies like Meta, YouTube, and TikTok now face a crisis. They face emerging bans for young people around the world, an enormous number of legal cases of a similar nature, and a potential shift in opinion on these companies' social responsibilities.
Social media has been a self-evident part of many young people's daily lives; parents, at least, are now likely to use the verdict as grounds for trying to restrict it. Some advertisers may also come to want to avoid being associated with these platforms.
Earlier this week Meta also lost another case in the American state of New Mexico, for failing to protect children from abuse. Despite what are likely to be endless appeals before final verdicts are reached, the case involving the 20-year-old woman marks a clear turning point. It may be that social media has reached its peak — and is now beginning a downward journey.
Not everyone will delete TikTok from their phone immediately, but a new era is beginning. Social media can harm young people. It will now be difficult for anyone to ignore this.