Apps are abused – and it’s their own fault

This analysis was first published in SvD Näringsliv, in Swedish, on November 29, 2020.

SvD’s exposé of the app companies shows how difficult it is to prevent and catch those who exploit children. But the fact that the technology is used in a way that was not intended is no excuse. Instead, it should be the starting point for a larger process of self-criticism.

“We can do better,” say Facebook and Twitter. “We have tightened security,” claims Star Stable. “Delete is our default,” promises Snap.

Self-criticism mixed with big words, but in the end it turns out to be worth very little. As SvD’s exposé has shown, it is difficult to prevent individuals, organizations, or nations from using these platforms in harmful ways. And yes – it is difficult. But it is not a law of nature that these services had to be so hard to protect from people with bad intentions in the first place. The companies themselves have consciously chosen to accept these negative side effects when designing their platforms.

Take the company Snap, for example. A large part of its service, Snapchat, is built around messages that are temporary and disappear after a short time – an idea later copied by both Instagram and Twitter. The ephemerality is the charm.

But the ephemeral also has inherent limitations – not least from a security perspective. This is why police say that Snapchat is the online pedophile’s favorite app – a risk Snap should have anticipated when creating the service, because it is obvious. And by choosing to build the service this way anyway, Snap deprioritized that risk.

The gaming industry is in a similar situation. All trends point toward letting players participate in the creative process and interact with each other. It is fun, and it fosters incredible creativity and a sense of belonging among many players. But when people are free to create, risks also arise – as in the hugely popular children’s game Roblox, where players could witness what looked like a gang rape. It is a very small part of a game that millions of people play every day, but it is still there.

I have experience from the gaming industry myself, at a company that made products for children. I understand the allure of riding the trends to create more fun, more viral games. At the same time, I also remember the conversations we had in parallel – the ones about whether we could guarantee children’s safety, whether we could be sure that our games could not be abused in any way. And when we concluded that we could not be sure, we refrained from building that kind of game. It is possible to say no, even as a tech company.

Should you then never be able to design games or services that carry the slightest risk? Of course you should. But you can also prevent the risks that arise.

You can commit enough resources to minimizing them from the start. Security in games and services is often pitted against simplicity for users, but that is an oversimplification. There are countless examples of successful games and services whose problems here are nowhere near as big as Snap’s. On the contrary, innovating here could be a competitive advantage. One could wish that as much creativity and as many resources went into finding new solutions to this problem as go into inventing new game concepts.

We should stop accepting “it is difficult” as an excuse for tech companies’ self-inflicted problems. The responsibility begins with the design of the product. That is where the difficult questions need to be answered – not once services and games are already in the hands of children.

