This analysis was first published in SvD Näringsliv, in Swedish, on February 26th, 2026. This piece was translated from Swedish by Claude. Some phrasing may differ from a human translation.
AI is not quite as artificial as many believe. A new investigation by SvD reveals how private moments are shared with office workers in Africa, whose job it is to train AI systems. Invisible to the user — but absolutely crucial to the company.
We talk a lot about how we can use AI.
Perhaps we should talk more about how AI uses us.
In an investigative report by SvD and GP, we get a glimpse of what jobs in the so-called “Silicon Savannah” look like. At a subcontractor for Meta, people in Nairobi sit and watch video clips. Their job is to tell AI systems what can be seen in the footage.
It is called “annotating data,” but it can be explained more simply than that. It is when a human helps AI understand what is happening on screen, in the hope that one day it will be able to manage on its own. But before we get to that point, it needs help — drawn boxes with explanations. And it needs the raw material — the data — to get there.
In this case, the video clips come from Meta Ray-Ban — a pair of glasses equipped with a camera and microphone. With them, the company obtains a type of data that is hard to come by otherwise. If someone went around filming you on the street, you would probably ask what they were doing. You would likely speak up and ask them to stop. But when it happens through a pair of glasses, it is not as obvious what is going on.
For AI systems to get better, they use the data we produce.
There are plenty of cases where this exchange of data can be beneficial for both parties. Having one’s genetics reviewed can help identify health risks. In some cases, it can be the missing piece for understanding why one feels the way one does.
A considerably more trivial exchange occurs every time you go on Facebook or Instagram. They know who you are, and they can indirectly sell that information to advertisers who want to reach you. What you get back is an entertaining service. You have agreed that they may do this — otherwise you would not be able to use these services. But you probably do not think about it every time you use them.
While the festive photos people share of themselves on Instagram are a conscious choice, the cameras on Meta Ray-Ban face the other direction.
The report describes how workers in Nairobi are required to watch a long series of private events — sex, toilet visits, intimate conversations — as part of the job. Situations that could not reasonably have been intended to be shared with anyone, yet ended up on a screen in Kenya. Did those who were filmed even know this could happen?
It is said that you cannot walk many metres in London without being captured by one of the roughly one million surveillance cameras in the city. Signs inform people of this, but there is no opting out even if one wanted to. It is the price you pay for moving around the city. The phenomenon is therefore not unfamiliar, but now it can happen anywhere. In changing rooms, at a restaurant — or in your own home.
The purpose also differs. One can certainly have opinions about the surveillance cameras that exist, but the purpose is to be able to resolve, and hopefully prevent, various types of crime. Now we are putting cameras on our faces to let an AI service tell us what we could reasonably already see for ourselves.
For the time being, artificial intelligence is not always entirely artificial, and not always particularly intelligent either. There are thousands of people sitting in offices in Africa whose job it is to tell these systems what the difference is between an armchair and a sofa. They train AI systems to become smarter.
The human component in all of this work is almost invisible to the user, but is absolutely crucial. As a result of this, a great deal of information and data about you ends up on the screen in front of individuals who watch, assess, and review what you do. They are working. But not for you — for AI.
The work was halted after SvD’s and GP’s investigation. Now 1,108 workers have been given notice of redundancy, according to multiple international media outlets.
In an office building on Mombasa Road in Nairobi, thousands of the AI revolution’s heavy labourers work. People who are supposed to teach the systems to understand what is captured in images.
They are called “annotators” and their work is invisible to the end user. The company they work for is called Sama. Until recently, its major client was the tech giant Meta, which is investing heavily in camera-equipped glasses powered by AI technology.
SvD and GP were able to reveal in February that the annotators had been exposed to a great deal of private material in the course of their work. It could involve sex, naked people, and toilet visits filmed via Meta’s glasses.
The consequences for the tech giant have been severe. Four data protection authorities on two continents have acted against Meta by demanding answers from the company or opening their own investigations.
Earlier this week it became clear that Meta is completely halting its collaboration with Sama in Kenya after investigating the claims that emerged in the investigation.
Now the consequences are falling on Sama’s employees. A total of 1,108 people risk losing their jobs as a direct result of Meta’s decision, the company states in a written announcement.
“We are aware of the significant impact this has on the team and the local community. We are actively working to support the affected employees with care and respect,” the company writes.
The news has been reported by, among others, NTV Kenya, The Guardian, and AP. Sama states in its announcement that negotiations with Meta have been ongoing, but that the talks have been fruitless.
Kauna Malgwi, who previously worked for Sama, tells The Guardian that the layoffs show how the AI industry works.
“The power lies with the big tech companies and the risk is pushed downwards and affects the workers, often in the Global South, who have the least protection.”
Meta has stated that the company has investigated the claims in SvD’s and GP’s investigation.
“Last month we paused our work with Sama while we investigated these claims. We take them very seriously. Images and videos are private to users. People review AI content to improve product performance, for which we get clear user consent. We have also decided to end our collaboration with Sama as they do not live up to our standards.”
Meta has not, however, responded to follow-up questions about what its investigation has found. Sama has stated that the company “adheres to strict standards of data security and privacy, including GDPR and CCPA.”
After SvD’s and GP’s investigation, the Facebook owner is ending its collaboration with the annotation company Sama in Kenya. At the same time, employees speak of retaliation, insecure working conditions, and wages not paid on time.
According to multiple sources, after the work stoppage employees have been going to work without receiving any tasks. Instead, they have been sitting in front of their computers refreshing the screen roughly every eight minutes to show that they are present.
“Nobody is working on annotation in these projects anymore,” says one source.
After the investigation, employees at Sama report that conditions have worsened, that security at the company has been tightened further, and that the company has attempted to track down sources.
Eric Mugendi, editor-in-chief of Kenya-based Africa Uncensored, one of East Africa’s largest investigative journalism outlets, says that the tactic of trying to identify sources is common in the country.
“After revelations like these, it is common for pressure on employees to increase, with stricter controls, and then they try to trace who has spoken to the media,” he says.
The slightest rumour can be grounds for dismissal, he adds.
“If one person does something, everyone can be punished. And if you hear something — even if it is just a rumour without concrete evidence — it can still be used against you.”
SvD and GP put questions to Sama about its view of the claims regarding worsened working conditions and the tracking of sources, and about Meta pausing its work with Sama. Representatives for Sama write in a comment that they do not comment on specific client relationships.
Sama also writes that it takes data security very seriously. “We strongly dispute several of the allegations and emphasise that we adhere to strict standards of data security and privacy, including GDPR and CCPA. We are equally committed to the wellbeing and fair treatment of our employees,” Sama writes, adding: “As a company, we do not monitor or track employees with the aim of identifying individuals who may have spoken to the media, and we reject any insinuations of retaliation.”