Who Pays the Price for Our Digital Comfort?
The Invisible Exploitation in Kenya
We can access summaries of hundreds of pages in seconds, learn the content of an hour-long video from a few lines, and complete, in a very short time, coding tasks that would take a programmer days. Artificial intelligence saves time, increases productivity, and sometimes even saves lives through medical diagnoses.
However, behind the "miraculous" level AI has
reached today lies an invisible cost. This price is paid with the labor, time,
and mental and physical health of people living in less fortunate regions of
the world. There is a significant group of people whose mental and physical
limits are pushed so that safe content can be produced for users.
The Dark Side of Artificial Intelligence
The short videos we watch on YouTube and the "reels" we encounter on social media every day undergo a "data labeling" process so that AI algorithms can be trained and harmful content filtered out. This process cannot yet be automated. Videos are watched one by one by people who decide what each one is about, what it contains, and what it represents. In other words, AI is still learning from humans.
Data labeling is an extremely arduous and grueling task. In
addition to the mental and physical fatigue caused by staying in front of a
screen for hours and constantly having to make decisions, the psychological
exhaustion resulting from exposure to extremely disturbing content is very
difficult to compensate for.
In the West, this work was relatively well paid, and the physical and psychological problems it caused were covered within insurance systems. However, a new way was found to reduce these costs and risks: to protect the mental and emotional health of its own societies, the West moved this line of work to a geography it has exploited in various forms throughout history: Africa.
Another global power that realized this "solution"
was China: China also followed the same path as the West, turning to the
African continent, especially Kenya, known as the "technology
savannah," which offered more attractive conditions.
Why Kenya?
With youth unemployment exceeding 60 percent, Kenya offers global technology companies an ideal digital labor pool: education levels are high, English proficiency is widespread, and the country's working hours overlap with those of both the West and China. Regulations are lax, and worker protection mechanisms are weak. The high unemployment rate forces even qualified young people to accept almost any conditions.
Kenya is also marketed as Africa's ideal "technology
hub" with its digital service infrastructure and low costs. This time,
exploitation is legitimized by the discourse of "digital development"
and "youth employment."
The Meta–Sama Case: The First Breaking Point of Visible
Exploitation
The picture of digital exploitation targeting Kenya first
became visible on a global scale with the Meta–Sama case that erupted in
2022-2023. Sama, a US-based subcontractor handling content moderation work for
Meta's Facebook and Instagram, forced hundreds of Kenyan youth it employed in
Nairobi to filter extremely psychologically burdensome content for very low
wages.
Employees exposed to thousands of images containing murder,
child abuse, rape, and extreme violence began to show serious trauma symptoms
over time. According to testimonies reflected in court files and the
international press, these digital workers were not clearly told about the
nature of the content when hired; only vague descriptions like "technology
support services" were used. Adequate psychological support was not
provided to employees suffering from post-traumatic stress disorder, depression,
and anxiety disorders.
Everything was legal on paper: there were contracts, the company names were known, the offices were official. Yet thanks to legal loopholes, weak worker protection mechanisms, and high unemployment, the resulting picture was, in ethical terms, a clear system of exploitation.
Jobs that began with simple, harmless questions like "how do you make coffee?" evolved over time into savage, shocking material like "how do you cook human flesh?" or "how do you kill a baby?" Digital workers were forced to face the dark side of building AI. Protests were organized and legal avenues were tried, but no concrete gains were achieved.
Digital Exploitation Continues
The form of exploitation changed but did not disappear.
Today, Western AI companies post fake ads; job descriptions use phrases like
"customer support services." The real job emerges long after the
hiring process.
China, on the other hand, follows a far less regulated method, detached from any legal framework. A system has been established that operates entirely within closed, referral-based networks, with no registered company, no contract, and no point of contact. In this system, the young people labeling videos often do not even receive the wages promised to them. For a job that pays $20 an hour in the US, Western and Chinese companies offer Kenyan youth $2.
The process usually starts with WhatsApp groups of about ten
people. In these groups, they are asked to watch and label tens of thousands of
short videos, each averaging a few seconds. When the weekly quota is not met,
groups are closed without any explanation; there is no point of contact to
demand rights.
Payments are made via M-Pesa, the mobile payment system
widely used in East Africa; but employees are not provided with any payslip,
contract, or legal record.
While the West exploits existing legal loopholes, China
advances by establishing unregistered digital networks.
Thousands of Videos, Thousands of Micro-Traumas
There is extensive research on the harms of watching short
videos. Experts emphasize that consuming short but intense content leads to
sudden dopamine spikes in the brain; this weakens long-term concentration
ability, dulls abstract thinking, and creates serious cognitive stress.
Insomnia, eye and spine problems, chronic tension, and neurological disorders
are among the common consequences of this process.
A Kenyan youth forced to watch and label tens of thousands
of short videos a day is actually forced to make tens of thousands of
micro-decisions. This situation rapidly erodes mental capacity; sleep is
postponed to meet the quota, and the body and mind are kept in a constant state
of alarm.
Moreover, the videos themselves often contain violence and extreme sexual material. For hours, these workers watch people having their throats slit, children being abused, bodies burned alive, mothers killing their own babies, and pornographic content.
Digital workers exposed to violent content for long periods
develop serious psychological problems over time.
Digital Awakening
This entire picture is transforming Kenyan digital workers not only psychologically and physically, but also politically. They have become not only the exploited but also the first actors to expose this system.

For artificial intelligence to become more useful and trustworthy, someone must do this grueling work. The problem is that these benefits, and the "safe" platforms built on them, are produced through boundless exploitation.
The demands of digital workers are extremely humane: a fair
wage, psychological support and guarantees that should be provided after
exposure to heavy content, and secure employment contracts.
While companies earn billions of dollars, it is a clear
injustice that those doing the most difficult and risky part of the job cannot
share in these earnings. Digital exploitation points to a structural problem
beyond individual cases.
Similar digital labor networks are rapidly spreading today
in countries like Uganda, Rwanda, Ethiopia, and South Africa; Kenya is just a
starting point for many companies.
Conclusion
The digital devices we carry in our pockets today are the
product of bodies exploited in cobalt and coltan mines in Congo. The
"intelligent" systems inside these same devices work thanks to minds
and souls exploited in Kenya. In Congo, the ground is excavated; in Kenya,
minds are excavated. One is the exploitation of bodies in mines, the other is
the trauma of minds behind screens.
*The original of this article was published in Independent Turkish.