Facebook, YouTube and TikTok content moderators in Kenya form labour union
What you need to know:
- Two hundred content moderators from Sama and Majorel -- the outsourcing firms that serve Facebook, YouTube, TikTok and ChatGPT -- met in Nairobi on Monday and voted to come together to lobby for their rights against mistreatment by the tech giants.
Kenya-based content moderators who work on Facebook, YouTube, TikTok and ChatGPT have established a labour union to push for better working conditions.
The formation of the union comes against the backdrop of a lawsuit filed by 184 Facebook content moderators in Kenya who were sacked in January. They have sued the social media company's parent firm, Meta, for unfair dismissal.
At a meeting held in Nairobi on Monday, 200 content moderators from Sama and Majorel -- the firms that serve Facebook, YouTube, TikTok and ChatGPT -- took a stand against the tech giants' mistreatment of their workers by coming together to lobby for their rights.
In a first-of-its-kind event, moderators covering 14 different African languages came together on Labour Day and voted to establish a union to address issues including the mistreatment of workers.
Daniel Motaung, a former content moderator who was fired after trying to lead a unionisation effort, told the Nation that content moderation is in a state of crisis and that "content moderators are paying for it with their lives".
"They serve as the first line of defence against harmful content, yet they face hazardous work conditions without hazard pay. Mental health support is severely lacking, job security is scarce, and some moderators feel silenced by strict non-disclosure agreements,” he said.
This is not the first time content moderators have attempted to form a union. In 2019, those at Sama, the Nairobi-based office responsible for Facebook's content moderation, attempted to unionise due to low pay and poor working conditions. Workers threatened to strike if their demands for better pay and conditions were not met.
Motaung, the strike leader, was fired and accused of putting the relationship between Sama and Facebook at risk. Sama also told other participants in the labour action that they were expendable, leading the workers to stand down.
'Rest of world'
The tremendous growth of platforms like Facebook and TikTok has led to a series of issues that have gone unaddressed. A critical one is how inadequate content moderation affects general users and entire societies.
When content moderation fails, the consequences can be dire: incitement to violence online can lead to the loss of lives offline, as seen in the Tigray conflict in Ethiopia, while the spread of political falsehoods can compromise the integrity of elections, including in countries like Kenya.
The mistreatment of moderators and other tech workers is also a growing concern.
This problem is worsened by tech giants' lack of investment in countries they refer to as the "rest of world". For example, in 2021, an investigation by the Wall Street Journal found that Meta's Facebook spent 87 percent of its misinformation resources on the United States and Western Europe, leaving the rest of the world vulnerable to the spread of false information.
Critics argue that outsourcing moderation allows tech giants to exploit workers while avoiding responsibility for union-busting.
Nerima Wako, executive director of Siasa Place -- a youth-focussed civil society group in Kenya -- told the Nation that the model distances tech giants from accountability.
"The outsourcing model enables them to evade responsibility when issues arise, as they are not legally registered in the country and thus, not subject to its laws. This strategy serves as a convenient scapegoat for these companies, allowing them to sidestep the consequences of their actions," she said.
Cotu Secretary-General Francis Atwoli, in his Labour Day speech, described outsourcing practices as facilitators of modern-day slavery.
Court case
The new union comes at a significant time for the moderators: the court case against Meta could have wide-ranging ramifications.
Through the union, they hope to rebalance power dynamics and hold tech giants accountable.
Richard Mathenge, one of the moderators present, who worked on ChatGPT, OpenAI's globally renowned AI product, told the Nation: "The union will be a very huge and positive step for us in terms of understanding how we can bring our voices to the table and address the conditions that we have been trying to get highlighted for some time."