Dark reality of content moderation: Meta sued over poor working conditions
A fresh battleground has emerged in the struggle to compel Meta Platforms Inc., the parent company of Facebook, to acknowledge and fairly compensate a group of workers who perform essential and frequently gruelling tasks on its behalf.
This is the third case filed against Meta in Kenya, and it sheds light on the harsh reality of content moderation work.
The lawsuit, filed through their lawyer, Ms Mercy Mutemi, by content moderators engaged through third-party contractor Sama, accuses Meta and its partners of unlawful termination, discrimination and violation of their rights.
The suit reveals the devastating impact of content moderation on mental health and the inadequate support provided by wellness counsellors. It also reveals the scale of the content moderation resources Meta dedicated to Eastern and Southern Africa: 260 moderators in all, every one of whom was let go after Meta terminated its contract with Sama. Forty-three of them have filed the lawsuit.
The content moderators’ working conditions reportedly took a turn for the worse following the filing of a separate case in May 2022 (ELRC Petition E071 of 2022; Daniel Motaung v Samasource and Meta), which cited various human rights violations.
Subsequently, the moderators faced a hostile work environment, with new conditions introduced such as the withdrawal of transportation services and close policing while at work. They also faced uncertainty over contract renewals, with short-term extensions promised but no guarantee of longer contracts.
The court petition also reveals that Sama began letting people go for unjustifiable reasons and that no replacements were hired, despite the acknowledged need for more content moderators.
Content moderation often involves sifting through the darkest corners of the internet, confronting a relentless barrage of disturbing material such as extreme violence, child pornography and terrorist propaganda. The posts moderators view on a daily basis include pictures and videos of people being raped, children being molested and people being slaughtered or burnt alive. Some moderators even saw their own relatives die on the platform.
According to the court petition, the constant exposure to toxic content has left many moderators struggling with mental health conditions such as post-traumatic stress disorder, depression and anxiety. Others report insomnia, graphic nightmares, hallucinations and suicidal thoughts.
Moderators were not informed of the risks involved in the work when they were hired. The true nature of the job was only revealed during induction training, by which point it was too late for many moderators who had relocated to Kenya to back out, especially those who came from poor backgrounds and volatile regions in countries like Ethiopia. Previous reports have pointed out that many moderators had applied to be call centre agents in Kenya but were in reality being lured in to do content moderation. The training itself was deemed sub-par and inadequate preparation for the challenges moderators would face on the job.
Many of the moderators consequently deemed the mental wellness support provided to them inadequate. The petitioners allege that the counsellors lacked the qualifications needed to give them proper care and to deal with the complex traumas they faced.
The counsellors have been accused of reporting sensitive information discussed during sessions back to management, which then used it to exert more pressure on the employees. Additionally, the non-disclosure agreements moderators signed did not allow them to share details of the content they saw with the counsellors.
The petitioners also allege that Sama unlawfully retained their documents under the guise of securing their immigration status. This included holding their passports for months without cause and failing to inform moderators of their work permit status. Moderators were working in Kenya without work permits or alien cards, a situation that often landed them in trouble with the authorities.
Additionally, the petition highlights uncertainty in the terms of pay and discrepancies between moderators’ wages. Meta, renowned as one of the most lucrative tech giants globally, is known for providing its direct employees with generous six-figure salaries. However, moderators engaged through Sama earn approximately $2.20 (about Sh300) per hour, significantly less than their counterparts in other countries, who earn between $18 (Sh2,200) and $20 (Sh2,500) per hour.

Some moderators reported fluctuations in their monthly pay, with additional payments sent from the Human Resource Manager's personal account, suggesting improper withholding of wages. They also claim that NHIF deductions were being made from their salaries but never remitted to the fund. As a result, when many of them sought medical services, they were told their NHIF contributions were not up to date.
Meta terminated its contract with Sama in January this year and moved the work to Majorel, which also offers content moderation services to companies such as TikTok in Kenya and other countries. The moderators claim that Meta interfered with the recruitment process, instructing Majorel not to engage any moderators who had listed Samasource as their previous workplace, which is why they accuse Majorel and Meta of discrimination. Majorel has also come under fire in recent times for its treatment of content moderators, reportedly paying even less than Sama while maintaining equally poor working conditions at its bases across the world.
The petition also points out that the company treated workers unfairly during the termination process. Moderators were forced to sign agreements that barred them from saying anything negative about the company, and their final payments were withheld unless they agreed to sign.
Sama also allegedly failed to present moderators with written proposals of their terminal dues before serving them with termination letters. Majorel and Meta had not responded to requests for comment by the time of going to press, and Sama declined to comment on the accusations.
The case comes at a time when vital questions are being raised about the outsourcing practices of multinationals, which often look to countries like Kenya for cheap labour but turn a blind eye to the violations their workers face. The outcome of the lawsuit may have significant implications for the future of content moderation and labour rights in the tech industry, both in Kenya and worldwide.