Blow to Meta as court allows petition by Ethiopians to proceed

Kenyan court ruling on Meta sets precedent for multinational accountability in Africa.
Facebook owner Meta has suffered a blow after the Kenyan High Court ruled that it has jurisdiction to determine a case brought by two Ethiopians accusing the social media platform of promoting content that led to ethnic violence and killings in Ethiopia.
High Court judge Lawrence Mugambi on Thursday dismissed objections by Meta Platforms that the Kenyan courts do not have the powers to determine matters which happened in another country.
Meta Platforms had also submitted that anything done by Facebook was subject to the terms and conditions of the company, and any aggrieved person should file a complaint in Delaware.
But Justice Mugambi dismissed the objections, saying the case concerned the impact on human rights, and that neither the company’s terms nor where it was based mattered.
In yet another win for Mr Abrham Meareg and Fisseha Tekle, the court certified their petition as one raising substantial questions of law that should be handled by a bench of more than two judges.
Justice Mugambi directed that the file be taken to Chief Justice Martha Koome for the appointment of a bench to determine the case.
Mr Abrham said his father - Prof Meareg Amare Abrha - was killed in November 2021 amid the war between the Ethiopian government and the Tigray People’s Liberation Front (TPLF).
He says a post on Facebook profiled his father and accused him of associating with the TPLF.
Mr Tekle, a legal adviser at Amnesty International, said human rights groups could not protect people’s rights when a platform that users rely on for news and connection fuels hate and disinformation.
They accused the social media giant of treating Facebook users in Kenya and other African countries differently from users elsewhere, especially in times of conflict.
“Facebook users having noted that inciteful, hateful and dangerous content gets them more visibility are incentivised to post more of the same. The darker the content, the higher the likelihood it will be prioritized,” the petition stated.
While certifying the case as raising substantial questions of law, Justice Mugambi agreed with lawyer Mercy Mutemi’s submissions that the protection of human rights in the age of Artificial Intelligence was novel.
The petitioners submitted that the case called for the court to apply itself to the violation of human rights perpetrated by Meta’s algorithm on Facebook and the duty of social media platforms to moderate content on their platforms.
The two Ethiopians also want the court to determine the extent of the duty of social media platforms to moderate their content in accordance with Article 33(2) of the Constitution, as well as questions of algorithmic bias and discriminatory AI.
Mr Abrham and Mr Tekle sued Meta Platforms over claims that it fuelled violence in East Africa by failing to moderate inciteful messages posted on Facebook.
They argued that Facebook’s software design choices – the way it organises its social media algorithm – have repeatedly caused hatred and incitement to violence to go viral, fuelling real-world attacks in Ethiopia.
The petitioners have demanded historic safety changes to Facebook – including adjustments to Facebook’s algorithm to protect users.
It is their argument that Facebook should hire sufficient numbers of safety staff and create a restitution fund for victims of viral hate in the Ethiopian war, which has claimed some 500,000 lives.
Meta wanted the case struck out, arguing that it is a foreign firm that does not trade in Kenya and that Kenyan courts do not have jurisdiction over it.
skiplagat@ke.nationmedia.com