How al-Shabaab uses Facebook to spread extremism in East Africa
Al-Shabaab and the Islamic State are using Facebook to spread their hateful ideologies, grow audiences and broadcast their messaging in East Africa, with Kenya as their focus.
This is according to a new report shared with the Nation by the London-based Institute for Strategic Dialogue (ISD).
According to ISD, a highly coordinated online propaganda machine relies on thousands-strong networks of Somali-, Swahili- and Arabic-language Facebook profiles and pages to seed its content on the platform, some of which may have been hacked from unsuspecting users.
The research, carried out between 2020 and 2022, reveals that entities affiliated with the two groups are posing as “independent news outlets” on the platform.
However, these digital assets are instead allegedly being used to share terrorist propaganda under the guise of “objective” news to influence audiences in Kenya and the wider East African region.
Mr Moustafa Ayad, the author of the report, told the Nation: “The al-Shabaab and Islamic State support networks on Facebook have shown significant resilience. They have become very adept at seeding propaganda across different social media platforms.”
“Independent news websites”
The London-based think-tank gathered data about the growth and impact of 30 al-Shabaab- and Islamic State-linked Facebook pages, 8,000 Facebook accounts and five central al-Shabaab news websites.
According to the researchers, this network experienced a boom in 2022, growing roughly 41 per cent in size and adding 12,500 net followers to its audiences between January and April.
It is this booming network that operates as the seeding mechanism for the audiences of “independent news websites” affiliated with Isis and al-Shabaab.
Researchers also mention that content spread by these digital entities is often obtained from websites that use centralised al-Qaeda storage databases as the source for their material.
Al-Shabaab-linked websites are then used to share the content on Facebook.
According to Mr Ayad, “Content from the pages was often a mix of repurposed mainstream media news and straight-up Islamic State, al-Qaeda or al-Shabaab content. They focus on spreading disinformation through a journalistic veneer, while still leading audiences to the larger machinery of its extremist propaganda communication platforms, content caches and local influencers.”
For example, Warbahinta Al-xaq, or Truth Media, was the most active of the Somali-language support pages the researchers identified. The page amassed 4,711 followers and claimed to be an independent Islamic media outlet whose motto is “truth and transparency”.
In actuality, the page rebrands al-Shabaab content for distribution on Facebook. ISD says this positioning could be what helps it get around Facebook’s content moderation policies.
Researchers also found another network of six pages that calls itself al-Hijrateyn media.
These pages took Arabic-language content from the Islamic State and translated it for Swahili- and Somali-speaking audiences.
The network also produces hour-long audio updates, sharing news and fatwas (religious edicts, or legal rulings on points of Islamic law) linked to the Islamic State.
All of this content went unmoderated on Facebook and was continually shared by a dedicated set of supporters.
The pages have shared 112 videos since 2020, garnering close to 84,000 views on the platform. Al-Hijrateyn’s most popular piece of content, for example, was an edited video of the bodies dumped in River Yala, Kenya.
From August last year, locals had been finding bodies of unknown men in the river, and in January 2022, activists called a press conference to expose the grim findings and to accuse police of extrajudicial killings.
Many of the bodies, 30 in all, were later identified as those of missing men, and one woman, from across the country, most of whom had criminal pasts.
The edited al-Hijrateyn video is a shoddily compiled set of videos with a voice-over saying that Kenyan security forces were on a campaign to kill unarmed Muslims, dumping their bodies in the river, in the “darkness of night”.
The video goes on to say that Muslims face injustices all over Kenya and calls upon them to “distrust” their government and actively reject the “lies” it peddles.
The use of “independent news outlets” proved highly effective on Facebook. ISD estimates that these “news networks” have garnered close to 39,488 followers, and have grown by more than 8,600 followers since January 2022.
Facebook’s vulnerabilities
ISD’s findings point to lapses in Facebook’s content moderation and to abuse of its pages and groups features.
The Meta-owned platform has a well-documented struggle to moderate content in languages other than English and in territories outside the US. All of the content ISD identified was published in Somali, Swahili and Arabic.
Each language had its own dynamics in terms of how it propagated across the platform.
Both Swahili- and Arabic-language posts formed a smaller part of ISD’s data in terms of the amount of content produced, but they still spread widely and drew a high number of views.
According to ISD’s data, however, it is the Somali-language content that exposes a particular weakness in Facebook’s moderation systems.
Somali-language pages garnered more views and shares because there was a larger presence of extremist-linked pages than in the other languages the think-tank studied.
ISD’s research uncovered a network of 23 Somali pages with 32,363 followers on Facebook. Additionally, extremist pages published 843 videos, which got 650,186 views and were shared 101,137 times on the platform.
Pages and groups are also central to how al-Shabaab communities operate on Facebook. Groups, in this case, provide al-Shabaab communities with the ability to coordinate, collate and seed content for dedicated propagandists on the platform.
For example, ISD found gruesome al-Shabaab assassination videos being seeded through various Facebook groups dedicated to Sheikh Aboud Rogo Mohammed, who was shot dead by unknown assailants in Mombasa on August 27, 2012.
Rogo was an extremist Islamist preacher and a supporter of al-Shabaab and al-Qaeda.
The report says that flagging and taking down extremist content has its place and can be effective, but the American company would need to be more comprehensive and account for how its pages and groups features are abused.
The extremists’ ultimate goal
The network of pages sought to position al-Shabaab and the Islamic State as the rightful protectors of the Muslim ummah in the Horn of Africa while simultaneously excommunicating “infidel Muslims” in other East African countries.
Mr Ayad said: “These ecosystems seek to sow distrust and present both al-Shabaab and the Islamic State as popular alternatives to the current status quo.
“The content published by these networks seeks to dehumanise societal out-groups to strengthen their movement and radicalise people.”
It is no secret that extremists love the internet. Controlling digital territory is just as integral to these groups’ survival and ideologies as controlling physical territory.
In response, therefore, Big Tech companies have had to course-correct and create policies to address what extremist behaviour looks like on their platforms.
Facebook and its parent company, Meta, weed out extremist content by relying on a combination of artificial intelligence, user reports and human moderators.
The company claims to remove 99 per cent of content in violation of its “Dangerous Individuals and Entities policy”. It’s a sweeping set of restrictions on what Facebook’s nearly three billion users can say about an enormous and ever-growing roster of entities such as criminal and extremist organisations.
The policy states: “We do not allow organisations or individuals that proclaim a violent mission or are engaged in violence to have a presence on Facebook.”
However, ISD says that all the content it identified went unmoderated, suggesting that the platform’s monitors were unaware of the outlets and that the content was likely not in the company’s database of terrorist content.
At a minimum, this demonstrates that Silicon Valley’s much-touted moderation policies (now being applied to hate speech and dangerous rhetoric in this election season) are not applying rigorous enough standards when classifying extremist propaganda. Instead, as per ISD’s evidence, the platforms are letting it grow.
A spot check by the Nation on another social media platform, TikTok, found several violent jihad sermons instructing audiences to pick up arms.
ISD also found that parts of the network it studied extended to Twitter and Telegram.
“Any kind of terrorist activity on a platform is a problem and points to larger issues within its systems. If terrorist activity is able to get around moderation systems that have been put in place, whether manual or automated, think about how much hate content can exist on the platform in those languages,” said Mr Ayad.
He added: “Terrorist propaganda online represents only one aspect of the overall set of online harms that are flourishing online and targeting East African communities.”
In response to the findings, which the Nation also put to the company, Facebook removed a number of the accounts that ISD had flagged.
It did not, however, respond to questions about why the content was allowed to stay unmoderated on the platform for as long as it did.
Through a spokesperson, the company told the Nation that it was actively investigating the networks that had been uncovered.
“We've already removed a number of these Pages and Profiles and will continue to investigate once we have access to the full findings. We don't allow terrorist groups to use Facebook, and we remove content praising or supporting these organisations when we become aware of it.”
The spokesperson also told us that, “We have specialized teams — which include native Arabic, Somali and Swahili speakers — dedicated to this effort. We know these adversarial groups are continuously evolving their tactics to try and avoid detection by platforms, researchers and law enforcement. That’s why we continue to invest in people, technology, and partnerships to help our systems stay ahead.”
Ms Emma Wabuke, a PhD candidate researching radicalisation at Cambridge University, said social media companies must recognise the delicate role they have to play in this ecosystem.
“The platforms have to handle this carefully, given the freedom of speech and privacy issues concerned. But ultimately, they have an opportunity more than most to moderate the content on their platforms in good time.” She cautioned, though, that “their role, however, should not extend to informing law enforcement, unless there are specific orders that have been enforced by the courts”.
ISD also called for broader community based interventions. “We must educate young people so that they are armed with the knowledge to make informed decisions about extremism and to recognise propaganda when they see it. Education is more effective than intervention.”