The parent company of Facebook, Meta, was sued on Wednesday in Kenya’s High Court for allegedly encouraging hate speech, inciting ethnic conflict, and failing to moderate content in Eastern and Southern Africa.
The petitioners are the Kenyan rights group Katiba Institute and Ethiopian researchers Fisseha Tekle and Abrham Meareg, whose father, Professor Meareg Amare, was killed during the Tigray War weeks after posts on Facebook incited violence against him. Facebook removed the posts only eight days after he was killed.
The case is also supported by a raft of NGOs across Africa, including Global Witness, Article 19, the Law Society of Kenya, and Amnesty International.
According to the lawsuit, Meta failed to take adequate safety precautions on Facebook, which promoted hateful content and exacerbated tensions around Ethiopia’s deadly Tigray conflict.
The petitioners are asking the court to order Facebook to take steps to remedy the situation, including creating a restitution fund of about 200 billion Kenyan shillings (Ksh) ($1.6 billion) for victims of hate and violence incited on Facebook and a further 50 billion Ksh ($400 million) for similar harm from sponsored posts.
They also want the court to prevent Meta’s algorithm from recommending “inciteful, hateful and dangerous content” and for the company to employ enough moderators to translate local content, ensuring equity between the moderation in Nairobi and that for US users.
Facebook’s inability to moderate content properly
This is not the first time Facebook has been called out for its inability to moderate its content adequately.
Whistleblower Frances Haugen previously accused Facebook of fanning ethnic violence in Ethiopia.
An investigation by Global Witness, in collaboration with the legal nonprofit Foxglove and the independent researcher Dagim Afework Mekonnen, also revealed Facebook’s negligence in identifying hate speech in Ethiopia’s official language.
Furthermore, it demonstrated how ineffective Facebook’s purported safety and security measures are at preventing ads that incite violence, particularly given that Meta has long described Ethiopia as one of its highest priorities for country-specific interventions to keep people safe, given the risk of conflict.
Facebook has also been accused of neglecting hateful and violent posts on its platform, especially non-English content, which has caused recurrent problems in sharply polarised countries like Nigeria.
In May this year, Nigeria’s minister of information and culture, Lai Mohammed, asked Facebook and other social media platforms to stop allowing the Indigenous People of Biafra (IPOB) to use their platforms to incite violence and instigate ethnic hatred in the country.
Could it be that Facebook’s content moderation hub in Kenya lacks sufficient content moderators?
In the lawsuit, Mutemi argued that the prioritisation of hateful speech on the Facebook platform could be a result of its content moderation decisions or its lack of investment in content moderation.
Earlier this year, the law firm Nzili and Sumbi Advocates sued Meta and Sama, Meta’s primary subcontractor for content moderation in Africa, over alleged unsafe and unfair working conditions at Sama’s hub in Kenya, which Meta denied.
Following this latest lawsuit, Amnesty International said Meta must reform its business practices to ensure Facebook’s algorithms do not amplify hatred and fuel ethnic conflict.
Speaking on the issue, Flavia Mwangovya, Amnesty International’s Deputy Regional Director of East Africa, Horn, and Great Lakes Region, said in a statement:
“The spread of dangerous content on Facebook lies at the heart of Meta’s pursuit of profit, as its systems are designed to keep people engaged. This legal action is a significant step in holding Meta to account for its harmful business model.”
Alongside these reforms, the petitioners also demanded that Facebook prohibit and demote hateful content, employ more content moderators for its Kenyan hub, and create a restitution fund for victims of hate and violence incited on Facebook.
Meta’s comment on the issue
A Meta spokesperson reportedly told TechCrunch that the company has stringent policies regarding what is and isn’t permitted on Facebook and Instagram.
These policies prohibit hate speech and inciting violence, and the company has made significant investments in teams and technologies to help combat content related to those issues.
Speaking on the ongoing accusations, the spokesperson also said that Meta’s integrity work in Ethiopia is guided by feedback from local civil society organisations and international institutions.
The spokesperson also claimed that Meta has employed staff with local knowledge and expertise to monitor violating content in some of the country’s most widely spoken languages, including Amharic, Oromo, Somali, and Tigrinya.