Facebook’s algorithms ‘supercharged’ hate speech in Ethiopia’s Tigray conflict
Facebook has come under scrutiny from rights group Amnesty International for allegedly exacerbating violence during the harrowing two-year conflict in Ethiopia’s northern Tigray region.
In a damning report, Amnesty contends that the social media giant’s algorithms significantly amplified the dissemination of harmful rhetoric, asserting that the company inadequately addressed the spread of such content.
Amnesty’s allegations represent another challenge for Facebook’s parent company, Meta, which has previously denied similar claims. Meta has insisted on its extensive investments in content moderation and the removal of hateful materials from its platform. Facebook remains a crucial source of information for many Ethiopians.
However, as the conflict between the federal government and its allied forces on one side and Tigrayan forces on the other raged on, Facebook’s role in allegedly propagating hate speech came under increasing scrutiny.
The African Union’s peace envoy, former Nigerian President Olusegun Obasanjo, estimated that approximately 600,000 people perished during the conflict, with causes of death attributed to combat, starvation, and inadequate healthcare.
The conflict ended in a ceasefire almost a year ago, following a peace agreement between the federal government and the Tigray People’s Liberation Front (TPLF), the party that dominates politics in the Tigray region. Nevertheless, Ethiopia continues to grapple with other conflicts, including those in the expansive Oromia and Amhara regions.
The Amnesty International report highlights Meta’s “data-hungry business model,” which, according to the report, still poses “significant dangers” to human rights in areas affected by conflict. This isn’t the first time Facebook has been accused of disseminating incitement messages against ethnic Tigrayans. Currently, Meta is facing a lawsuit alleging its failure to address harmful content. Two petitioners are seeking more than $1.5 billion (£1.2 billion) in damages.
Amnesty’s investigation involved a review of internal documents from Meta, including communications the company received from 2019 to 2022. The rights group asserts that despite repeated warnings and a history of contributing to violence in other nations, Meta failed to implement necessary measures.
According to Amnesty, “Facebook’s algorithmic systems supercharged the spread of harmful rhetoric targeting the Tigrayan community, while the platform’s content moderation systems failed to detect and respond appropriately to such content.”
Meta responded by informing the BBC that it was actively enhancing its capabilities to combat “violating content” published in widely spoken Ethiopian languages.
Ethiopia, Africa’s second most populous country with about 113.6 million people, recognizes Amharic as its official working language, although other languages, including Afaan Oromoo, Tigrinya, Somali, and Afar, are also widely spoken.
Source: Africanews