Facebook parent Meta to "evaluate the feasibility" of a human rights review of its practices in Ethiopia


Facebook owner Meta Platforms said Thursday it would "assess the feasibility" of commissioning an independent human rights assessment of its work in Ethiopia, after its oversight board recommended a review of how Facebook and Instagram were used to spread content that increased the risk of violence there.

The panel, set up by the company to address criticism of its handling of problematic material, makes binding decisions on a small number of challenging content moderation cases and issues non-binding policy recommendations.

Meta has come under scrutiny from lawmakers and regulators over user safety and its handling of abuse on its platforms around the world, particularly after whistleblower Frances Haugen leaked internal documents showing the company's struggles to police content in countries where such speech was most likely to cause harm, including Ethiopia.

Thousands have died and millions have been displaced during years of conflict between the Ethiopian government and rebels from the northern Tigray region.

The social media giant said it had "invested significant resources in Ethiopia to identify and remove potentially harmful content" as part of its response to the board's December recommendations on a case involving content posted in the country.

The oversight body last month upheld Meta's original decision to remove a post alleging the involvement of ethnic Tigrayan civilians in atrocities in Ethiopia's Amhara region. Because Meta had restored the post after the user appealed to the board, the company had to remove the content again.

On Thursday, Meta said that while it had removed the post, it disagreed with the board's reasoning that the content should have been taken down because it was an "unconfirmed rumor" that significantly increased the risk of imminent violence. Such a standard would "impose a journalistic publication standard" on people, it said.

A spokesperson for the oversight body said in a statement: "Meta's existing policies prohibit rumors that contribute to imminent violence and that cannot be debunked in a reasonable timeframe, and the board has made recommendations to ensure that these policies are effectively applied in conflict situations."

"Rumors alleging an ethnic group's complicity in atrocities, such as those identified in this case, have the potential to cause serious harm to people," the spokesperson said.

The board had recommended that Meta commission a human rights due diligence assessment, to be completed within six months, that would include a review of Meta's language capabilities in Ethiopia and of the measures it has taken to prevent the abuse of its services in the country.

However, the company said not all elements of this recommendation "may be feasible in terms of timing, data science, or approach." It said it would continue its existing human rights due diligence and expected to have an update on whether it could act on the board's recommendation within the next few months.

Previous Reuters coverage of Myanmar and elsewhere has examined how Facebook struggled to monitor content across the world in different languages. In 2018, UN human rights investigators said Facebook use played a key role in spreading hate speech that fueled violence in Myanmar.

Meta, which has acknowledged it was too slow to prevent misinformation and hate speech in Myanmar, said it now has native speakers worldwide reviewing content in more than 70 languages, working to curb abuse on its platforms in places where there is a heightened risk of conflict and violence.

The board also recommended that Meta rewrite its corporate value statement on safety to reflect that online speech can pose a risk to people's physical safety and right to life. The company said it would make changes to this value, partially implementing the recommendation.

© Thomson Reuters 2022

