Thursday, September 29, 2022

Facebook amplified hate before Rohingya massacre: Amnesty report

Facebook algorithms amplified hate content against the Rohingya minority prior to widespread violence committed against the group by Myanmar’s military in 2017, according to findings in a new report from Amnesty International.

“While the Myanmar military was committing crimes against humanity against the Rohingya, Meta was profiting from the echo chamber of hatred created by its hate-spiralling algorithms,” said Amnesty International Secretary General Agnès Callamard in response to the report, referring to Facebook’s parent company.

Amnesty’s assessment claims that Meta knew its algorithms were contributing to the spread of hateful content against the Rohingya, but that the social media giant failed to address the problem.

“Actors linked to the Myanmar military and radical Buddhist nationalist groups flooded the platform with anti-Muslim content, posting disinformation claiming there was going to be an impending Muslim takeover, and portraying the Rohingya as ‘invaders,’” Amnesty said in its report.

Posts, including one describing a Muslim human rights defender as a “national traitor,” were shared over 1,000 times. These posts often included comments calling for the murder of Muslim minorities.

“Don’t leave him alive. Remove his whole race. Time is ticking,” one person commented on the image, according to the Amnesty report.

Facebook also hosted hateful posts from Myanmar’s leadership, according to Amnesty International.

“We openly declare that absolutely, our country has no Rohingya race,” wrote the leader of the Myanmar military, Senior General Min Aung Hlaing, in 2017.

A year prior to the genocide, Meta’s internal research found that Facebook’s recommendation systems “grow the problem” of extremism, according to documents viewed by Amnesty.

The Hill has reached out to Meta for comment on the report.

Facebook’s hate speech policies prohibit users from posting content targeting a person or group of people with “violent speech or support in written or visual form,” “subhumanity,” “dehumanizing comparisons” and “generalizations that state inferiority,” among other things.

A group of Rohingya refugees sued Facebook last year, alleging that the company failed to act in the face of dangerous rhetoric that led to the massacre of thousands.

More than 25,000 Rohingya were murdered during the conflict and thousands more were raped in Myanmar’s Rakhine State, driving 700,000 people to flee.

The Biden administration declared the mass violence seen in the Rakhine State to be genocide and a crime against humanity earlier this year.

https://thehill.com/policy/international/3667549-facebook-amplified-hate-before-rohingya-massacre-amnesty-report/