24 August 2023
“Our investigations have revealed that Facebook’s dangerous algorithms, designed to maximize engagement and corporate profits at all costs, fueled hatred and contributed to mass violence and the forced displacement of more than half of Myanmar’s Rohingya population into neighboring Bangladesh,” said Pat de Brún. “It is high time that Meta took responsibility by making amends to the Rohingya and changing its business model to prevent anything like this from happening again.”
August 25 is also a significant day for holding Big Tech accountable for its human rights impacts: on this day, key provisions of the Digital Services Act covering the largest online platforms in the European Union come into force. The Digital Services Act is a landmark piece of legislation for strengthening rights in the digital age, and its effects may be felt far beyond the EU.
A personal appeal to Meta and Mark Zuckerberg
Today, Amnesty International and Al Jazeera publish a harrowing account by Rohingya refugee Maung Sawyeddollah, who was forced to flee his village in Myanmar as a teenager. He escaped through torched villages and fields strewn with corpses, and he now lives in the world’s largest refugee settlement, Cox’s Bazar in Bangladesh, along with around a million of his compatriots.
As a child, before the hatred took root with the help of Facebook, he and his mostly Muslim Rohingya friends played happily with the mostly Buddhist Rakhine children from the neighboring village – but that changed with the military’s offensive.
“I would like to meet Mark Zuckerberg and his team. Perhaps they would like to come and spend a night or two in the refugee camp,” Sawyeddollah writes.
I would say to them, ‘Don’t you see the part you play in our suffering? We have asked you again and again to help us, so that things might be better for us… But you ignore our requests.
Maung Sawyeddollah, Rohingya
“Tell me, do you feel anything for us? Is it just the data, is it just the dollars?”
Background
Last year, Amnesty International published a report detailing Meta’s role in the Myanmar military’s 2017 atrocities against the Rohingya. The report revealed that internal Facebook studies dating back to 2012 indicated that Meta knew its algorithms could cause serious real-world harm. In 2016, Meta’s own researchers explicitly acknowledged that “our recommender systems magnify the problem of extremism.”
Beginning in August 2017, Myanmar security forces carried out a brutal campaign of ethnic cleansing against Rohingya Muslims in Myanmar’s Rakhine State. They unlawfully killed thousands of Rohingya, including young children, raped and committed other forms of sexualized violence against Rohingya women and girls, tortured Rohingya men and boys in detention sites, and burned hundreds of villages. The violence drove more than 700,000 Rohingya – more than half of the Rohingya population living in northern Rakhine State when the crisis began – into neighboring Bangladesh.
Meta contributed to serious human rights abuses against the Rohingya in connection with the 2017 atrocities in Rakhine State and is therefore obliged under international human rights standards to provide the community with an effective remedy. This includes making the necessary changes to its business model to ensure that nothing like this ever happens again. All companies have a responsibility to respect human rights wherever they operate, as set out in various international business and human rights standards, including the UN Guiding Principles on Business and Human Rights and the OECD Guidelines for Multinational Enterprises.
Further information
Read Maung Sawyeddollah’s story, along with an analysis by Pat de Brún, Head of Big Tech Accountability at Amnesty International, on this page.