After failing to stop the hate speech and misinformation that fueled a genocide in Myanmar, Facebook now says it plans to take proactive content moderation steps following the military coup in the country.
In an internal message published late Monday and seen by BuzzFeed News, Rafael Frankel, director of public policy for the Asia-Pacific region, told employees that the social network was watching the “volatile situation” in Myanmar “with grave concern,” and outlined a series of measures to counter people using the platform to spread misinformation or threaten violence.
As part of these measures, Facebook has designated Myanmar a “temporary high-risk location” for two weeks, allowing the company to remove content and events in the country that include “any calls to carry arms.” The social network first applied this designation in Washington, DC, after the insurrection at the U.S. Capitol on January 6.
The social network, which had pledged to protect the integrity of Myanmar’s national elections in November, also said it would protect posts criticizing the military and its coup, and would monitor reports of pages and accounts that had been hacked or taken over by the military.
“Myanmar’s November elections were an important moment in the country’s transition to democracy, even if it was not without its challenges, as highlighted by international human rights groups,” Frankel wrote. “This turn of events evokes days that we had hoped were in Myanmar’s past, and reminds us of fundamental rights that should never be taken for granted.”
Facebook’s moves come after General Min Aung Hlaing, Myanmar’s army chief, took control of the country’s government and detained its elected leader, Aung San Suu Kyi, along with other members of her party, the National League for Democracy (NLD). After the election, in which the NLD won the majority of seats in Myanmar’s parliament, military-backed opposition groups called the results fraudulent and demanded a new vote.
On Tuesday, the U.S. State Department officially designated the military’s takeover in Myanmar as a coup, triggering financial sanctions.
“After a review of all the facts, we have assessed that the actions of the Burmese military on February 1, having deposed the duly elected head of government, constituted a military coup,” a U.S. State Department official said in a briefing, using the name the U.S. government uses to refer to the country.
In a statement to BuzzFeed News, Facebook confirmed the measures outlined in Frankel’s post and said it would remove content that praises or supports the coup.
“We are putting the safety of the people of Myanmar first and removing content that violates our rules on violence, hate speech, and harmful misinformation,” Frankel said. “This includes the removal of misinformation that delegitimizes the outcome of the November elections.”
Facebook is operating in a country where it has previously drawn international condemnation for its handling of the displacement and genocide of Rohingya Muslims that began in 2016. In 2018, United Nations investigators found that senior military officials in Myanmar had used Facebook, which had no content moderators in the country, to stoke fear and spread hate speech.
In their report, the UN investigators concluded that “the extent to which Facebook posts and messages have led to real-world discrimination” must be independently and thoroughly examined.
In Monday’s post, Frankel said Facebook was using “a number of product interventions that have been used previously in Myanmar and during the US elections, to ensure that the platform is not used to spread misinformation, incite violence, or coordinate harm.”
The company is working to secure the accounts of activists and journalists “who are at risk or who have been arrested” and to remove content that threatens or calls for violence against them, Frankel wrote. The company will also provide “critical information on what is happening on the ground,” given the restrictions placed on the country’s news outlets.
Facebook’s work is an ongoing effort. The company removed a page for Myanmar’s military television network late Monday, following inquiries from the Wall Street Journal. While the company had banned a page for the Myawaddy television network in 2018 during a crackdown on hundreds of accounts linked to the Myanmar military, a new page had reappeared and garnered 33,000 likes.
Facebook has often been accused of enabling the growth of violent and extremist groups and of being ineffective at stifling disinformation. Most recently, a group of tech watchdogs accused the company of facilitating the riots that led to the attempted insurrection in the United States.
“[Facebook] spent the past year allowing extremist activity and election conspiracy theories linked to President Trump to flourish, radicalizing a broad swath of people and leading many down a dangerous path,” the Tech Transparency Project (TTP) said in a report.
The report detailed specific threats made by pro-Trump and militia groups on Facebook both before and after Joe Biden’s election victory in November.