Hate speech in Myanmar continues to thrive on Facebook

JAKARTA, Indonesia — Years after coming under scrutiny for contributing to ethnic and religious violence in Myanmar, Facebook still has problems detecting and moderating hate speech and misinformation on its platform in the Southeast Asian nation, internal documents viewed by The Associated Press show.

Three years ago, the company commissioned a report that found Facebook was used to “foment division and incite offline violence” in the country. The company promised to improve and created policies and tools to combat hate speech.

But the problems have persisted, and have even been exploited by hostile actors, since the Feb. 1 military takeover this year that resulted in gruesome human rights abuses across the country.

Scrolling through Facebook today, it’s not hard to find posts threatening murder and rape in Myanmar.

One 2 1/2-minute video posted Oct. 24, of a military supporter calling for violence against opposition groups, has garnered over 56,000 views.

“So starting from now, we are the god of death for all (of them),” the man says in Burmese while looking into the camera. “Come tomorrow and let’s see if you are real men or gays.”

Another account displays the address and photo of a soldier defector. Another post from Oct. 29 includes a photo of soldiers leading bound and blindfolded men down a dirt path. The Burmese caption reads, “Don’t catch them alive.”

Despite the continuing problems, Facebook treated its operations in Myanmar as a model to export around the world. The documents reviewed by the AP show that Myanmar also served as a laboratory for new content moderation technology, with the company trialing ways to automate the detection of hate speech and misinformation with varying degrees of success.

Facebook’s internal discussions on Myanmar were revealed in disclosures made to the Securities and Exchange Commission and provided to Congress in redacted form by former Facebook employee-turned-whistleblower Frances Haugen’s legal counsel. The redacted versions received by Congress were obtained by a consortium of news organizations, including The Associated Press.

Myanmar is a newer and more volatile environment for social media than most countries. It connected to the internet only after decades of military censorship. Soon afterward, Facebook partnered with local telecom companies to let customers use the platform without having to pay for the data, and its use exploded. For many people in Myanmar, Facebook became the internet itself.

Htaike Htaike Aung, a Myanmar internet policy advocate, said it also became “a hotbed for extremism” around 2013, coinciding with religious riots across Myanmar between Buddhists and Muslims. It’s unclear how much content moderation, if any, was happening at the time, whether by people or by automated systems.

Htaike Htaike Aung said she met with Facebook that year and laid out issues, including how local organizations were seeing exponential amounts of hate speech on the platform and how its preventive mechanisms, such as reporting posts, didn’t work in the Myanmar context.

One example she cited was a photo of a pile of bamboo sticks that was posted with a caption reading, “Let us be prepared because there’s going to be a riot that is going to happen within the Muslim community.”

Htaike Htaike Aung said the photo was reported to Facebook, but the company didn’t take it down because it didn’t violate any of the company’s community standards.

“Which is ridiculous because it was actually calling for violence. But Facebook didn’t see it that way,” she said.

Years later, the absence of moderation caught the attention of the international community. In March 2018, United Nations human rights experts investigating attacks against Myanmar’s Muslim Rohingya minority said Facebook had played a role in spreading hate speech.

When asked about Myanmar a month later during a U.S. Senate hearing, CEO Mark Zuckerberg replied that Facebook planned to hire “dozens” of Burmese speakers to moderate content, would work with civil society groups to identify hate figures and develop new technologies to combat hate speech.

“Hate speech is very language specific. It’s hard to do it without people who speak the local language and we need to ramp up our effort there dramatically,” Zuckerberg said.

Internal Facebook documents show that although the company made efforts to counter hate speech, those efforts never reached their full potential, and individuals within the company repeatedly raised the alarm. In one document from May 2020, an employee said a hate speech text classifier that was available wasn’t being used or maintained. Another document from a month later said there were “significant gaps” in misinformation detection in Myanmar.

“Facebook took symbolic actions I think were designed to mollify policymakers that something was being done and didn’t need to look much deeper,” said Ronan Lee, a visiting scholar at Queen Mary University of London’s International State Crime Initiative.

In an emailed statement to the AP, Rafael Frankel, Facebook’s director of policy for APAC emerging countries, said the platform “has built a dedicated team of over 100 Burmese speakers.” He declined to state exactly how many people are employed. Online marketing company NapoleonCat estimates there are about 28.7 million Facebook users in Myanmar.

Haugen, the whistleblower, testified to the European Parliament on Nov. 8, criticizing Facebook for relying on automated systems to find harmful content rather than on third-party fact-checking.

“If you focus on these automatic systems, they will not work for the most ethnically diverse places in the world, with linguistically diverse places in the world, which are often the most fragile,” she said while referring to Myanmar.

After Zuckerberg’s 2018 congressional testimony, Facebook developed digital tools to combat hate speech and misinformation and also created a new internal framework to manage crises like Myanmar around the world.

Facebook crafted a list of “at-risk countries” with ranked tiers for a “critical countries team” to focus its energy on, and also rated languages needing more content moderation. Myanmar was listed as a “Tier 1” at-risk country, with Burmese deemed a “priority language” alongside Ethiopian languages, Bengali, Arabic and Urdu.

Facebook engineers taught the company’s automated systems Burmese slang words for “Muslims” and “Rohingya.” They also trained systems to detect “coordinated inauthentic behavior,” such as a single person posting from multiple accounts or different accounts coordinating to post the same content.
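The documents do not spell out how that detection works. As a rough illustration only, the sketch below flags clusters of accounts posting identical text within a short window, or multiple accounts posting from a single device; every function name, field and threshold here is hypothetical, not Facebook’s actual pipeline.

```python
from collections import defaultdict, namedtuple
from datetime import datetime, timedelta

# Hypothetical post record; real systems would use far richer signals.
Post = namedtuple("Post", "text account_id device_id timestamp")

def find_coordinated_clusters(posts, window=timedelta(minutes=10), min_accounts=5):
    """Flag identical text posted by many accounts in a short window,
    or by multiple accounts sharing one device."""
    by_text = defaultdict(list)
    for post in posts:
        by_text[post.text].append(post)

    flagged = []
    for text, group in by_text.items():
        group.sort(key=lambda p: p.timestamp)
        start = 0
        for end in range(len(group)):
            # Shrink the window from the left until it spans at most `window`.
            while group[end].timestamp - group[start].timestamp > window:
                start += 1
            accounts = {p.account_id for p in group[start:end + 1]}
            devices = {p.device_id for p in group[start:end + 1]}
            # Signals: many accounts posting the same text in lockstep,
            # or several accounts posting from one device.
            if len(accounts) >= min_accounts or (len(accounts) > 1 and len(devices) == 1):
                flagged.append((text, sorted(accounts)))
                break
    return flagged

# Demo: five accounts posting the same slogan within 80 seconds get flagged.
now = datetime(2021, 11, 1, 12, 0)
demo = [Post("same slogan", f"acct{i}", f"dev{i}", now + timedelta(seconds=20 * i))
        for i in range(5)]
print(find_coordinated_clusters(demo))
```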

The company also tried “repeat offender demotion,” which lessened the impact of posts from users who frequently violated its guidelines. In a test in two of the world’s most volatile countries, demotion worked well in Ethiopia but poorly in Myanmar, a difference that flummoxed engineers, according to a 2020 report included in the documents.

“We aren’t sure why … but this information provides a starting point for further analysis and user research,” the report said. Facebook did not comment on whether the Myanmar problem was fixed in the year since its discovery, or on the results of the tools.
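The documents describe repeat offender demotion only in outline. One way to picture it is as a ranking multiplier that shrinks as an author accumulates recent violations; the sketch below is illustrative, and the penalty factor and function name are invented, not Facebook’s.

```python
def demoted_score(base_score: float, recent_strikes: int, penalty: float = 0.5) -> float:
    """Scale a post's ranking score down for each recent guideline
    violation by its author; with penalty=0.5, every strike halves reach."""
    return base_score * (penalty ** recent_strikes)

# A post scoring 100 from an author with three recent strikes surfaces
# with an effective score of 12.5; a post from a clean author keeps 100.
print(demoted_score(100.0, 3))  # 12.5
print(demoted_score(100.0, 0))  # 100.0
```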

The company also deployed a new tool to reduce the virality of content, called “reshare depth promotion,” which boosts content shared by users’ direct contacts over posts arriving through long chains of reshares, according to an internal 2020 report. The method is “content-agnostic” and cut the prevalence of viral inflammatory content by 25% and photo misinformation by 48.5%, the report said.
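The report gives the outcome but not the mechanics. A content-agnostic, depth-based ranking tweak can be pictured as a score that decays with every hop in the reshare chain, so a post from a direct contact outranks the same post arriving through a long chain; the decay factor below is invented for illustration.

```python
def depth_adjusted_score(base_score: float, reshare_depth: int, decay: float = 0.7) -> float:
    """reshare_depth 0 = original post, 1 = reshared by a direct contact,
    and so on down the chain; deeper reshares rank lower."""
    return base_score * (decay ** reshare_depth)

# Content from a direct contact keeps most of its score, while the same
# post at the end of a five-hop reshare chain is heavily demoted. Only
# chain length matters, never the text itself ("content-agnostic").
print(depth_adjusted_score(100.0, 1))  # 70.0
print(depth_adjusted_score(100.0, 5))  # ~16.8
```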

Slur detection and demotion were judged effective enough that staffers shared the experience in Myanmar as part of a “playbook” for acting in other at-risk countries such as Ethiopia, Syria, Yemen, Pakistan, India, Russia, the Philippines and Egypt.

While these new methods forged in Myanmar’s civil crises were deployed around the world, documents show that by June 2020 Facebook knew that flaws persisted in its Myanmar safety work.

“We found significant gaps in our coverage (especially in Myanmar and Ethiopia), showcasing that our current signals may be inadequate,” said an internal audit of the company’s “integrity coverage.” Myanmar was color-coded red with less than 55% coverage: worse than Syria but better than Ethiopia.

Haugen criticized the company’s internal policy of acting “only once a crisis has begun.”

Facebook “slows the platform down instead of watching as the temperature gets hotter, and making the platform safer as that happens,” she said during testimony to Britain’s Parliament on Oct. 25.

Frankel, the Facebook spokesperson, said the company takes a proactive approach.

“Facebook’s approach in Myanmar today is fundamentally different from what it was in 2017, and allegations that we have not invested in safety and security in the country are wrong,” Frankel said.

Yet a September 2021 report by the Myanmar Social Media Insights Project found that posts on Facebook included coordinated targeting of activists, ethnic minorities and journalists, a tactic with roots in the military’s history. The report also found that the military uses public pages to spread propaganda.

An October report from Myanmar Witness, a U.K.-based group that archives conflict-related social media posts, shows that both pro-military and opposition groups used the encrypted messaging app Telegram to organize two types of propaganda campaigns carried out through Facebook and Twitter.

Myanmar is a “highly contested information environment,” where users working in concert overload Facebook’s reporting system to take down others’ posts, and also spread coordinated misinformation and hate speech, the report said.

In one example, one of the networks took video filmed in Mexico by the Sinaloa cartel showing dead bodies and falsely labeled the footage as proof that the opposition had murdered Myanmar soldiers.

“There’s a difficulty in catching it for some of these platforms that are so big and perhaps the teams to look for it are so small that it’s very hard to catch water when it’s coming out of a fire hydrant,” said a researcher with the group.

The organization also traced the digital footprint of one soldier to the burning of 160 homes in the village of Thantlang in late October. The man, dressed in body armor, stood atop the burning houses holding a sign blaming opposing forces.

Facebook has “conducted human rights due diligence to understand and address the risks in Myanmar,” banned the military from the platform and used technology to reduce the amount of violating content, spokesperson Frankel said.

But digital rights advocates and scholars in Myanmar say Facebook should be more transparent about how it handles content moderation, removal and demotion, and should acknowledge its responsibility to the Myanmar people.

“We need to start examining damage that has been done to our communities by platforms like Facebook. They portray that they are a virtual platform, and thus can have lower regulation,” said Lee, the visiting scholar. “The fact is that there are real-world consequences.”
