In some countries, fake news on Facebook is a matter of life and death

Misinformation distributed by social platforms like Facebook has become a major issue in the United States, thanks to all the attention focused on Russian troll armies trying to influence the 2016 presidential election. But in some countries, “fake news” doesn’t just interfere with people’s views about who to vote for—it leads to people being arrested, jailed, and in some cases even killed. And Facebook doesn’t seem to be doing a lot about it.

Southeast Asia is one place where the social network is fomenting ethnic and political tensions in dangerous ways, according to a number of journalists who cover the region. This effect can be seen in countries like Thailand and Cambodia, but it has become increasingly severe in Myanmar, where the Rohingya people are being persecuted, driven from their homes and in some cases raped and killed.

“As complicated as Facebook’s impacts on the politics of the United States are, the impact in Asia may be even more tricky, the consequences more severe and the ecosystem less examined, both by Facebook and most people in the US,” says Christina Larson, who has written about the region for a number of outlets including Bloomberg and The Atlantic.

As the situation has escalated over the past six months, observers in Myanmar have reported waves of Facebook-based misinformation and propaganda aimed at fomenting anti-Rohingya fervor, including fabricated reports that families were setting fire to their own homes in an attempt to generate sympathy. More than 600,000 people have been forced from their homes so far, and an untold number have died in the process.

One of the main sources of anti-Rohingya propaganda is Ma Ba Tha, a group of radical Buddhist monks who have been preaching that the Rohingya are less than human, and that they are trying to overrun the country and convert everyone to Islam. The leader of the group, Ashin Wirathu, has been banned from preaching, but he has been able to spread his message far and wide thanks to an orchestrated Facebook campaign.

Larson and others say the problem is compounded by the fact that a majority of Myanmar residents rely on Facebook for their news. And yet, the level of media literacy is shockingly low, primarily because smartphones and social media are still relatively new.

Until 2014, the digital SIM cards required to use smartphones were prohibitively expensive, because they were only available from the country’s government-controlled telecom carrier. After the industry was opened up, cheap smartphones and $1 SIM cards flooded the market, available from every street vendor—and almost all had Facebook installed by default.

“Facebook has basically become the way that people do everything,” says Paul Mozur, a New York Times reporter who covers Myanmar. “It replaces newspapers, it displaces outreach campaigns by NGOs and other agencies trying to reach people especially in remote areas, it replaces just about everything.”

Wirathu, the leader of the anti-Rohingya movement, used to print out paper pamphlets and flyers to spread his incendiary messages, Mozur says, but now he just posts fake images on Facebook and gets 100 times the reach.

Many of those who have been thrust into this new world of smartphones and social networks in Myanmar “just aren’t used to the level of misinformation or disinformation that’s happening on Facebook,” says Mozur. “Suddenly they’re subject to the full force of an information war coming out of Yangon, orchestrated by much more sophisticated sources, and it’s easy for them to become pawns in that war.”

And what is Facebook doing to help? Not much, some observers say. The social network has relationships with non-government agencies, but only a couple of actual staffers on the ground. “It’s become a bit like an absentee landlord in Southeast Asia,” according to Phil Robertson, deputy director of Human Rights Watch in Asia.

A Facebook spokesman told CJR the company works hard to keep hate speech and content that celebrates or incites violence off the platform, that it is working with non-profit groups in Myanmar to raise awareness of its community standards policies, and that it has local-language pages that offer tips on safety and security.

But still, the problem continues, and it is arguably far more serious than any safety-tip guidelines could address. At one point, Mozur says, messages were spreading on Facebook Messenger claiming Muslims were planning an attack on 9/11, while at the same time a separate chain letter said Buddhists were planning to attack on the same day.

“I don’t know who was behind those messages, it could have been like four people, but it literally brought the country to a standstill,” he says. “A lot of times these rifts are there already, and so in a certain sense I guess Facebook is a mirror, holding itself up to the differences in society. But social media can also become a real catalyst for the violence.”

Christina Larson says there’s a debate to be had about what hate speech means in a particular context and how to define it, “but what I would consider dangerous speech is advocating that the Rohingya need to leave Myanmar, and sharing doctored images of them supposedly burning their own houses to create a media spectacle.”

In a way, she says, these images—which were liked and shared tens of thousands of times—”gave cover for military action and human rights violations, including violence and rape. You can’t say social media kills people any more than you can say guns kill people, but certainly social media shaped public opinion in a way that seems to have played a part in the escalation of violence against the Rohingya population.”

Facebook’s approach to countries like Myanmar and others in the region often strikes those on the ground as not just out of touch but actively cavalier. In its recent split-feed tests, for example, users in countries like Cambodia and Slovakia had news articles moved to a completely separate feed, which local non-profit groups and media outlets say significantly impacted their ability to reach people with crucial information.

It’s one thing to tread carefully around issues like free speech, Larson says, “but if you’re going to run A/B testing, where you change an algorithm and see what you think consumers like best, for God’s sake, stick to stable democracies. Don’t pick a place where there’s an authoritarian regime that is busy locking up opposition leaders, and Facebook is a primary way that activists communicate about their government.”

In many ways, Myanmar is an example of the future Mark Zuckerberg seems to want: A country in which most people are connected through the social network and get virtually all of their news from it. And yet, the outcome of that isn’t a utopian vision of a better world, it’s exactly the opposite—a world where ethnic and cultural tensions are inflamed and weaponized. And Facebook’s response looks almost completely inadequate to the dangers it has helped unleash.



“Facebook has become a bit like an absentee landlord in Southeast Asia,” says Phil Robertson, deputy director of Human Rights Watch’s Asia division. “When Buddhist extremists start instigating action against Muslims [in Myanmar], looking around for the local Facebook representative is hopeless — there isn’t one. Instead, it’s sort of, complain into the void and hope some relief arrives before it’s too late.”

Why is Facebook so useful to the junta? First, its insistence on a real-name policy makes for easy tracking of dissidents. Even in cases where people successfully mask their names, their web of social connections makes them potentially easy to identify. (In the US, sex workers have already found themselves inadvertently exposed by Facebook’s data aggregation and friend suggestions.) Hard-to-navigate privacy settings can mean that what people mistakenly think of as private speech, limited to a small group of friends, is often anything but. “If you make a certain kind of comment online, you can quickly be sent to prison in Thailand,” says iLaw researcher Anon Chawalawan.

But the BBC has reported that one unintended impact was dramatically shrinking the number of people who would see published items. “Out of all the countries in the world, why Cambodia? This couldn’t have come at a worse time,” a Cambodian blogger told the BBC, explaining that the number of people who saw her public video had dropped by more than 80 percent. “Suddenly I realized, wow, they actually hold so much power.… [Facebook] can crush us just like that if they want to.”

Facebook Can’t Cope With the World It’s Created


The crackdown has already claimed two NGOs, more than a dozen radio stations, and the local offices of two independent media outlets, Radio Free Asia and The Cambodia Daily. Hun Sen’s main opposition, the Cambodian National Rescue Party (CNRP), could be dissolved entirely at a Supreme Court hearing on 16 November.

“Out of all the countries in the world, why Cambodia?” Ms Harry asks of Facebook’s experiment. “This couldn’t have come at a worse time.”

Facebook surpassed TV as Cambodians’ most popular source of news last year, according to a survey from the Asia Foundation, with roughly half of respondents saying they used the social media network.

The platform helped power the CNRP’s gains against the governing Cambodian People’s Party (CPP) in the 2013 national elections and has been one of the only places for dissent in a country ranked 132nd out of 180 countries in Reporters Without Borders’ 2017 World Press Freedom Index.

Hun Sen’s longtime rival, Sam Rainsy, the exiled former president of the CNRP who runs a popular page of his own, said his traffic had dipped 20% since the start of the Facebook test. Unlike the prime minister, whom he accused of buying Facebook supporters from foreign “click farms”, Mr Rainsy said he could not pay to sponsor his posts to put them in front of more users in their usual News Feeds.

“Facebook’s latest initiative would possibly give an even stronger competitive edge to authoritarian and corrupt politicians,” he said.

Leang Phannara, web editor for Post Khmer, the Khmer-language version of independent English daily the Phnom Penh Post, said Khmer Facebook posts were reaching 45% fewer people, while web traffic was down 35%. The only way to recapture that audience was to pay to sponsor posts, he said.

“It’s a pay-to-play scenario,” Mr Phannara said.


Phil Robertson, deputy director of Asia Division of Human Rights Watch, said the Rohingya were forced to get the word out about their cause on Facebook and Twitter because the few media outlets in Myanmar that exercise independence in reporting on the situation in Rakhine face threats of boycotts and retaliation.
Not many media outlets in the country, he said, were willing to take the risk of alienating their readers, advertisers, and in some cases their own staff by calling out the Burmese government for the campaign of ethnic cleansing it is waging.

“Of course, the problem with social media is that their policing mechanisms can be used for harassment by those willing to mount a concerted campaign of filing complaints against specific Facebook pages or Twitter feeds,” Robertson added. “We’ve seen an explosion of Rakhine and Burman nationalists using Twitter, retweeting hateful messages and gory images, so it would not surprise me at all if some of those nationalists, using bot accounts and pages apparently set up en masse, are now going on the attack against Rohingya on Facebook.”

(Many Rohingya refugees and activists said their pages had been blocked or banned from Facebook because they were posting photos and videos of anti-Rohingya violence. Facebook said it was leaving some such posts up for news purposes but was removing those it said were promoting or celebrating violence.)

“I believe [Facebook] is trying to suppress freedom [of] expression and dissent by colluding with the genocidaires in Myanmar regime,” the activist and journalist Mohammad Anwar told the Guardian. Anwar, whose allegations of censorship were first reported by the Daily Beast, shared screenshots of numerous posts that had been removed by Facebook for violating community standards. Several of the posts comprised only text, he said, and described military operations against Rohingya villages in Rakhine.

The Kuala Lumpur-based journalist, who works for the site, said that his reports come from a network of 45 correspondents and citizen journalists in Rakhine.


Laura Haigh, Amnesty International’s Burma researcher, told The Daily Beast there appears to be a targeted campaign in Burma to report Rohingya accounts to Facebook and get them shut down.

Mohammad Anwar, a Kuala Lumpur-based Rohingya activist and journalist with the site, told The Daily Beast that Facebook has repeatedly deleted his posts about violence in Rakhine State, and has threatened to disable his account.


“In a lot of these countries, Facebook is the de facto public square,” said Cynthia Wong, a senior internet researcher for Human Rights Watch. “Because of that, it raises really strong questions about Facebook needing to take on more responsibility for the harms their platform has contributed to.”


Fake news demonizing Muslims, particularly reports spreading fears of terrorism or Islamic fundamentalism, has sometimes led to disastrous consequences. Those reports have spread like wildfire on Facebook, where Buddhist nationalist groups like Ma Ba Tha have gained prominence by building legions of followers.

That’s what happened in the region of Bago, north of Yangon, on June 23, when a Buddhist mob reportedly destroyed homes and forced dozens of villagers to flee after rumors spread online that a new building in the village was going to be a Muslim school.

MIDO, which regularly monitors Burmese hate speech on Facebook as part of a research project with Oxford University, found that only 10% of the postings it reported according to its own definition of hate speech were eventually taken down by Facebook. The reporting mechanism is clunky, and the process is opaque, said MIDO’s Htaike Htaike Aung.


Much of India’s false news is spread through WhatsApp, a popular messaging app. One message that made the rounds in November, just after the government announced an overhaul of the country’s cash, claimed that a newly released 2,000-rupee bank note would contain a GPS-tracking nano-chip that could locate bank notes hidden as far as 390 feet underground. Another rumor, about salt shortages last November, prompted a rush on salt in four Indian states. In southern India, a rumor about a measles and rubella vaccine thwarted a government immunization drive.

Many false stories have led to violence. In May, rumors about child abductors in a village triggered several lynchings and the deaths of seven people. In August, rumors about an occult gang chopping off women’s braids in northern India spread panic, and a low-caste woman was killed.

Some stories exacerbate India’s rising religious and caste tensions. This week, for instance, images purportedly showing attacks against Hindus by “Rohingya Islamic terrorists” in Burma circulated on social media in India, stoking hatred in Hindu-majority India against Muslim Rohingya.

“There was one video with two people being beheaded, and the text was saying these were Indian soldiers being killed in Pakistan. When I found the original video, it was actually taken from footage of a gang war in Brazil,” said Pankaj Jain, founder of a website that fact-checks rumors circulating on social media in India. “They’ll tell you this is fresh, these are images the media is not showing you, if you’re a true Indian patriot, you will forward this message.”


New York Times technology reporter Paul Mozur says in Myanmar, Facebook is everywhere.

“The entire internet is Facebook and Facebook is the internet. Most people don’t necessarily know how to operate or get on and navigate regular websites. They live, eat, sleep and breathe Facebook.” Facebook users in Myanmar grew from about 2 million in 2014 to more than 30 million today.

Which is why the misinformation spread on Facebook can be so dangerous.

Mozur says Facebook has become a breeding ground for pernicious posts about the Rohingya. “In particular, the ones that seem most problematic are government channels that have put a lot of propaganda out there, saying everything from the Rohingya are burning their own villages, to showing bodies of soldiers who may be from other conflicts but saying this is the result of a Rohingya attack, to more nuanced stuff like calling the Rohingya ‘Bengalis’ and saying they don’t belong in the country.”

These posts are widely shared and generate thousands of likes.


Social media messaging has driven much of the rage in Myanmar. Though widespread access to cellphones only started a few years ago, mobile penetration is now about 90 percent. For many people, Facebook is their only source of news, and they have little experience in sifting fake news from credible reporting.

One widely shared message on Facebook, from a spokesman for Ms. Aung San Suu Kyi’s office, emphasized that biscuits from the World Food Program, a United Nations agency, had been found at a Rohingya militant training camp. The United Nations called the post “irresponsible.”


(Craig Mod) Almost all of the farmers we spoke with were Facebook users. None had heard of Twitter. How they used Facebook was not dissimilar to how many of us in the West see and think of Twitter: as a source of news, a place where you can follow your interests. The majority, however, didn’t see the social platform as a place to be particularly social or to connect with and stay up to date on comings and goings within their villages.


Stevan Dojcinovic, who runs an independent nonprofit investigative news agency in Serbia, wrote a piece for The New York Times titled “Hey, Mark Zuckerberg: My Democracy Isn’t Your Laboratory,” in which he says: “For us, changes like this can be disastrous. Attracting viewers to a story relies, above all, on making the process as simple as possible. Even one extra click can make a world of difference. This is an existential threat, not only to my organization and others like it but also to the ability of citizens in all of the countries subject to Facebook’s experimentation to discover the truth about their societies and their leaders.”

That’s why Mark Zuckerberg’s arbitrary experiments are so dangerous. The major TV channels, mainstream newspapers and organized-crime-run outlets will have no trouble buying Facebook ads or finding other ways to reach their audiences. It’s small, alternative organizations like mine that will suffer. A private company, accountable to no one, has taken over the world’s media ecosystem. It is now responsible for what happens there. By picking small countries with shaky democratic institutions to be experimental subjects, it is showing a cynical lack of concern for how its decisions affect the most vulnerable.
