Chapter 12 Case Study: Genocide by Facebook? Distributing Responsibility between War Criminals and Those Who Gave Them the Tools
On October 15, 2018, the New York Times printed a story on the genocide of the Rohingya people of Myanmar perpetrated by the Myanmar military. The Rohingya are an ethnic minority who speak an Indo-Aryan language in a country dominated by the Bamar ethnic group, who speak Burmese—a Sino-Tibetan language. The Bamar are largely Buddhist, and the Rohingya are largely Muslim. Amnesty International reported in December of 2016 that the Myanmar military had begun a campaign of ethnic cleansing. Villages of Rohingya were attacked, the people were subject to arbitrary arrest, as many as a thousand were summarily executed without trial, and villages were burned down. These extrajudicial killings and destruction of homes created a panic that led to a mass exodus of Rohingya people within and from Myanmar.
By January 2017, 65,000 Rohingya had arrived on the Bangladesh border, and more than 23,000 were identified as internally displaced within Myanmar, generating a massive refugee crisis. By February of 2017, Bangladesh had so many Rohingya refugees that it announced plans to relocate more than 200,000 to Thengar Char, an island in the Bay of Bengal. Bangladesh was not the only country where displaced Rohingya sought refuge; they fled to India too. But after anti-Rohingya protests in India, India announced plans to deport 40,000 Rohingya—even though 16,000 were registered with the UN as refugees. In all, it is believed that more than 700,000 Rohingya fled Myanmar within a year of the beginning of the military's attacks. They face an uncertain future, as neither India nor Bangladesh desires for them to stay and they face certain persecution in Myanmar.
It turns out that Facebook played an important role in the military's operations against the Rohingya and in building public support for the military's actions toward them. The Times reports that a propaganda campaign on Facebook, authorized at the highest levels of the military, was waged daily against the Rohingya people. Officials worked day and night in small bases around the capital city tending an army of troll accounts on Facebook that posed as fans of celebrities, attacking posts critical of the government and spreading fake accounts of attacks by the Rohingya—so-called fake news. The military's effort started years before the genocide and demanded the full-time work of more than 700 people to create and tend the fake news farm. The strategy was to create countless pages devoted to celebrities to attract followers and then subtly use those accounts to post false and slanderous stories about the country's Muslims. Armies of fake online trolls then engaged in an operation of reposting and spreading the inflammatory content while attacking real users who questioned it. Some of the fake stories described made-up massacres perpetrated by the Rohingya. In 2017, the fake accounts were used to spread fear throughout Myanmar that the Rohingya were preparing widespread jihadist attacks on the country. At the same time, the operation spread stories of a Buddhist-nationalist protest to Muslim users to frighten them.
Facebook took some responsibility for the abuse of its platform, apologized, and eventually shut down the fake accounts the military operated. The company admitted that the deactivated accounts had more than 1.3 million followers. A story in the Washington Post from October 29, 2017 shows that Facebook did not merely create a platform and stand by as it was used by the state; the company actively partnered with the state in 2016 to give subscribers to the state telecom company access to Facebook through a data plan in which Facebook activity would not count against users' data allowance. As a result of this cooperation, Facebook users in Myanmar went from 2 million in 2014 to over 30 million by 2017—increasing both the footprint of Facebook and the government's ability to manipulate public opinion.
The Times quoted an activist on the subject of Facebook’s responsibility: “The military has gotten a lot of benefit from Facebook,” said Thet Swe Win, founder of Synergy, a group that focuses on fostering social harmony in Myanmar. “I wouldn’t say Facebook is directly involved in the ethnic cleansing, but there is a responsibility they had to take proper actions to avoid becoming an instigator of genocide.”
No one thinks that Facebook intentionally or deliberately facilitated genocide, yet the propaganda efforts of the government would not have been nearly as successful without Facebook. Adding to the complexity, Facebook was not merely passively used as a tool by the government but acted in concert with the government to make its platform more available to the public after there was already substantial fake-page and troll activity coming from military compounds.
What sort of responsibility does Facebook have for how its platform was used?