Title: Facebook’s Quasi-Sovereignty: Myanmar’s Need for a New Free Speech Paradigm
For many in Myanmar, the internet begins and ends with their Facebook newsfeeds. The New York Times reported that Facebook was used to spread hate speech and misinformation about the Rohingyas, a heavily oppressed ethnic and religious minority group in Myanmar, contributing to violent sectarian tensions. Earlier this year, Facebook CEO Mark Zuckerberg responded to the issue, claiming, “it is clear that people were trying to use our tools in order to incite real harm… our systems detect that that’s going on. We stop those messages from going through.” Myanmar-based civil society organizations (CSOs) begged to differ, responding in an open letter, “in your interview, you refer to your detection ‘systems.’ We believe your system, in this case, was us — and we were far from systematic. From where we stand, this case exemplifies the very opposite of effective moderation: it reveals an over reliance on third parties, a lack of a proper mechanism for emergency escalation, a reticence to engage local stakeholders around systemic solutions, and a lack of transparency.”
The authors of the open letter are correct. Facebook can address these problems by working with Myanmar developers on natural language processing (NLP) algorithms that assist human reviewers in identifying potential hate speech and misinformation; by empowering civil society organizations to investigate and address the root causes of misuse of its platform; by establishing an independent review board with real power not only to adjudicate and enforce decisions on content issues that Facebook flags to it, but also to proactively identify content issues on the platform through a subpoena-like power; and by improving escalation mechanisms for potentially inflammatory speech.
Over-reliance on third parties
Although Facebook now has 99 Myanmar reviewers, these individuals, who screen user-generated content for violations of Facebook’s policies, cannot cover the vast volume of daily posts generated by the country’s nearly 20 million users. Facebook should adopt a supplement: NLP algorithms built by Myanmar nationals. The text of millions of daily posts can be filtered through NLP algorithms, which flag potential hate speech and misinformation for escalation to human reviewers for final review. Developers would first prioritize algorithms for the Burmese, Rakhine, and Rohingya languages, with plans eventually to cover all of Myanmar’s major ethnic languages.
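To illustrate the idea (and only to illustrate it), the sketch below shows how such an NLP-assisted triage pipeline might work in principle: a lightweight classifier scores incoming posts and escalates the highest-risk ones to human reviewers. The training examples, the character n-gram model, and the escalation threshold are all placeholder assumptions rather than descriptions of Facebook’s actual systems; a real deployment would need a large corpus annotated by native speakers of Burmese, Rakhine, and Rohingya.

```python
# A minimal sketch (not Facebook's actual system) of NLP-assisted triage:
# a classifier scores posts and escalates likely hate speech to human reviewers.
# Character n-grams are used because Burmese is written without spaces between
# words, which makes word-level tokenization unreliable.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Hypothetical labeled examples; a real system would need a large corpus
# annotated by native speakers of Burmese, Rakhine, and Rohingya.
train_posts = [
    "placeholder text of a post previously flagged as hate speech",
    "placeholder text of another flagged post",
    "placeholder text of an ordinary post about daily life",
    "placeholder text of a benign post about football",
]
train_labels = [1, 1, 0, 0]  # 1 = flagged, 0 = benign

model = make_pipeline(
    TfidfVectorizer(analyzer="char_wb", ngram_range=(2, 4)),
    LogisticRegression(),
)
model.fit(train_posts, train_labels)

ESCALATION_THRESHOLD = 0.7  # illustrative; tuned to balance reviewer workload vs. recall

def triage(posts):
    """Return posts whose predicted risk exceeds the threshold,
    sorted so the highest-risk posts reach human reviewers first."""
    scores = model.predict_proba(posts)[:, 1]
    flagged = [(score, post) for score, post in zip(scores, posts)
               if score >= ESCALATION_THRESHOLD]
    return sorted(flagged, reverse=True)

new_posts = ["placeholder text of an incoming post"]
for score, post in triage(new_posts):
    print(f"escalate to human reviewer (risk={score:.2f}): {post}")
```

The point of the sketch is not the particular model but the division of labor: algorithms narrow millions of posts down to a reviewable queue, and Myanmar developers, who understand the languages and the script, are the ones best placed to build and tune that first filter.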
The few Myanmar developers with NLP experience have worked with Google on Burmese-English speech-to-text and text-recognition features for Google Translate. Beyond elementary words and phrases, however, native Myanmar speakers find Google’s progress quite limited. This is likely because people in Myanmar generally do not use Google Search or Gmail, which limits the size of the dataset Google can use to train NLP algorithms via machine learning.
Due to its huge popularity in Myanmar, Facebook has a far more expansive dataset and should work directly with, and support, Myanmar NLP developers.
Engage local stakeholders around systemic solutions
In Myanmar, local and international cultural anthropologists, human rights lawyers, and CSOs should have seats at the table with Facebook’s engineering, NLP, public policy, and data science teams, providing insight and local knowledge of sectarian issues and their underlying causes. Facebook should also build into its formal review procedures a process by which CSOs can investigate the root causes of hate speech and misinformation on the platform.
An effort by German police to counter a false rumor that Muslim refugees had raped a young German girl may prove instructive. Officers visited the homes of those who had initially spread the rumor on Facebook and showed them evidence that it was false. All but one of the individuals removed or corrected their posts. The episode shows the importance of offline advocacy in addressing hate speech and misinformation campaigns; CSOs and cultural anthropologists could serve in this role in a country like Myanmar.
With regard to the NLP solution proposed in the previous section, CSOs can help point out low-hanging fruit. The Guardian has reported on the high-stress environment for Facebook content moderators, and Facebook is keen to show how difficult moderation is in hard cases, building empathy for what can genuinely be a close call on whether to remove certain content. Myanmar CSOs note, however, that there are easy cases Facebook is not addressing: hate speech and misinformation in Myanmar are often spread by simply copying and pasting the text of false rumors, and Facebook’s engineering team could readily design algorithms to track these duplicated rumors, as sketched below. By consulting regularly with CSOs, or better yet, allowing CSOs to select the members of the independent board for their country, Facebook would have a much more efficient feedback loop for addressing these content issues.
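As a rough illustration of how tractable that is, the sketch below fingerprints posts after normalizing away trivial edits (spacing, punctuation, case) and flags any text that appears across many posts. The normalization rules, the threshold, and the example posts are assumptions made for the sake of illustration, not a description of Facebook’s internal tooling.

```python
# A minimal sketch (not Facebook's internal tooling) of tracking copy-pasted
# rumor text: normalize each post, compute a fingerprint, and flag any
# fingerprint that appears across many posts.
import hashlib
import re
import unicodedata
from collections import Counter

def fingerprint(text: str) -> str:
    """Normalize Unicode, strip punctuation/whitespace/case differences, and hash,
    so trivially edited copies of the same rumor map to the same fingerprint."""
    text = unicodedata.normalize("NFC", text)
    text = re.sub(r"[\s\W_]+", "", text.lower())
    return hashlib.sha256(text.encode("utf-8")).hexdigest()

def copy_paste_clusters(posts, min_copies=3):
    """Group posts by fingerprint and return clusters large enough to suggest
    a coordinated copy-and-paste campaign (the threshold is illustrative)."""
    counts = Counter(fingerprint(post) for post in posts)
    return {fp: n for fp, n in counts.items() if n >= min_copies}

# Hypothetical stream of posts; the same rumor text pasted by several accounts.
posts = [
    "Rumor text pasted verbatim.",
    "rumor   text pasted verbatim!",
    "Rumor text pasted verbatim.",
    "An unrelated post.",
]
for fp, n in copy_paste_clusters(posts).items():
    print(f"{n} near-identical posts share fingerprint {fp[:12]}...; escalate for review")
```

A production system would need fuzzier matching than exact fingerprints, but CSOs are pointing at something even simpler: the same rumor text, pasted verbatim thousands of times, which this kind of tracking would catch immediately.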
Further, local and international experts and organizations can help generate counterspeech: content that corrects hate speech and misinformation or that provides information on civics, civil rights and liberties, health, education, and other areas. This could include posts that counter rumors about the behaviors or history of a particular ethnic group, such as the Rohingya, citing news outlets that follow international journalistic standards, such as Frontier Myanmar and The New York Times.
It is essential to consider the quasi-sovereign nature of Big Tech in many countries: at its peak, Facebook’s market capitalization was nearly 10 times Myanmar’s GDP. As with traditional sovereigns, this status creates obligations for Facebook beyond simple profit-maximization and efficient code. Facebook needs to incorporate civic responsibility into its technology.
Will the independent review board be an effective review mechanism, or just a corporate shill?
Facebook has announced that it will create a global independent review board that will (1) provide oversight of Facebook’s content decisions; (2) reverse Facebook’s decisions “when necessary”; and (3) be an independent authority outside of Facebook. This raises a foundational question about such a board: will it be truly independent, with the jurisdictional and enforcement powers that give it real “teeth,” or will it simply serve as a vehicle for Facebook to shift blame, or to claim it has set up sufficient procedures while maintaining an ineffective status quo?
Facebook has provided some initial guidance on the role of the independent review board, pledging “to make sure it is able to render independent judgment, is transparent and respects privacy.” Over the next six months it will convene “a series of workshops around the world where we will convene experts and organizations who work on a range of issues such as free expression, technology and democracy, procedural fairness and human rights.” Facebook is also providing a mechanism for additional parties to comment, “announcing more about how proposals can be submitted in the coming weeks.”
Finally, Facebook could consider adding a C-suite position held by a human rights lawyer trained in free speech jurisprudence and/or a cultural anthropologist. This person would oversee Facebook’s enforcement of the board’s decisions and, importantly, would have veto power over Mark Zuckerberg and Sheryl Sandberg on matters of enforcing those decisions. Facebook recently released an update on an ongoing civil rights audit, focused on the U.S. elections, led by Laura Murphy, a national leader on civil rights formerly at the ACLU. She is a promising candidate for this C-suite position.
Conclusion
Facebook faces a daunting challenge: with over one billion daily active users, it simply lacks the manpower to monitor and regulate all of the user-generated content on its platform. Yet Facebook’s massive influence and the reliance placed on it by users in developing countries like Myanmar, where Facebook is essentially coterminous with the internet, create an obligation for it to do just that. Because it has no employees or offices in Myanmar, Facebook does not effectively utilize local NLP capabilities, CSOs, or cultural anthropologists; it should.
The solutions described above offer next steps for Facebook to reform its operations in the country. In recent years, Myanmar has become the case study of the power of Facebook gone awry; now it can become a showcase of what Facebook can do when it complies with appropriate regulations and is subject to public and internal checks on its business. Legal academia, regulators, and the tech sector should continue to develop and refine a new area of law: private free speech law to govern quasi-sovereigns in Big Tech. Failing that, hate speech and misinformation will continue to spread on software platforms, and Facebook’s promising new board will serve only as cover for a platform that has not improved how it addresses content issues.
. . .
Michael Lwin is the CEO of Koe Koe Tech, an IT social enterprise in Myanmar.
Rachel Brown is a second-year law student at Yale Law School.