
Shooting the Facebook Messenger (Part II)

Thomas Dowling assesses Facebook’s mitigation strategies before critically engaging with the company’s intentions in Myanmar.

This is Part Two of a four-part series. Part I introduces and provides pertinent background for the topic. Part II considers and evaluates some of Facebook’s mitigation strategies and outlines some issues that remain problematic. Part III argues that Facebook’s exclusive preference for Burmese at the expense of all other ethnic languages runs the risk of increasing Burmanisation. Part IV offers some reflections and concluding remarks, ultimately arguing for a more nuanced understanding of Facebook’s presence in Myanmar.

    Friend Requests: Repairing a Reputation

With all the violence—not to mention the torrent of misinformation—that was channeled through Facebook, its international reputation has taken a battering. Yet, to Facebook’s credit, it is trying to make positive changes in the hope that the platform will not be manipulated to manufacture, distil, and disseminate hate-speech and fake news, as it has been in recent years. These moves include expanding the existing team of Burmese-language moderators, which stood at 60 in August; the Facebook Newsroom targets 100 by the end of the year. While Eric Paulsen says this is not enough for Myanmar’s 15-20 million users, 60 or 100 moderators represents an enormous increase from the two people at Facebook who could speak Burmese in 2015; in 2014, it was just one, based in Dublin. Furthermore, in September, Sky News reported that Facebook is set to hire a Human Rights Director ‘to address how it could be contributing to human rights abuses.’

Beyond dramatically expanding its critically understaffed Burmese team, Facebook HQ has begun to re-invest some of its profits into its Myanmar operations more broadly. Sara Su, who penned Facebook’s update in mid-August, said that much of this investment has gone into developing and improving its AI to better detect—and prevent—the spread of hate-speech, misinformation, and other interactions that break Facebook’s policies. Improvements have also been made to the reporting mechanisms and Burmese-friendly drop-down menus that seek to speed up responses to flagged content, a process hitherto greatly inhibited by technical hurdles.

It should also be considered a positive move that Facebook seems to have (eventually) listened to the advice of those on the ground, namely Phandeeyar, which, together with Mido, Burma Monitor, and Equality Myanmar, jointly wrote to Zuckerberg after the Facebook CEO claimed credit for their work during an interview with Vox. Zuckerberg replied to the local companies personally, admitted to several failings, and committed to doing better.

More recently still, on November 5th 2018, Alex Warofka, Product Policy Manager for Facebook, wrote, ‘[w]e commissioned an independent human rights impact assessment on the role of our services in Myanmar. […] The assessment was completed by BSR (Business for Social Responsibility)—an independent non-profit organisation with expertise in human rights practices and policies—in accordance with the UN Guiding Principles on Business and Human Rights and our pledge as a member of the Global Network Initiative.’ The update by Warofka from the Facebook Newsroom confirms what Zuckerberg and previous updates declared earlier: that the company wasn’t doing enough and should do more. The BSR report, Facebook’s announcement says, further ‘recommends that Facebook adopt a stand-alone human rights policy, establish formalised governance structures to oversee the company’s human rights strategy, and provide regular updates on progress made.’

Almost certainly, the most far-reaching response to Facebook’s earlier inaction was the deletion of 18 accounts linked to senior army officers, including Commander-in-Chief Senior General (C-in-C SG) Min Aung Hlaing, for their alleged roles in the brutal counterinsurgency in Rakhine State. This bold response coincided with the release of the UN Fact-Finding Mission’s report. It was the first time that the social media giant had deleted the accounts of any country’s military or political leaders, and its significance should not be underestimated. In the view of such journalistic mainstays as Thompson Chau (Myanmar Times), San Yamin Aung (The Irrawaddy), and Frontier Myanmar’s Editorial Team, removing Myanmar’s top army brass from the platform has done more to harm these individuals than international pressure and the talk of re-imposed sanctions. Given this, a Guardian editorial posited that Zuckerberg was more powerful than the UN Secretary-General in Myanmar, ‘since he could, if he wished, cut off one of the main distribution channels for propaganda against the Rohingya and other minorities.’

Reading through Warofka’s post-BSR report update, it does feel like Facebook has taken the criticism on board and responded with several intelligent changes, particularly in light of documents such as the UN Fact-Finding Mission’s 444-page behemoth, which mentions Facebook nearly three hundred times.

While I am personally optimistic (from a non-technical perspective) about Facebook’s efforts, it would be naive of me not to offer at least a caveat or two. A more cynical assessment of Facebook’s mitigation strategies, therefore, might first point to the company’s annus horribilis. Not only has Facebook’s involvement in the Rohingya crisis (hitherto unprecedented in its scale and brutality) damaged the company’s reputation for more than a year, but Menlo Park also had to deal with the fallout of the Cambridge Analytica scandal and its unclear involvement in the 2016 US Presidential election (as well as various controversies surrounding the ‘Brexit’ referendum in the United Kingdom), and was required to very publicly face US senators back in April to explain such controversies. Zuckerberg and Facebook Inc. certainly received more than one black eye prior to the release of the UN FFM’s report (which coincided with the very overt, public, loud deletion of all those army officers and associated pages, as already discussed). One could certainly—and not unreasonably—see Facebook’s post-FFM actions as an immediate band-aid on a gushing laceration.

The second caveat concerns how much power Facebook actually has in Myanmar. A smaller group of experts suggests that Facebook’s power is far more limited than many perceive: while the platform has played some role in Myanmar’s ethnic conflicts, particularly against the Rohingya, this school of thought holds that these problems long predate Facebook’s arrival and would have continued regardless. For example, Mark Farmaner, Director of Burma Campaign UK, told TIME that ‘violence against the Rohingya would have happened with or without Facebook.’

Ultimately, the motivation for reform matters because it reveals how sincere Facebook’s intention to improve things in an already fragile, volatile country really is. Questions about the timing of Facebook’s acceptance of its responsibilities and instigation of change need to be asked in order to better assess its future progress in curbing the more dangerous uses of its platform. Perhaps the most pressing question we should be asking Zuckerberg and Co. is this: would these expensive mitigation strategies have been implemented if 2018 had, in fact, been an annus mirabilis, absent all those very public—and negative—stories? I’m not sure; but the short history of Facebook in Myanmar might give some pause for thought, particularly when June and October 2012 in Rakhine State are considered.

    Deleting History: Problems Remain

Even if we assume Facebook has acted with good intentions following its period of self-reflection, several issues persist. The Burmese-language team charged with weeding out hate-speech, fake news, and abusive content is based in Kuala Lumpur, Malaysia, not Myanmar, and remains decidedly small: as Paulsen argues, ‘[t]hey currently have 60 employees sifting through content from Myanmar, nowhere near enough to cover a user population that has anywhere between 15-20 million active users.’ Predictably, the process of responsibly assessing the vast backlog of text, images, and videos that inspire hate-speech, promote misinformation, or otherwise violate the company’s rules is painfully slow.

Compounding this is the fact that the material the moderators assess has been flagged for review, rather than actively hunted for on Facebook’s platform. This reliance on third parties to highlight violations of policy has been criticised: not only is the approach unsystematic, but it easily misses dangerous content. The Reuters investigation discovered over 1,000 extremely offensive, graphic, hate-speech posts, some dating back to 2013; this clearly illustrates the extent of the work that remains to be done.

While the company took decisive steps to remove senior army personnel, The Irrawaddy informs readers that ‘the official Facebook pages of Myanmar’s air force and navy remain live online.’ Both of these branches fall under C-in-C SG Min Aung Hlaing’s remit. The concern now is that future moves will be decidedly more difficult. Deleting military accounts and pages constitutes low-hanging fruit for Facebook, which understandably suits the company as a cheaper alternative to hiring yet more staff to actually police the site properly. Furthermore, removing such accounts is both an immediate and a highly visible tactical response. It is in deciding when other, less overt accounts and pages are being used to achieve the same broad objectives as their disappeared counterparts that Frontier Myanmar’s Editorial Team expects complications to arise.

Also worthy of consideration is the excellent question Kayleigh Long put to Facebook in her article for TIME. Long asked whether any of the deleted army accounts had paid to “boost” their posts; she writes that Facebook declined to comment. I followed up on this question with Long via Twitter, and she informed me that she has still not received a reply. This is a very interesting question because the answer could help observers decide where Facebook’s priorities lie: is Zuckerberg’s company a reforming business with an eye on its own social responsibility, or an exploitative foreign company that reacts to public opinion only when its profits are jeopardised?

With these existing challenges considered, some commentators are of the opinion that the most important driver of change will be how Facebook plans to enforce its reforms and mitigation strategies. In my view, however, the biggest threat Facebook presents to Myanmar is not (only) that the platform is used to proliferate hate-speech and publish misinformation; it is the exclusivity of its preference for Burmese. This, I argue in Part III, is the most dangerous aspect of Facebook’s operation because it abets the long-held State policy of Burmanisation.

Thomas Dowling is a Ph.D. student at the University of Leicester. His primary research interests revolve around environmental security in Myanmar (particularly in the context of human security), viewed through the prism of securitisation theory. Thomas is also well-travelled in Myanmar, and lived in Taunggyi, Shan State, for a short while. Previously, Thomas earned degrees in Ancient History (BA, MA; Bristol University) and International Security Studies (MA; Leicester University). Presently, Thomas lives in Daegu, South Korea, with his wife, baby, and Jack Russell.
