Social Media - A double-edged sword?
A theoretical analysis
of the Rohingya genocide
and protest movements
in Myanmar
by Anton J. Hansen


Imagine waking up to gunshots, followed by screams and the smell of burning straw. As you step outside, you see soldiers raping and killing your neighbours while setting your house on fire. What sounds like a horrible nightmare has been the reality since October 2016 for more than one million Rohingya, an ethnic group living in Rakhine State in western Myanmar. Systematic violence against the Rohingya, which had been intensifying since 1978, when the state began stripping them of citizenship, escalated in 2016 as the military launched a brutal campaign of killings, mass rape, arson and ethnic cleansing. More than 30,000 people were killed and more than one million fled to neighbouring Bangladesh. Facebook played a key role in the genocide, as it was the main medium used to spread hate speech and propaganda against the Rohingya (Banaji & Bhat, 2022; Davis, 2021; Farrelly & Win, 2016; Renshaw, 2013; Whitten-Woodring et al., 2020). By connecting theories and literature on social media and its role in politics, propaganda and the digital public sphere, I will answer the following research question in this essay.


            The networked society in Myanmar as a double-edged sword: How can we understand the role of social media logics in the genocide against the Rohingya and the internet’s chances of strengthening democratic movements?


            First, I will describe how information technology was rapidly adopted by Myanmar’s predominantly media-illiterate society from 2014 onwards. Secondly, I will describe how the Tatmadaw, Myanmar’s armed forces, as well as nationalist leaders used Facebook for propaganda, and how social media logics helped them. Thirdly, I will discuss the sub-question of the extent to which Facebook is responsible for the genocide and connect this to theoretical debates about the role of social media platforms as mere media platforms, gatekeepers or content moderators. Fourthly, I will discuss my analysis with regard to the positive aspects of social media for democratic purposes, conclude my findings and name the limitations of my research.


            Myanmar is a country in South East Asia bordering Thailand, China, India and Bangladesh. After the assassination of its independence leader, General Aung San, the country became a military dictatorship that was isolated for decades. When cell phones and SIM cards arrived in the country in 2000, they together cost US$5,000 (Banaji & Bhat, 2022, p.41) and were only given to high-ranking officials and army generals. Internet access was similarly restricted, and the only media people could consume were strictly controlled by the government (Renshaw, 2013). Following increasing protests and the Saffron Revolution of 2007, the political landscape in Myanmar changed and democratic parties were allowed to participate in elections (Selth, 2020). Nevertheless, military officials continued to dominate the parliament and other state institutions, having added articles to the country’s constitution to entrench their power. In 2014, the country opened to foreign investment, and the price of a SIM card quickly fell from US$150 to US$1.50, which made them accessible to large parts of the population (Davis, 2021; Farrelly & Win, 2016; Renshaw, 2013).

Graphs above: typical price of a SIM card in US$ and the number of active SIM cards in Myanmar (Rio, 2020).

However, Myanmar’s military, the Tatmadaw, banned many internet sites while allowing others such as Facebook. Optimism about the prospects of political liberalization through the spread of the internet faded once it became evident that Myanmar’s digital public sphere, which unfolded primarily on Facebook, had been hijacked by the Tatmadaw and nationalist leaders such as the monk Ashin Wirathu (Banaji & Bhat, 2022, p.41). The internet and social media, often seen as the core infrastructure providing spaces where citizens can engage in democratic activities such as discussion and information exchange (Chadwick, Dennis & Smith, 2016; Dahlgren, 2015; Margetts, 2018; Papacharissi, 2004, 2021), were used for completely different purposes in Myanmar, mainly propaganda.


            Due to the “Free Basics” tariff, which allows subscribers to access Facebook without paying internet fees, Facebook quickly became the dominant source of information for many. There, people soon came across figures like Ashin Wirathu or Han Nyien Oo, whose posts mixed Buddhism, nationalism and gossip with hate speech against a particular ethnic group, the Rohingya (Banaji & Bhat, 2022, Chapter 2; Davis, 2021; Stecklow, 2018).

            As dictatorships need a common enemy, and as the Tatmadaw saw Rohingya political activity as a serious threat to its power, the Rohingya quickly became the subject of one of the biggest propaganda campaigns of the 21st century, one that unfolded almost entirely on Facebook (Whitten-Woodring et al., 2020). Through posts claiming that Rohingya rebels had committed hate crimes or wanted to undermine Myanmar’s society, the public became emotionalized and radicalized. Emotions of hate and sorrow were effectively triggered to unite and connect the public (Coleman, 2021, p.21) against an enemy artificially created by military propaganda. Such elaborate interventions have great chances of political success, especially when they are shared within personal networks of friends or acquaintances, which heightens their persuasive effect (Jungherr, Rivero & Gayo-Avello, 2020, p.13).

            Messages by nationalist monks like Ashin Wirathu, but also by prominent military leaders, spread quickly among Facebook users in Myanmar. Examples include: “Cut off those necks of the sons of the dog and kick them into the water”, “Pour fuel and set fire so that they can meet Allah faster”, “We must fight them the way Hitler did the Jews, damn kalars!” and “may the terrorist dog kalars fall fast and die a horrible death”. One nationalist group even set up a page called the “Kalar Beheading Gang” (Stecklow, 2018). “Kalar” is a racist, anti-Muslim slur used against people of South-Indian origin. Such posts were accompanied by pictures and videos of executions and rapes that remained on Facebook for years (Davis, 2021).

            These dynamics, paired with the media illiteracy of Myanmar’s population, which largely had no idea how to engage with the constant, around-the-clock flow of information Facebook provided, led to public arousal of hate and fear against the Rohingya. Experts argue that Facebook played an essential role in manipulating an entire nation through a digital medium its people barely understood, one that emotionalized them and that is highly addictive (Fink, 2018; Rio, 2020; Sablosky, 2021; Whitten-Woodring et al., 2020).


            Social media logics (Dijck & Poell, 2013) played a key role in Facebook’s success in mobilizing millions of people against the Rohingya. Connectivity, datafication, programmability and popularity are the core social media logics, and they differ fundamentally from traditional media logics (Dijck & Poell, 2013, p.5). Popularity, fuelled by Facebook’s “like economy” (Gerlitz & Helmond, 2013), made it possible for nationalist and radical Buddhist leaders to become popular quickly and for their popularity to keep growing. Facebook’s algorithm recommended them to millions of users who were largely digitally illiterate, as their government had only recently granted them access to the internet.

            Myanmar’s Facebook users saw the large numbers of likes, comments and followers as indicators of legitimacy, which amplified the propaganda’s persuasive effects (Whitten-Woodring et al., 2020). The broader dynamics of information dissemination are also important to understand, as people in Myanmar actively share information with family and friends. Facebook, perceived as an exciting new digital medium that shows its users exactly what they are interested in, quickly became Myanmar’s primary news interface (Whitten-Woodring et al., 2020, p.414). Its 18 million users could share information faster than ever and at no additional cost (Rio, 2020, p.7).


            The Facebook algorithm effortlessly connects users and content day and night in ways traditional media could never match. It is a “strategic tactic that effectively enables human connectedness while pushing automated connectivity” (Dijck & Poell, 2013, p.8). Connectivity can be understood here as the “advanced strategy of algorithmically connecting users to content, users to users, platforms to users [and] users to advertisers” (Dijck & Poell, 2013, p.9). Additionally, the programmability of social media enables its programmers to steer user experiences, content and user relations through automated connectivity, while monitoring and readjusting them to increase the company’s profits (Dijck & Poell, 2013).


            Social media’s automated connectivity, however, was exactly what escalated the spread of hate speech and propaganda on Facebook. The algorithm neither follows ethical rules nor checks the content it promotes. It is programmed merely to serve economic interests, which lie in attracting as much attention as possible to prolong screen time and exposure to the platform’s content and advertisements (Dijck & Poell, 2013). In this regard, social media differ fundamentally from traditional media such as newspapers or television, where content is carefully checked by gatekeepers before it is published and spread through the network.

            Facebook’s datafication, the collection and analysis of vast amounts of user data, lays the foundation for the algorithm’s effectiveness in connecting users with content that will prolong their screen time (Dijck & Poell, 2013), as users are assigned algorithmic identities based on the data they generate. Algorithmic identities (Cheney-Lippold, 2011, 2017) are created by social media platforms to group users with similar online behaviour together and recommend them similar content. Such mechanisms amplify the creation and entrenchment of extremist views in so-called filter bubbles (Pariser, 2011), whose creation is built into Facebook’s business model. Users’ online experience is dominated by posts that resonate with them and with which they already agree; all other content is filtered out, so that users end up in an online bubble of content that neither challenges nor contradicts their political views. Prominent among these are the most popular posts, as they serve the same function: prolonging screen time to increase exposure to advertisements and sponsored content. Such social media logics contribute to greater political polarization, homogenization and even radicalization, as users’ views are no longer challenged in the digital public sphere (Jungherr, Rivero & Gayo-Avello, 2020, p.14).


            If a post calling for Rohingya to be beheaded becomes popular, it is automatically recommended to more and more users, reaching millions within hours. Facebook and other social media platforms are therefore claimed to bear responsibility for the content shared on their websites (Fink, 2018). Accordingly, it is their legal and ethical duty to moderate it by deleting hate speech and calls for violence. However, it is often criticized as a form of censorship when a social media company can intervene and mute certain actors in the digital public sphere (Mchangama, 2022). On the other hand, websites like Facebook have a responsibility not to cause harm and to do good in the world (Fink, 2018). Running a website with more than two billion users comes with the responsibility not to let one’s technology be used to spread propaganda and hate speech.


            When Facebook launched its “Free Basics” app in Myanmar, it had only one employee responsible for the market, which was not enough to monitor millions of Burmese Facebook users (Davis, 2021).

In retrospect, this can be seen as negligent behaviour by Facebook, motivated by greed for quick profits in a rapidly growing market (Sablosky, 2021; Whitten-Woodring et al., 2020). Instead of first hiring hundreds or even thousands of employees to monitor Facebook posts in Myanmar, the company entered the market with one employee responsible for millions of users. Consequently, the question arises whether Facebook can be considered (partly) responsible for the genocide of the Rohingya.


            Based on the preceding analysis, backed by theoretical concepts and frameworks, I argue that Facebook bears major responsibility for the genocide of the Rohingya. As mentioned earlier, the company entered Myanmar with the sole intention of making as much profit as possible, without considering political and social tensions, the oppression of ethnic groups or abuses of power. The company therefore acted negligently as it became the Tatmadaw’s core infrastructure for ideologically legitimizing the genocide of the Rohingya and for publishing hate speech that resulted in further atrocities. Even though Facebook knew what its platform was being used for, it did nothing to intervene and is now being sued for more than 150 billion pounds, as it “was willing to trade the lives of the Rohingya people for better market penetration in a small country in south-east Asia” (Milmo, 2021). Of course, it was soldiers of the Tatmadaw who murdered and raped, who committed the genocide. But Facebook’s formidable propaganda machine, enabled by its social media logics (Dijck & Poell, 2013), allowed the military to construct the Rohingya as an enemy over the years in a way that made its actions acceptable to the Myanmar public and even motivated citizens to take part in the atrocities (Davis, 2021; Rio, 2020; Whitten-Woodring et al., 2020).


            This example shows how the fundamental logic of capitalist social media platforms disqualifies them from being part of a democratic public sphere. Instead of being digital platforms that foster social, political or cultural interaction (Graham, 2015), they primarily foster increased user engagement in order to increase profits (Banaji & Bhat, 2022, Chapter 2; Bucher, 2012; Dijck, 2014; Precht, 2018). One might argue that Myanmar is an extreme example, but even in countries with generally higher digital media literacy, political polarization through social media has become a major challenge for democracy (Chadwick, Dennis & Smith, 2016; Papacharissi, 2021).


            This also raises important questions about the moderation and censorship of social media platforms. It is quite obvious that sentences such as “Pour fuel and set [them on] fire” constitute incitement of the masses (Stecklow, 2018). However, free speech advocates argue that the problem lies not in somebody saying such things but in people acting on them, and in too few people speaking out against hate speech (Coleman, 2021; Mchangama, 2022). Especially when social media like Facebook or Twitter become so big that they are the digital public sphere rather than merely part of it, they gain the power to completely mute individuals in public discourse. For example, more than thirty oppositional parties and organizations, so-called Ethnic Armed Organisations (EAOs), which fight the Tatmadaw or expose human rights violations (Sablosky, 2021), were banned from Facebook once the company reacted to the situation in Myanmar. Facebook thereby took away their main means of mobilizing people and excluded them from the global digital public sphere (Sablosky, 2021). This is just one example of how difficult it is to effectively monitor and regulate social media.

            Furthermore, the case of Facebook in Myanmar shows that a single company has the power to deeply influence war and peace, and to determine whether organizations such as the EAOs can reach out for help once a war breaks out. Facebook whistleblower Frances Haugen calls this digital colonialism: Facebook expands globally to extract resources from other countries while taking minimal responsibility for the damage it leaves behind (Banaji & Bhat, 2022; Fink, 2018; Nover, 2021; Whitten-Woodring et al., 2020). Its social media logics have the power to destabilize nations through mass manipulation and to let millions die. Moreover, countries like Myanmar rely heavily on foreign, American companies to build their digital infrastructure, so that their digital public sphere becomes heavily dependent on American policies.


            The insights of my analysis of anti-Rohingya propaganda on Facebook contradict overly optimistic views of social media (Chadwick, Dennis & Smith, 2016; Coleman, 2017; Graham, 2015; Jungherr, Rivero & Gayo-Avello, 2020; Mchangama, 2022). Nevertheless, one has to consider that such literature was written with democratic, media-literate audiences in mind. Of course, social media have the potential “to improve participation and representation”, as well as “current democratic practices and attempts to build stronger forms of democracy” (Jungherr, Rivero & Gayo-Avello, 2020, p.227). However, such processes do not originate in social media itself but in democratic public spheres (Habermas, Burger & Lawrence, 1989) that have developed over many years. Additionally, the internet and social media do not exist in a vacuum; one always has to consider the history, culture and political structures of societies to truly understand and evaluate their political effects.


            In contrast to these negative consequences, Facebook’s social media logics also played a major role in mobilizing thousands of protestors when the military staged a coup in February 2021 and moved to ban all political parties. Protestors shared photos, videos and crucial information with each other, and with the whole world, through Facebook and Telegram (Beech, 2021; Goldman, 2021). The connectivity and popularity of Facebook and Telegram made this possible, and because the companies are not based in Myanmar, they could not be pressured by the Tatmadaw to reveal users’ private information or to stop connecting them with each other. Protestors with a “mobile phone [could] challenge injustice, fight for policy or regime change, and shed light on corruption and inefficiency in public life” (Margetts, 2018, p.2). The internet as a global network was also crucial for transmitting information to the rest of the world, especially photos and videos, which have a strong persuasive effect (Fiske, 2010). Journalists inside and outside Myanmar could use the huge amount of audio-visual material to investigate police and military crackdowns on demonstrations, verify information and inform the global public of the atrocities committed during those days.

            Unlike in previous revolutionary attempts in Myanmar, people now found themselves part of the networked society (Castells, 2007). This made it more difficult for the Tatmadaw to crack down on the protests, as there was a multitude of nodes in the network that could communicate in real time, both synchronously and asynchronously, through media beyond the Tatmadaw’s control. Instead of relying on one centralized leader communicating from the top down, protestors organized themselves in decentralized networks without hierarchy and practised a form of “mass self-communication” (Castells, 2007, p.246). This multi-modal exchange of messages helped protestors organize and react efficiently to military crackdowns. In conclusion, social media enabled the protestors to build alternative power networks to combat the Tatmadaw and to inform the international public by entering the global digital public sphere. Despite this, the Tatmadaw remains in power to this day through prosecutions, mass arrests, the bombing of resistance villages and the torture of protestors and political opponents (Beech, 2021; Goldman, 2021; Owen & Aung, 2021).

Picture above: protestors document and share a rally in Mandalay, Myanmar (photo credits: Reuters).
Picture above: protestors in Yangon, Myanmar (photo credits: Reuters).

To summarize this essay’s analysis, social media remains a double-edged sword: a mass communication tool that can ruin countries completely (Davis, 2021; Fink, 2018; Sablosky, 2021; Whitten-Woodring et al., 2020) but can also liberate citizens by giving them agency in the public sphere (Chadwick, Dennis & Smith, 2016; Coleman, 2017; Graham, 2015; Jungherr, Rivero & Gayo-Avello, 2020; Mchangama, 2022). Myanmar as a case study shows that there is no clear dichotomy with which to answer the question of whether social media has positive or negative effects on democracy. However, my analysis has shown that cultural, economic, historical and political aspects must be considered closely to understand social media’s effects on politics, especially in non-Western countries like Myanmar. As I have neither a South-East-Asian nor an anthropological background, this essay’s findings are limited. With such backgrounds, the question could be studied further whether an international representative body capable of placing digital governance under a constitutional understanding could lead to effective, democratic regulation of social media worldwide (Suzor, 2019, p.165). Another limitation of this essay is that I relied on theories and conceptual frameworks instead of conducting empirical research myself.

            Furthermore, the international response to the human rights abuses and the genocide of the Rohingya, as well as the comparatively little international aid that Rohingya refugees receive relative to refugees from other war-torn countries such as Ukraine, could be critically assessed from an academic perspective. To this day, more than one million Rohingya refugees live under terrible conditions as stateless people in refugee camps in Bangladesh (Human Rights Watch, 2022).


Amnesty International. (2017). Caged without a roof: Apartheid in Myanmar’s Rakhine state.      Amnesty International.


Banaji, S., & Bhat, R. (2022). Social media and hate: When hate speech policies and procedures fail. Routledge.


Bucher, T. (2012). Want to be on the top? Algorithmic power and the threat of invisibility on       Facebook. New Media & Society, 14(7), 1164–1180.


Chadwick, A., Dennis, J., & Smith, A. P. (2016). Politics in the age of hybrid media: Power, systems, and media logics. In A. Bruns, G. Enli, E. Skogerbø, A. O. Larsson, & C. Christensen (Eds.), The Routledge Companion to Social Media and Politics (pp. 7–22). Routledge.


Castells, M. (2007). Communication, Power and Counter-power in the Network Society. International Journal of Communication, 1(1), 238-266.


Cheney-Lippold, J. (2011). A new algorithmic identity. Theory, Culture & Society, 28(6),            164-181. doi:10.1177/0263276411424420


Cheney-Lippold, J. (2017). We are data: algorithms and the making of our digital selves.             New York University Press.


Coleman, S. (2017). Political hopes and fears: Can the internet strengthen democracy? Polity.


Dahlgren, P. (2015). The Internet as a civic space. In S. Coleman & D. Freelon (Eds.), Handbook of Digital Politics (pp. 17–34). Edward Elgar Publishing.


Davis, A. (2021). Hate speech in Myanmar: The perfect storm. In S. Jayakumar, B. Ang, & N.     D. Anwar (Eds.), Disinformation and fake news (pp. 103–116). Palgrave Macmillan.


Dijck, J. v., & Poell, T. (2013). Understanding social media logic. Media and Communication, 1(1), 2–14. doi:10.17645/mac.v1i1.70


Dijck, J. v. (2014). Datafication, dataism and dataveillance: Big data between scientific paradigm and ideology. Surveillance & Society, 12(2), 197–208. doi:10.24908/ss.v12i2.4776


Farrelly, N., & Win, C. (2016). Inside Myanmar’s turbulent transformation. Asia & the     Pacific Policy Studies, 3(1), 38–47.


Fink, C. (2018). Dangerous speech, anti-Muslim violence, and Facebook in Myanmar.      Journal of International Affairs, 71(15), 43–52.


Fiske, J. (2010). Television culture. Routledge.


Gerlitz, C., & Helmond, A. (2013). The like economy: Social buttons and the data-intensive web. New Media & Society, 15(8), 1348–1365. doi:10.1177/1461444812472322


Gillespie, T. (2018). Custodians of the internet: platforms, content moderation, and the    hidden decisions that shape social media. Yale University Press.


Goldman, R. (2021, February 1). coup.html


Graham, T. (2015). Everyday political talk in the Internet- based public sphere (chapter 14).         In Coleman, S. & Freelon, D. Handbook of Digital Politics (pp.247-263). Edward           Elgar Publishing


Habermas J., Burger, T., & Lawrence, F. G. (1989). The structural transformation of the public sphere: an inquiry into a category of bourgeois society (Ser. Studies in           contemporary german social thought). MIT Press.


Beech, H. (2021, March 24). ‘I Will Die Protecting My Country’: In Myanmar, a New Resistance Rises. The New York Times. protests.html


Human Rights Watch. (2022, April 4). Bangladesh: New restrictions on Rohingya camps. Human Rights Watch. restrictions-rohingya-camps


Jungherr, A., Rivero, G., & Gayo-Avello, D. (2020). Retooling Politics: How Digital Media Are Shaping Democracy. Cambridge University Press.

Margetts, H. (2018). Rethinking democracy with social media. In A. Gamble & T. Wright (Eds.), Rethinking Democracy. Wiley.


Mchangama, J. (2022). Free Speech: A History from Socrates to Social Media (1st ed.). Basic Books, Hachette Book Group.


Milmo, D. (2021). facebook-myanmar-genocide-us-uk-legal-action-social-media-violence


Nover, S. (2021, October 7). Quartz. whistleblower-doesnt-want-the-company-broken-up/


Owen, L. and Aung, K. K. (2021, December 9). Myanmar coup: The women abused and tortured in detention. BBC.


Papacharissi, Z. (2004). Democracy online: Civility, politeness, and the democratic potential       of online political discussion groups. New Media & Society, 6(2), 259-283.


Papacharissi, Z. (2021). Democracy on the run. After Democracy. Yale University Press.


Pariser, E. (2011). The Filter Bubble: What the Internet Is Hiding from You. Penguin Press.


Precht, R. D. (2018). Jäger, Hirten, Kritiker. Goldmann.


Griffin, R. (2021, October 26). Frances Haugen’s leaks should show us the real significance of digital sovereignty. SciencesPo. numerique/en/2021/10/26/frances-haugens-leaks-should-show-us-the-real-significance-of-digital-sovereignty/


Renshaw, C. S. (2013). Democratic transformation and regional institutions: The case of Myanmar and ASEAN. Journal of Current Southeast Asian Affairs, 32(1), 29–54.


Rigi, J., & Prey, R. (2015). Value, rent, and the political economy of social media.                                    The Information Society, 31(5).


Rio, V. (2020). The role of social media in fomenting violence: Myanmar [White paper]. Toda Peace Institute. myanmar-v2.pdf


Sablosky, J. (2021). Dangerous organizations: Facebook’s content moderation decisions and ethnic visibility in Myanmar. Media, Culture & Society, 43(6), 1017–1042.


Selth, A. (2020). Interpreting Myanmar: A decade of analysis. ANU Press.


Stecklow, S. (2018, August 15). Why Facebook is losing the war on hate speech in Myanmar. Reuters. hate/


Suzor, N. P. (2019). Lawless: The Secret Rules That Govern Our Digital Lives. Cambridge University Press.


Than, T. M. (2005). Dreams and nightmares: State building and ethnic conflict in Myanmar         (Burma). In K. Snitwongse & S. W. Thompson (Eds.), Ethnic conflicts in Southeast   Asia (pp. 65–108). ISEAS-Yusof Ishak Institute.


Whitten-Woodring, J., Kleinberg, M. S., Thawnghmung, A., & Thitsar, M. T. (2020). Poison if you don’t know how to use it: Facebook, democracy, and human rights in Myanmar. The International Journal of Press/Politics, 25(3), 407–425.
