Title: Deplatforming and Adaptation: Similarities Between Religious and Ideological Extremism
Outright deplatforming and suppression of religiously and ideologically motivated extremist users, groups, and content does not appear to effectively curb their influence or prevent the radicalization of new users. This article outlines the key similarities between the operational dynamics of religiously and ideologically motivated extremism on social media, and argues that more nuanced and sophisticated tactics than deplatforming are necessary to counter these phenomena. Such tactics might include memetic counter-messaging to dilute stigmergic signals, as well as encouraging centralization and the identification of participants.
The Spectrum of Extremism
On March 17th this year, Mike Burgess, the Director-General of the Australian Security Intelligence Organisation (ASIO), announced in his Annual Threat Assessment speech that ASIO would be changing its categorization of violent threats. Instead of the usual categories of ‘left-wing’, ‘Islamic’, and ‘right-wing’ extremism, ASIO will now refer to two categories: religiously motivated violent extremism and ideologically motivated violent extremism. According to Burgess, the current terminology of ‘left’ and ‘right’ no longer adequately describes the phenomena with which ASIO and similar government agencies are contending. ASIO’s new terminology focuses ‘on an individual or group’s support for violence’, encompassing a broad spectrum of phenomena from the religiously motivated violence of ISIS to the ideologically motivated violence of Christchurch shooter Brenton Tarrant. This is a recognition that the labels of ‘extreme left’, ‘extreme right’, and ‘Islamic’ extremism are too crude to capture the complex spectrum of radicalization and social polarization along ideological and religious lines.
This shift in categorization also suggests that there are parallels between the operational dynamics of religious and ideological extremism. Some religious and ideological radicals have exhibited similar tactics in their efforts to resist suppression across social media. These tactics involve mimicry of moderate and traditional views, strategic use of memes and memetic images, decentralization, frequent community renaming, and migration to alternative media spaces. While to an outside observer these appear as tactical decisions, from a network perspective the decision to decentralize, migrate to an alternative media platform, or adopt a new mode of communication involves a change in organizational structure and operational dynamics. In other words, when a collection of influencers espousing extremist views is deplatformed from popular social media, their followers do not dissolve into the ether, but migrate to more secure platforms, decentralize, and develop modes of communication that are much harder to suppress. These operational dynamics are agnostic of religion and ideology and can be observed across the extremist spectrum. Examining the similarities between the operational dynamics of religious and ideological extremists lends context to the questionable success of deplatforming extremist content and groups from popular social media platforms.
Decentralized Networks
As Vice Admiral Arthur Cebrowski (USN) and John Garstka observed in their seminal 1998 paper Network-Centric Warfare – Its Origin and Future, the communication and control affordances of the Internet act as a force multiplier for weaker combatants opposing stronger forces. Cebrowski and Garstka acknowledged that extremist organizations would take advantage of these affordances, as indeed they have, making use of a range of social media and content sharing platforms for their propaganda purposes. Today, diverse, dispersed, and geographically isolated extremist groups can collaborate to produce, aggregate, disseminate, and access information in real-time across multiple platforms. This allows them to maintain an advantage relative to law enforcement agencies in the operational cycles of information warfare, regardless of their religious or ideological motivations.
From the perspective of extremist networks, the operational cycle of information warfare involves adapting to suppression efforts by social media platforms and law enforcement, followed by the sharing of extremist content and coordination between extremist groups. In this context, it is important to understand that deplatforming content and users perceived as extremist from popular social media such as Facebook, Twitter, and YouTube does not make them disappear into the ether. Rather, platform-centric suppression forces religious and ideological extremist groups to decentralize or migrate their content and users to lesser-known and often more secure communication platforms. Moderate content remains available on popular social media, while more extreme content moves to harder-to-monitor spaces: sites accessible only through Tor, secure and encrypted platforms such as Telegram, or federated social networks such as Mastodon. The diffusion of content and decentralization of group dynamics seem to occur organically, as is the case with Telegram, where smaller and less visible channels proliferate and act as gateways, funneling traffic to more influential channels by linking to and resharing their content.
Case studies on the effects of deplatforming and content suppression on religious and ideological extremist networks demonstrate that these tactics are not effective in curbing their influence or preventing the radicalization of new users. When social media platforms apply pressure on extremist groups, either through targeted bans or the deplatforming of entire groups and content categories, groups usually react by forming highly decentralized networks, often subdivided into multiple loosely linked accounts. In addition, what may appear as network fragmentation from the outside can also be viewed as the formation of overlapping networks along a spectrum of extremist views, with moderate accounts acting as gateways to more extreme content.
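To make the structural point concrete, the toy comparison below is a hypothetical sketch using the networkx library; the network sizes and shapes are illustrative assumptions rather than data from the case studies mentioned above. It contrasts how a centralized, influencer-driven network and a decentralized, loosely linked one hold together when their most-connected accounts are removed.

```python
# A hypothetical sketch, not drawn from the case studies above: compare how a
# centralized, influencer-driven network and a decentralized one hold up when
# their most-connected accounts are removed. Sizes and shapes are invented.
import networkx as nx

def largest_component_fraction(graph: nx.Graph) -> float:
    """Fraction of surviving accounts that can still reach one another."""
    if graph.number_of_nodes() == 0:
        return 0.0
    biggest = max(nx.connected_components(graph), key=len)
    return len(biggest) / graph.number_of_nodes()

def remove_top_accounts(graph: nx.Graph, count: int) -> nx.Graph:
    """Simulate deplatforming by deleting the most-connected accounts."""
    g = graph.copy()
    top = sorted(g.degree(), key=lambda pair: pair[1], reverse=True)[:count]
    g.remove_nodes_from(node for node, _ in top)
    return g

# Centralized: one influencer followed by 99 accounts (a star).
centralized = nx.star_graph(99)
# Decentralized: 100 accounts, each loosely linked to four others.
decentralized = nx.random_regular_graph(4, 100, seed=1)

for label, network in (("centralized", centralized), ("decentralized", decentralized)):
    survivors = remove_top_accounts(network, count=5)
    print(label, round(largest_component_fraction(survivors), 2))
# The star collapses into isolated accounts once its hub is gone; the loosely
# linked network has no dominant hub to target and stays almost fully connected.
```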
Another effect of decentralization is the adoption of autonomous or semi-autonomous operations towards a common goal, which author John Robb describes as a ‘plausible promise.’ Small, autonomous, and geographically dispersed groups may band together for ‘social media raids’ unified by the plausible promise of propaganda impact. Social media raids such as #DraftOurDaughters, #OperationStarbucks, or #OperationGoogle, coordinated on 4chan’s /pol/ board, are good examples of this tactic. These raids resemble the swarm pulse tactics described by RAND scholars John Arquilla and David Ronfeldt, in that they quickly overwhelm unsuspecting targets, confuse suppression attempts, and boost recruitment and morale. Swarm tactics are agnostic of religion and ideology and involve spontaneous coordination between mission-affiliated groups in a variety of information warfare tasks. In this context, encouraging the centralization of extremist groups might be a more effective suppression tactic, as it counters the strengths of decentralized networks.
Stigmergic Signaling
Both religious and ideological extremists mimic moderate and traditional views, values, and lifestyles, and make use of traditional quotes and images that act as semantic payloads. In other words, images are infused with specific meaning legible to a target audience but often incomprehensible to a wider public. Signaling through memes with a specific semantic payload allows decentralized extremist networks to communicate and coordinate in the open on social media. This type of signaling is an essential component of memetic warfare, which Jeff Giesea has described as ‘competition over narrative, ideas, and social control in a social-media battlefield’; in other words, a form of information operations focused specifically on social media. Memes injected with religious or ideological semantic payloads can act as pheromone trails in a process of asynchronous communication, similar to the way ants use pheromones to create chemical trails for other ants to follow. This form of delayed communication through signs is termed stigmergy, from the Greek stigma (sign) and ergon (work). In stigmergic signaling there is no direct communication and coordination between individual participants. Instead, the semantic payload injected into a meme acts as the connective tissue of the communication network. These targeted memetic payloads can be left across a number of public and anonymous platforms, in effect allowing communication and decision-making through memetics.
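The pheromone analogy can be sketched in a few lines of code. The toy model below is an illustrative assumption rather than a description of any real platform: participants never message one another directly; they only read a shared pool of symbolic markers, reinforce the ones they choose to follow, and every marker decays unless it is reinforced.

```python
# A toy model of stigmergic coordination, assuming nothing beyond the pheromone
# analogy above: participants never contact each other; they only read and
# reinforce a shared pool of symbolic markers, which decays unless refreshed.
import random

random.seed(42)

environment = {}           # marker -> accumulated 'trail' strength
DECAY = 0.9                # unreinforced trails fade each round
NEW_MARKER_RATE = 0.05     # chance a participant seeds a fresh marker
PARTICIPANTS = 50
ROUNDS = 30

def act(env):
    """One participant reads the environment and leaves a mark on it."""
    if not env or random.random() < NEW_MARKER_RATE:
        marker = f"marker-{random.randint(0, 9)}"   # hypothetical symbol
    else:
        # Follow an existing trail, weighted by how strong it already is.
        markers, strengths = zip(*env.items())
        marker = random.choices(markers, weights=strengths)[0]
    env[marker] = env.get(marker, 0.0) + 1.0        # reinforce the trail

for _ in range(ROUNDS):
    for _ in range(PARTICIPANTS):
        act(environment)
    for marker in list(environment):
        environment[marker] *= DECAY
        if environment[marker] < 0.1:
            del environment[marker]                 # the trail has gone cold

# With no direct coordination, activity still concentrates on a few markers.
print(sorted(environment.items(), key=lambda kv: -kv[1])[:3])
```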
Stigmergic communication and coordination can operate on a number of levels simultaneously, without ever presenting a static communication network that can be targeted and suppressed by social media companies or law enforcement. Environmental signals indicating a change in conditions, specific symbolic markers produced by individuals or groups, the observable success of a specific operation that can be quickly copied and replicated by other groups or individuals, or various combinations of the above can all overlap. Brenton Tarrant painted his weapon with the names of European historical figures and battles such as Charles Martel and the Battle of Tours, the Elder Futhark Odal rune, the number ’14’, shorthand for the white supremacist ‘Fourteen Words’ slogan, and the Black Sun symbol, knowing that each of these elements carries a specific memetic payload that would be circulated widely by global media. In this instance they act as symbolic markers to be associated with his operation and replicated by others. There is a clear stigmergic link between Tarrant’s symbolic markers and the subsequent attacks by John Earnest, the perpetrator of the Poway synagogue attack, and Patrick Crusius, the perpetrator of the El Paso Walmart attack, both of whom posted their own manifestos to the 8chan /pol/ board in a clear effort to replicate Tarrant’s markers.
Moreover, in the context of active network suppression by social media platforms and law enforcement, decentralized communication and coordination can be achieved through the self-synchronization of participants using stigmergic communications. Operating openly across public social media platforms allows decentralized extremist networks to massively scale their content generation, as there is no cost of entry for unaffiliated participants. In addition, operating in public generates fast feedback loops across the network, as diverse groups can observe each other’s stigmergic signals, replicate successful tactics, and share their own iterations. Blanket deplatforming of content is also ineffective in suppressing these tactics, as groups can easily iterate through memetic payloads. Instead, a more nuanced range of memetic counter-messaging tactics could be used to disrupt stigmergic signaling. This could involve diluting extremist memetic payloads by injecting alternative associative links into the symbols used as markers, or saturating a stigmergic trail with garbage information.
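As a rough illustration of the saturation idea, the sketch below reuses the same toy trail model, with invented marker names and budgets; it is an illustration of the principle, not a field-tested counter-messaging method.

```python
# A minimal sketch of the saturation idea under the same toy trail model, with
# invented marker names and budgets; an illustration, not a field-tested method.
trail = {"target-marker": 20.0}                    # an established trail
DECOYS = [f"decoy-{i}" for i in range(20)]         # counter-messaging markers

def saturate(env, budget):
    """Spread counter-messaging strength evenly across many decoy markers."""
    per_decoy = budget / len(DECOYS)
    for decoy in DECOYS:
        env[decoy] = env.get(decoy, 0.0) + per_decoy

def signal_share(env, marker):
    """How much of the total trail strength the original marker still holds."""
    return env[marker] / sum(env.values())

print("before:", round(signal_share(trail, "target-marker"), 2))   # 1.0
saturate(trail, budget=60.0)
print("after:", round(signal_share(trail, "target-marker"), 2))    # 0.25
# Once decoys dominate the environment, a newcomer sampling trails by strength
# is far more likely to land on noise than on the original marker.
```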
Anonymity
Blanket deplatforming of users, groups, and content perceived as extremist from popular social media platforms pushes them to obscure and largely unpoliced content sharing platforms that are usually unknown to the broader public and therefore more anonymous. This dynamic was first clearly observed in the migration of ISIS-related users, groups, and content to Telegram, followed by the migration of pro-Trump and right-wing extremist users, groups, and content to the same platform.
Anonymity, when coupled with the ability to freely aggregate, distribute, and access content, turns social media platforms into automated message amplifiers. This allows decentralized networks to maintain latent ties, perform stigmergic communication, and, in the case of meme warfare attacks, coordinate swarm formation. As has been observed both in ISIS information operations on Telegram and in pro-Trump meme warfare operations on 4chan, anonymity allows individual users not connected to existing extremist networks to produce, access, and share propaganda content in information warfare operations. Anonymity is also a fundamental component of memetic warfare campaigns, such as the #DraftOurDaughters campaign from the 2016 US presidential election. In this case, encouraging centralization and social capital building might be more effective counterstrategies than blanket deplatforming, as they lead to a more nuanced understanding of the participants in the network, the identification of participants acting as influencers or hubs, and more effective disruption of violent extremism.
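The kind of structural visibility this approach trades on can be illustrated with standard network-analysis measures. The sketch below uses the networkx library on an invented interaction graph; the account names and edges are assumptions made for the example. Degree and betweenness centrality surface the accounts acting as hubs and bridges, which is precisely the information a fully anonymized, dispersed network withholds.

```python
# An illustrative sketch using the networkx library on an invented interaction
# graph; account names and edges are assumptions made for the example.
import networkx as nx

# Hypothetical reshare/mention interactions observed on a single platform.
interactions = [
    ("acct_a", "hub_1"), ("acct_b", "hub_1"), ("acct_c", "hub_1"),
    ("acct_d", "hub_2"), ("acct_e", "hub_2"),
    ("hub_1", "hub_2"),                  # the bridge between two clusters
    ("acct_f", "acct_g"), ("acct_g", "hub_2"),
]
graph = nx.Graph(interactions)

# Degree centrality: who interacts with the most accounts directly.
# Betweenness centrality: who sits on the paths linking otherwise
# separate parts of the network, i.e. the gateways.
degree = nx.degree_centrality(graph)
betweenness = nx.betweenness_centrality(graph)

for name, scores in (("degree", degree), ("betweenness", betweenness)):
    top = sorted(scores.items(), key=lambda kv: -kv[1])[:3]
    print(name, [(node, round(score, 2)) for node, score in top])
# The invented hubs rise to the top of both rankings; this is the structural
# visibility that blanket deplatforming gives up when it scatters a network.
```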
Conclusion
Religious and ideological extremism on social media is characterized by common elements such as decentralized networks operating across public and secure media platforms, anonymous participants, and the use of stigmergic signaling. Outright deplatforming and suppression of religiously and ideologically motivated extremist users, groups, and content does not appear to effectively curb their influence or prevent the radicalization of new users. An alternative approach might involve more nuanced and sophisticated tactics aimed at disrupting stigmergic signaling and countering the strengths of distributed networks exhibiting swarm-like behavior. These might range from memetic counter-messaging to dilute stigmergic signals, to encouraging centralization and the identification of participants.
. . .
Dr. Teodor Mitew is a Senior Lecturer in Digital Media and Communication, and Discipline Leader of the Creative Industries at the University of Wollongong, Australia. He has previously published on meme warfare, ISIS content distribution tactics, stigmergic operations and extremist use of encrypted communication apps.
Image Credit: Pixahive (via Creative Commons)