The Coronavirus pandemic is becoming a watershed moment for how states tussle on the international stage. Spies, special operations and high-stakes negotiations are no longer the tools of choice. Instead, it has become clear that Disinformation, an information manipulation technique born in conflict zones, is becoming increasingly normalised as a political weapon – particularly in times of crisis.
The World Health Organization (WHO) declared on 2 February 2020 that the Coronavirus outbreak (as it was described at the time) was accompanied by an ‘infodemic’.[1] WHO’s characterisation covered an array of false information. At this point, in late March, almost everyone has seen posts and received WhatsApp forwards suggesting dubious remedies and advice. Disinformation, however, is a slightly different but highly toxic subset of the infodemic, and China has become the most recent convert to its benefits.
The UK’s Digital, Culture, Media and Sport Committee[2] refers to Disinformation as the ‘deliberate creation and sharing of false and/or manipulated information that is intended to deceive and mislead audiences, either for purposes of causing harm, or for political, personal or financial gain.’[3]
Much of the information about homebrew treatments, and even misplaced blame, can be categorised as ‘misinformation’ because, although it may be false, it is believed to be true by the people making and sharing it. This simple distinction gives rise to much more fundamental differences in the way the two forms of false information are created and spread. Since Disinformation is produced purposefully, for real-world advantage, it has evolved a ‘playbook’: certain techniques and (mostly digital) assets that are commonly used to make sure the information payload hits the right targets.
The digital Disinformation playbook started taking shape during the Crimea conflict in 2014. A number of Russia specialists have noted that the Kremlin used largely conventional means during the 2008 conflict with Georgia. Alongside the military effort, spokesmen and media specialists were dispatched to engage the international media.[4] The Kremlin even contracted the services of Washington-based public relations and lobbying firms, much as Western allies tend to do.[5] The result, however, was disappointing from the Kremlin’s perspective. Joshua Foust, a former US intelligence analyst and national security fellow at the Foreign Policy Research Institute, pointed out in an article on the influential US national security blog War on the Rocks that, 18 months after the Georgia conflict, Russia published an updated military doctrine laying out the policy framework for its use of information as a tool of international competition.[6]
In the document, the Russian Ministry of Defence outlined its view of effective modern conflict management as including: ‘[T]he prior implementation of measures of information warfare in order to achieve political objectives without the utilization of military force and, subsequently, in the interest of shaping a favourable response from the world community to the utilization of military force.’ It went on to add that, to be more effective, the Russian military needed ‘to develop forces and resources for information warfare.’[7]
It was during the conflict with Ukraine over Crimea that one of the key characteristics of the new approach became clear: the combined use of traditional media, official spokesmen and online trolling to seed counter-narratives (conspiracies) and cast doubt on well-established facts. Much has been written about how Russia used Disinformation as part of an ongoing tussle with the West. But it is as a crisis management tool that Disinformation is taking shape as an effective and widely accepted practice.
Russia, of course, used Disinformation as a crisis management tool when its allies were accused of shooting down Malaysia Airlines flight MH17. But the technique was already being trialled abroad a year earlier, when very similar methods were used in Syria to deflect blame from Russia’s allies in Damascus after they were first accused of using chemical weapons. By 2017, as digital technology evolved and the techniques were refined through trial and error, it became clear that social media would form the infrastructure for crisis communications by way of Disinformation.
As an adviser to Syrian opposition groups, I watched the playbook unfold when Damascus was again accused of using chemical weapons in 2017. By this point the process had become almost clockwork: the core line of effort involved a ‘seed’ article published by a regime-controlled English-language outlet. This article, which claimed rebels had used the chemical weapons on themselves as a ‘false flag’ attack, was then linked to and quoted by friendly Western conspiracy news outlets such as InfoWars. Anonymous internet accounts amplified the narrative by tweeting links to the various outlets publishing the claims. Finally, the narrative reached right-wing mainstream US outlets like Fox News and became a legitimate – if not prevalent – point of view. All the while, Russian officials, including ministers and ambassadors, would fall in behind the central effort, pointing out that ‘legitimate questions’ were being asked about the Western official version of events.
Networks of interlinked Twitter and Facebook accounts pumping out video, photos and graphics of false and misleading information form the digital backbone of an effective Disinformation campaign. As with many military innovations, although a state first developed this technique, it was inevitable that it would become an offering on the open market. A number of investigations in the public domain show that, with enough money, it is not difficult to buy commercial support capable of building that infrastructure. In August 2019, Facebook announced it had removed 102 pages engaged in what the company calls ‘coordinated inauthentic behaviour’ (an activity it does not allow on its platform).[8] Two firms, one in Egypt and the other in the UAE, were found to be running the accounts in order to support the Libyan warlord Khalifa Haftar and other Middle Eastern factions aligned with the foreign policy objectives of Egypt, the UAE, Bahrain and Saudi Arabia.
Although it has yet to be investigated sufficiently, it is highly probable that similar capabilities were deployed during the UK’s 2019 general election, when the Conservative Party faced a moment of crisis after a photo of a sick young boy forced to lie on the floor of a hospital in Leeds went viral. Questions were asked, but we do not yet know where the post containing the counter-narrative – that the boy’s mother had staged the photo – originated. Nor is it clear who was behind the bot accounts on Twitter and Facebook that promoted the false narrative to prominent journalists.[9]
It is, therefore, hardly surprising that China has resorted to similar techniques to counter the Trump administration’s efforts to suggest that the Chinese government bears responsibility for the Coronavirus pandemic and its damage to the global economy.
Observers have noted that China’s approach to information operations traditionally did not include coordinated activity across multiple actors to promote alternative narratives (i.e. conspiracy theories) – the essence of the Russian Disinformation playbook. This has now changed. “What we see here is a story of action and distraction with a convergence in tactics between Russia and China that we haven’t previously seen,” said Edward Lucas, Senior Vice President at the Center for European Policy Analysis (CEPA), during an online panel discussion hosted by the think tank.[10] Whereas in the past, Lucas explained, Chinese external messaging hinged on a narrative that portrayed China as a reliable partner in a joint mission to benefit mankind, its messaging since the pandemic began was veering closer to Russia’s more conspiratorial and combative approach.
Alexandre Alaphilippe, Executive Director at EU DisinfoLab, made the point during the panel discussion that a new feature of the Disinformation playbook now being practised by China is the use of mass direct messaging, via apps such as WhatsApp and Telegram, against Western audiences – a tactic previously confined to Asia and South America. The nature of these apps makes it easier for China to include content that ‘impersonates authority’, he added – in other words, messages mocked up to look as though they come from local governments or other leadership figures.
There is little doubt that the Coronavirus pandemic represents both a threat and an opportunity for China’s international standing, and therefore its national interests. The fact that the state chose to abandon its traditional methods and adopt the Disinformation approach to crisis communications demonstrates that this approach is here to stay. If anything, it is likely to be judged effective, then adopted and adapted further. Democracies and rules-based systems have to come to grips with this now not-so-new phenomenon, develop short-term countermeasures, and re-examine their communications and influence practices from the ground up.
[1] WHO, Novel Coronavirus (2019-nCoV) Situation Report – 13, February 2020, https://www.who.int/docs/default-source/coronaviruse/situation-reports/20200202-sitrep-13-ncov-v3.pdf
[2] Digital, Culture, Media and Sport Committee, Disinformation and ‘fake news’: Final Report, House of Commons, February 2019, https://publications.parliament.uk/pa/cm201719/cmselect/cmcumeds/1791/1791.pdf
[3] Ibid. 2, p.10
[4] Clifford J. Levy, Russia Prevailed on the Ground, but Not in the Media, The New York Times, August 2008, https://www.nytimes.com/2008/08/22/world/europe/22moscow.html
[5] David Teather, PR groups cash in on Russian conflict, The Guardian, August 2009, https://www.theguardian.com/media/2009/aug/24/public-relations-russia-georgia-ketchum
[6] Joshua Foust, Can Fancy Bear be stopped? The clear and present danger of Russian info ops, War on the Rocks, September 2016, https://warontherocks.com/2016/09/can-fancy-bear-be-stopped-the-clear-and-present-danger-of-russian-info-ops/
[7] Carnegie Endowment, Text of newly-approved Russian military doctrine (report by Russian presidential website, 5 February 2010), http://carnegieendowment.org/files/2010russia_military_doctrine.pdf
[8] Nathaniel Gleicher, Removing Coordinated Inauthentic Behavior in UAE, Egypt and Saudi Arabia, Facebook, August 2019, https://about.fb.com/news/2019/08/cib-uae-egypt-saudi-arabia/
[9] Factchecking service Full Fact tracked the information manipulation efforts around the incident: https://fullfact.org/electionlive/2019/dec/10/LGI-photo-boy-facts/
[10] CEPA, Infektion Points: Russian and Chinese Disinformation on the Pandemic, YouTube, March 2020, https://youtu.be/sPQs2Eh28oI