
Amil Khan

Research Fellow

Amil Khan is the founder of Valent Projects, a digital communications agency for social impact. Until recently, Amil was a UK government senior strategic communications expert with a special focus on international conflict. Amil advised several UK government departments as well as senior decision makers from governments across the Middle East and Africa. His work has ranged from countering disinformation to supporting complex socio-economic policy shifts. A former Chatham House associate fellow, Amil came to government after an award-winning career in journalism, working for the BBC and Reuters as a foreign correspondent and documentary filmmaker focusing on violent insurgencies in Iraq and Sudan. Since Valent's founding in late 2019, Amil has designed complex online research projects and developed and implemented data-based digital strategy for election candidates.

Coronavirus response shows Disinformation is the new normal

The Coronavirus pandemic is becoming a watershed moment for how states tussle on the international stage. Spies, special operations and high-stakes negotiations are no longer the tools of choice. Instead, it has become clear that disinformation, an information manipulation technique born in conflict zones, is becoming increasingly normalised as a political weapon, particularly in times of crisis.

The World Health Organization (WHO) declared on 2 February 2020 that the Coronavirus outbreak (as it was described at the time) was accompanied by an ‘infodemic’.[1] The WHO’s characterisation covered an array of false information. At this point, in late March, almost everyone has seen posts and received WhatsApp forwards suggesting dubious remedies and advice. Disinformation, however, is a slightly different but highly toxic subset of the infodemic, and China has become the most recent convert to its benefits.

The UK’s Digital, Culture, Media and Sport Committee[2] refers to disinformation as the ‘deliberate creation and sharing of false and/or manipulated information that is intended to deceive and mislead audiences, either for purposes of causing harm, or for political, personal or financial gain.’[3] Much of the information about homebrew treatments and even misplaced blame can be categorised as ‘misinformation’ because, although it may be false, it is believed to be true by the people making and sharing it. This simple distinction gives rise to much more fundamental differences in the way the two forms of false information are created and spread. Since disinformation is produced purposefully, for real-world advantage, it has evolved a ‘playbook’: certain techniques and (mostly digital) assets which are commonly used to make sure the information payload hits the right targets.
The digital disinformation playbook started taking shape during the Crimea conflict in 2014. A number of Russia specialists have noted that the Kremlin used largely conventional means during the 2008 conflict with Georgia. Alongside the military effort, spokesmen and media specialists were dispatched to engage the international media.[4] The Kremlin even contracted the services of Washington-based public relations and lobbying firms, much as Western allies tend to do.[5] The result, however, from the Kremlin’s perspective was disappointing. Joshua Foust, a former US intelligence analyst and national security fellow at the Foreign Policy Research Institute, pointed out in an article on the influential US national security blog War on the Rocks that 18 months after the Georgia conflict, Russia published an updated military doctrine that laid out the policy framework for its use of information as a tool of international competition.[6]

In the document, the Russian ministry of defence outlined its view of effective modern conflict management as including: ‘[T]he prior implementation of measures of information warfare in order to achieve political objectives without the utilization of military force and, subsequently, in the interest of shaping a favourable response from the world community to the utilization of military force.’ It went on to add that to be more effective the Russian military needed ‘to develop forces and resources for information warfare.’[7]

It was during the conflict with Ukraine over Crimea that it became clear that one of the key characteristics of the new approach involved the use of traditional media, an official spokesman and online trolling to seed counter-narratives (conspiracies) and cast doubt over well-established facts. Much has been written about how Russia used disinformation as part of an ongoing tussle with the West. But it is as a crisis management tool that disinformation is taking shape as an effective and widely accepted practice.
Russia, of course, used disinformation as a crisis management tool when its allies were accused of shooting down Malaysia Airlines flight MH17. But the technique had already been trialled abroad a year previously, when very similar methods were used in Syria to deflect blame from Russia’s allies in Damascus after they were first accused of using chemical weapons. By 2017, as digital technology evolved and the techniques were refined through trial and error, it became clear social media would form the infrastructure for crisis communications by way of disinformation.

As an adviser to Syrian opposition groups, I watched the playbook unfold when Damascus was again accused of using chemical weapons in 2017. The process by this point had become almost clockwork. The core line of effort involved a ‘seed’ article published by a regime-controlled English-language outlet. This article, which claimed rebels had used the chemical weapons on themselves as a ‘false flag’ attack, was then linked to and quoted by friendly Western conspiracy news outlets such as InfoWars. Anonymous internet accounts were used to amplify the narrative by tweeting links to the various outlets publishing the claims. Finally, the narrative reached right-wing mainstream US outlets like Fox News and became a legitimate, if not prevalent, point of view. All the while, Russian official figures, including ministers and ambassadors, would fall in behind the central effort, pointing out that ‘legitimate questions’ were being asked about the Western official version of events.

Networks of interlinked Twitter and Facebook accounts pumping out video, photos and graphics of false and misleading information form the digital backbone of an effective disinformation campaign. As with many military innovations, although a state first developed this technique, it was inevitable it would become an offering on the open market.
A number of investigations in the public domain show that, with enough money, it is not difficult to buy commercial support capable of building that infrastructure. In August last year, Facebook announced it had removed 102 pages engaged in what the company calls ‘coordinated inauthentic behaviour’ (an activity it does not allow on its platform).[8] Two firms, one in Egypt and the other in the UAE, were found to be running the accounts in order to support Libyan warlord Khalifah Haftar and other factions in the Middle East supportive of the foreign policy objectives of their allies: Egypt, the UAE, Bahrain and Saudi Arabia.

Although it has yet to be investigated sufficiently, it is highly probable that similar capabilities were deployed in the UK’s 2019 general election, when the Conservative Party faced a moment of crisis following public outcry after the photo of a sick young boy forced to lie on the floor of a hospital in Leeds went viral. Questions were asked, but we do not yet know where a post containing the counter-narrative that the boy’s mother faked the photo originated. It is also not clear who was behind the bot accounts on Twitter and Facebook that promoted the false narrative to prominent journalists.[9]

It is, therefore, hardly surprising that China has resorted to similar techniques to counter the Trump administration’s efforts to suggest the Chinese government bears responsibility for the Coronavirus pandemic and its damage to the global economy. Observers have noted that, traditionally, China’s approach to information operations did not include coordinated activity across multiple actors to promote alternative narratives (i.e. conspiracy theories), the essence of the Russian disinformation playbook. This has now changed.
“What we see here is a story of action and distraction with a convergence in tactics between Russia and China that we haven’t previously seen,” said Edward Lucas, senior vice president at the Center for European Policy Analysis (CEPA), during an online panel discussion hosted by the think tank.[10] Whereas in the past, Lucas went on to explain, Chinese external messaging hinged on a narrative that portrayed China as a reliable partner in a joint mission to benefit mankind, the post-pandemic messaging was veering closer to Russia’s more conspiratorial and combative approach.

Alexandre Alaphilippe, Executive Director at EU DisinfoLab, made the point during the panel discussion that a new feature of the disinformation playbook now being practised by China is the use of mass direct messaging via apps such as WhatsApp and Telegram to target Western audiences, a technique previously confined to Asia and South America. The nature of these apps makes it easier for China to include content that ‘impersonates authority’, he added; in other words, messages that are mocked up to look like they are from local governments or other leadership figures.

There is little doubt that the Coronavirus pandemic represents both a threat and an opportunity to China’s international standing, and therefore its national interests. The fact that the state chose to abandon its traditional approaches and adopt the disinformation crisis communications playbook demonstrates that this approach is here to stay. If anything, it is likely to be seen as effective, and adopted and adapted further. Democracies and rules-based systems have to come to grips with this now not-so-new phenomenon, develop short-term countermeasures, and re-examine their communications and influence practices from the ground up.
[1] WHO, Novel Coronavirus (2019-nCoV) Situation Report – 13, February 2020.

[2] Digital, Culture, Media and Sport Committee, Disinformation and ‘fake news’: Final Report, House of Commons, February 2019.

[3] Ibid., p. 10.

[4] Clifford J. Levy, Russia Prevailed on the Ground, but Not in the Media, The New York Times, August 2008.

[5] David Teather, PR groups cash in on Russian conflict, The Guardian, August 2009.

[6] Joshua Foust, Can fancy bear be stopped? The clear and present danger of Russian info ops, War on the Rocks, September 2016.

[7] Carnegie Endowment, Text of newly-approved Russian military doctrine, report by Russian presidential website, 5 February 2010.

[8] Nathaniel Gleicher, Removing Coordinated Inauthentic Behavior in UAE, Egypt and Saudi Arabia, Facebook, August 2019.

[9] Fact-checking service Full Fact tracked the information manipulation efforts around the incident.

[10] CEPA, Infektion Points: Russian and Chinese Disinformation on the Pandemic, YouTube, March 2020.

International Affairs in the Disinformation Age

In April 2017, I was sitting in an office with Syrian rights activists watching, in real time, as the Syrian regime and Russia manipulated a second US president.

The current discussion about disinformation is focused very firmly on Trump’s election and the Brexit referendum. As critical as both issues are to the future of international stability, the emotion vested in both topics obscures rather than illuminates the impact disinformation has on the way influence is being wielded on the global stage.

Somewhere between the United States (US)-led invasion of Iraq and the Arab uprisings against dictatorial rule, something fundamental in the way global audiences receive and react to information changed. Political insurgents had long been watching, waiting for a chance to challenge the established power of their enemies. Ayman al-Zawahiri, Al Qaeda’s number two at the time, famously declared in the early years of the 21st century that “more than half of this battle is taking place in the battlefield of the media”. [1] As a Middle East reporter, I saw the effort the group put into trying to circumvent traditional media and reach its intended audiences directly. 

Russia had also been watching. From 2013 to 2017, I worked with Syrian activists hoping to remove the corrupt and abusive regime in Damascus. My time working on Syria was bookended by highly sophisticated disinformation campaigns that melded diplomatic, media and military activity in order to achieve real-world outcomes. Examining the techniques used by Russia (and adopted by other actors since) raises questions about the measures policy makers are hoping will return us to a simpler era.

The two most sophisticated Russian disinformation operations I observed came in response to the Damascus regime’s use of chemical weapons against civilians. Russia’s effort in both cases is not surprising when you consider that the Kremlin’s investment in the Assad regime was most seriously threatened – not by the military or political efforts of its adversaries – but by international blowback to its ally’s use of poison gas.

Between 2013 and 2017, Russia had hugely developed its techniques in line with changes in the global media environment. Earlier efforts focused on manipulating established media outlets, or at least leaning on their credibility, while the later campaign largely bypassed established media in favour of fringe outlets and social media networks. In both cases, however, the underlying strategic approach displayed a keen understanding that Western decision makers will ultimately be constrained by popular opinion; while the implementation plan recognised that Western media systems can be gamed or bypassed.

On the morning of the 21st August 2013, I was told by distraught Syrian colleagues that the regime had used poison gas on sleeping civilians in an area besieged by government forces and the Lebanese militia, Hezbollah. The Syrian opposition, advised by individuals such as myself with backgrounds in journalism or political lobbying, called on locals to send through videos showing proof of the regime’s actions, which were forwarded on to journalists covering the story. The Kremlin was quick to establish an alternative depiction of events. Russian Foreign Minister Sergei Lavrov told journalists the rebels had used the weapons on their own families in the hope of provoking a Western military response. [2] Russia’s effort seemed doomed to failure. Responding to the weight of facts emerging from the ground, most media outlets concluded the regime was guilty. A Western military response seemed inevitable.

When I first noticed photos and comments purporting to be from ordinary Americans calling on their government not to come to the “aid of Al Qaeda in Syria”, I wasn’t unduly concerned. [3] However, the posts seemed to become more numerous as the United Kingdom’s (UK’s) House of Commons geared up for a vote on military action against the regime. On the morning of the vote, journalists, political analysts and even political figures started mentioning they had heard that respected US news agency Associated Press (AP) was reporting that the rebels were actually responsible. In several public statements, Lavrov coolly questioned the Western focus on military action when the “evidence is not something revolutionary. It’s available on the internet”. [4]

It was days after the government of David Cameron lost the Commons vote and the Obama administration climbed down from the threat of military action that we were able to understand what had happened. AP had not found that the rebels were responsible for killing their own sleeping children. Instead, a little-known outlet called Mint Press, operating from the US with assistance from a supporter of the regime, had printed an article from a stringer who had travelled to Syria and repeated claims he had heard whilst there. As an Arabic speaker, he asked a friend – who also sometimes freelanced for the AP – to help with copy editing. Mint Press very prominently played up the tenuous connection, and other outlets, online trolls and regime supporters obscured the issue further until social media Chinese whispers transformed a rumour into a popularly believed ‘fact’.

Of course, it’s not possible to determine what part any specific manipulative technique played in the final outcome of the situation. What is clear, however, is that the Kremlin had realised winning the battle for popular perceptions meant winning wars. It was also clear that Russia had been able to game Western news media through a very sophisticated understanding of how stringers work, how to pitch to editors and the way information moves between outlets – morphing slightly as it does so according to the editorial policy of individual websites, newspapers and channels. For context, it is important to understand that Russia had developed and successfully exploited an intimate understanding of the weaknesses of free media – of which it has little domestic experience itself.

The second time Russia needed to bail out the Syrian regime from the possibility of direct and catastrophic Western military intervention, the White House had a new occupant and the news environment looked radically different. Russia’s experience seeding narratives and moving them between different information eco-systems was put to good use.

On the morning of the 4th April 2017, Syrian social media networks exploded with harrowing photos and videos of men, women and children choking to death as they convulsed on the ground. Western journalists raced to uncover what had happened, much as they had four years previously. Similarly, Syrian activists raced to find them footage and eyewitnesses in the hope they would be able to prove beyond doubt what was happening on the ground. Activists expected to have to counter spurious claims on news media. However, the Russian playbook had evolved. It now didn’t need traditional media.

On the afternoon of the day of the attack, Al Masdar News (AMN), an English-language outlet run by the Syrian regime, published an article that presented several arguments suggesting that claims the regime carried out the attack were false. The arguments used techniques such as ‘foreknowledging’ to misrepresent time stamps on social media posts as ‘proof’ that timelines did not add up, as well as claiming that various online comments by a collection of unrelated actors ‘proved’ the mainstream narrative was false. Ordinarily, such an article would be lost amongst the cacophony of online argument. However, the initial article was not meant to achieve the desired impact itself. It was merely a seed.

The next day, the AMN article was reprinted, quoted heavily or copied without attribution across a number of conspiracy-orientated ‘news’ sites such as Global Research, 21st Century Wire and Russophile’s Blog. Amongst the sites that used information from the AMN article without attribution was the infamous InfoWars. By the time the alternative narrative had reached the Alex Jones-fronted outlet, it had developed into an anti-Soros conspiracy.

InfoWars is itself highly influential in the US right-wing news eco-system. Data analysis based on link sharing in 2016 shows it is more often quoted in US right-wing networks than the Washington Post and NBC News are in centrist or left-wing circles. However, InfoWars was not the ultimate aim. Rather, the outlet was a gateway into far more pervasive and influential networks – the right-wing US social media sphere. InfoWars and Alex Jones himself (or someone running his account) tweeted the article. [5] The article was then picked up and amplified by bots – automated Twitter accounts – using a shared hashtag. Many retweeted the article hundreds of times in a two-day period. The final step of the strategy involved high-profile right-wing figures – the American equivalents of the UK’s Katie Hopkins – retweeting the article, but also, crucially, repeating the arguments and talking points during their many appearances on traditional media platforms. Ultimately, the idea of a huge conspiracy involving Al Qaeda, George Soros and Western governments acting in unison became a plausible counter-argument and was treated with equal weight to facts being checked by international organisations and trustworthy news organisations.

The impact of the campaign can be gauged by the fact that Donald Trump’s instinctive reaction to punish the Syrian regime through airstrikes led to members of his own base protesting outside the White House and threatening to withdraw their support on right-wing media platforms. [6]

The difference in tactics used in 2013 and 2017 shows that the ongoing rise of socially distributed and consumed news, and the fall in influence of traditional news outlets, has made it easier to influence events by manipulating the information key audiences consume. In 2013, it was still felt necessary to lean on AP’s credibility, to engage a real-life journalist and persuade him of the merits of publishing an implausible story. In 2017, bots and pliable news outlets were enough. There was no need to risk engaging established organisations or to work around their editorial policies. There was also little need to deploy expensive Facebook advertising of the sort examined in recent inquiries in the UK and US.

Like other sorts of arms races, these techniques are being closely watched by other actors. Recent tension between Gulf countries and Iran has resulted in an explosion of fake accounts and coordinated hashtag promotion in Arabic. [7] Facebook also recently closed down ‘inauthentic’ accounts it said were being used by India and Pakistan in their own political tussles. [8]

Disinformation works because it fits within the grain of a new reality. Two key underlying factors in its success are the increased relative importance of mass groups of individuals, and the need to understand the world through the lens of those individuals. Recognising both these factors is vital for any organisation wishing to have any sort of influence. So far, we have seen malign actors recognise these factors and put them to use. The question for those working to lift living standards, increase the remit of international human rights, and achieve other positive outcomes is how to ethically and morally adapt their methods to the same new reality.

Amil Khan, an associate fellow at Chatham House, is a former Reuters journalist and government adviser. He now runs Valent Projects, an agency focused on the challenge posed by the new information environment.

Photo by Randall Munroe, published under Creative Commons with no changes made.

[1] David Ensor, Al Qaeda letter called ‘chilling’ – Al-Zawahiri to al-Zarqawi: Prepare for U.S. to leave Iraq soon, CNN World, October 2005.

[2] William Echols, Lavrov Sounds False Alarm Over ‘Staged’ Syria Strikes, Polygraph, September 2018.

[3] Paul Szoldra, Some US Troops Appear To Be Posting Photos In Protest of Syrian Intervention, Business Insider, September 2013.

[4] Ibid.

[5] Alex Jones’ Twitter account has been suspended, so his archive of tweets is no longer live on the platform. However, references to his tweets from the time are recorded on other sites. Joshua Gillin, Conspiracy claims that Syrian gas attack was ‘false flag’ are unproven, Politifact, April 2017.

[6] Maxwell Tani, Some of Trump’s more hardline online supporters are slamming him over striking Syria, Business Insider, April 2017.

[7] Marc Jones and Alexei Abrahams, A plague of Twitter bots is roiling the Middle East, The Washington Post, June 2018.

[8] Aditya Kalra and Saad Sayeed, Facebook deletes accounts linked to India’s Congress party, Pakistan military, Reuters, April 2019.

