Appendices
Appendix A: Types of grey zone influence activities
This section outlines some types and examples of influence activities at various levels, from Grand Strategy and Nation Branding (long term and foundational) to strategic campaigns and operations (which include a series of actions involving a number of actors) and tactical and micro-targeted activities (shorter-term and focussed).
It is important to warn against conceptually aligning the strategic only with the macro, the operational only with the meso, or the tactical only with the micro. Instead, the emphasis ought to be on the integration between the strategic, operational and tactical aspects of influence: how these levels are intrinsically combined in influence networks or systems.
Additionally, influence can have malign intent, benign intent, or it can have benign intent for some while being indifferent to or hostile towards others.
Examples below illustrate the variety of levels and intentions of influence. It is imperative to underline that these types are not categorically distinct or mutually exclusive. As the examples suggest, there are blurred lines where intentions are unclear or mixed. Such blurriness is unsurprising, given that ambiguity (or ‘greyness’) is a defining element of the complex and uncertain contemporary strategic environment.
In support of grand strategy
Influence activities in support of general foreign policy aims are foundational to a nation’s outlook and its interests. They involve many aspects of government, and non-government sectors like civil society and the private sector, engaging many dimensions of national power – military, diplomatic, economic, and informational – and integrating them into ‘statecraft’: the aligned pursuit of national interests.
In their more benign manifestations, these dimensions of national power may be deployed to project an overall image that is favourable, even flattering. Typically, activities aimed at achieving such positive images are referred to as Nation Branding, and the type of influence aimed for was characterised, initially in the sunnier climes of the immediate post-Cold War period, by Joseph Nye (1990) as soft power – the power to persuade through attractive and positive examples, contrasted with hard power – the power to influence through force. Soft power has been contrasted with more malign forms of influence, referred to as sharp power (Walker, 2018). Nye himself updated his view to account for post-post-Cold War realities, advocating a combination of hard and soft power into ‘smart power’ (Nye, 2009).
Regardless of the type of power involved, the aim is to identify and pursue national interests. At its most ambitious, this may be referred to as Grand Strategy. Major powers, and those who seek to be great powers, may aim to shape geopolitics in significant ways. The Cold War was characterised by such efforts; more recently, great power geopolitics is coming to resemble earlier struggles. While most nations are not major powers, all nations may seek to shape their relations with other nations, especially with major powers and with those they are most closely engaged with or impacted by. These foreign policy goals are typically long-standing, although they will change due to changes in domestic political circumstances or strategic developments. For example, with the collapse of the Soviet Union, for some time Russian foreign policy was based on accommodation and engagement with the West, especially integration with Europe. In 1996, then Foreign Minister Yevgeny Primakov announced a major shift in Russian foreign policy. The Primakov doctrine, which has shaped Russian policy ever since, emphasised the break: “Russia left the path of our Western partners … and embarked on a track of our own” (Rumer, 2019, p. 4). The general aims of this policy include the pursuit of a multipolar world, and hence the end of the United States’ unilateral power; Russian primacy in the post-Soviet space; opposition to NATO expansion; and partnership with China. This general policy aim is supported by specific aims pursued via campaigns, such as those of the Internet Research Agency (Ebbott, et al., 2021) and information operations elsewhere, as well as by ongoing actions in Ukraine encompassing hybrid warfare (kinetic warfare in all domains, including cyber, as well as information operations).
Some strategic shifts can be profound – such as Japan’s shift from militant imperialism in the name of a ‘Greater Asian Co-Prosperity Sphere’ to an international policy that eschews military force outside of national defence, enshrined in Article 9 of the Japanese Constitution at the insistence of the United States occupying authority in the aftermath of Japan’s defeat and surrender in World War 2. Others can be more subtle, such as the to-and-fro in Japanese foreign policy between Prime Minister Shinzo Abe’s ‘values-oriented diplomacy’ (introduced during his first period in office in 2006–07, resurrected on his return to the leadership in 2012, and still influential) and Prime Minister Yasuo Fukuda’s ‘synergy diplomacy’, which replaced it in the intervening years. These are not mere branding exercises; they set the priorities for international relations activities at various levels. Abe’s values-oriented diplomacy includes support for the principles of the global rules-based order, the development of the Quad as a security partnership, and the regional concept of the ‘Indo-Pacific’.
As is typical of middle powers, Australia’s foreign policy is shaped by its relationships, its circumstances and position within the international system, as well as by its capacities and decisions. Though settings and situations shift, Australian foreign policy, as Allan Gyngell (2017) notes, has since the Second World War rested on three main pillars. First, the alliance with the United States is Australia’s main security guarantee. Second, engagement via multilateral institutions is a means to affect international affairs in its own and in global interests. The third is engagement with the region, through trade (especially with East Asia: China, Japan, South Korea), development and humanitarian assistance (especially in the Pacific, but also on occasion with significance in South East Asia, such as Indonesia after the 2004 tsunami) as well as through diplomatic efforts, population movements, and cultural exchanges. These pillars can be in tension, for example when security and economic relationships are misaligned. They can also be mutually reinforcing, such as when Australia was instrumental in supporting multilateral efforts towards reconciliation and political stability in Cambodia.
In pursuit of specific foreign policy goals
Australia’s contribution to Cambodian peace, reconciliation and development via the Paris Peace Accords is an example of how general foreign policy goals (multilateralism, and regional engagement) are supported by specific foreign policy campaigns and activities (in this case, diplomatic activism, support for the Paris Peace Accords, and provision of security and constabulary assistance during the subsequent elections).
Other examples may be more self-interested yet remain benign. Nation Branding and soft power activities typically fall into this category. Public diplomacy efforts, promotion of national export industries, marketing campaigns touting a nation as a destination for tourism, education, or migration, and the hosting of international sporting, trade and diplomatic events are examples of benignly competitive efforts in support of nation branding.
Sharp power, however, involves operations in pursuit of specific, often identifiable or inferred, foreign policy goals in ways that interfere with the sovereignty, territorial integrity, domestic politics or social cohesion of another nation. Such operations may be in support of kinetic and cyber warfare, either overtly or remaining covert and deniable – at least below the threshold that is likely to provoke kinetic military responses. The most noted recent example of this type of influence campaign is the Russian attempt at foreign interference in the 2016 United States Presidential election, conducted by the Internet Research Agency, operating out of St Petersburg under the direction of the Russian Government.
Russian efforts at malign foreign interference align closely with its grand strategy mentioned above. Its efforts at interfering in democratic processes aim to undermine United States primacy through the election of candidates more favourable to Russia’s interests, and through destabilising democracy and undermining trust in political institutions. Similar activities in NATO countries additionally seek to undermine support for pro-Europe (and by proxy pro-NATO) policies by promoting nativist populism.
More directly, influence operations in Georgia, Ukraine, Belarus and the Baltic states support Russia’s strategic goal of primacy in these post-Soviet nations – although these efforts have been more successful in some places (namely, Belarus) than in others.
Some operations may target a single event, such as an election or referendum. Others are ongoing yet similarly aim to influence domestic political outcomes. Harold, Beauchamp-Mustafaga and Hornung (2021), for example, categorise three types of interference in Taiwan involving information campaigns: ongoing efforts aimed at deepening division and depressing confidence in democracy; more discrete and time-bounded efforts to interrupt a scheduled event like an election or a visit by a head of state, and opportunistic attacks aimed at amplifying an adversary’s misstep.
Whether acting with malign or benign intent, influence efforts are – in theory – aligned with a nation’s overall grand strategic goals. In practice, this is a significant challenge and requires coordination between various arms of government and integration of influence activities across all dimensions of national power.
Ideally, efforts of civil society and the private sector would also support national goals. This is less difficult for centralised governments, which exercise closer control over businesses and wider society; it is more difficult for liberal democracies.
Unofficial but aligned with foreign policy
The challenge of coordination and integration is made more problematic by unofficial and unauthorised efforts undertaken by loosely organised groups bound by shared identities, values, or objectives. These groups can be highly visible and very active, if unreliable, even problematic; their activities have become more possible and more prevalent with the popularisation of Web 2.0 (the interactive internet in which individual users can create, curate, and distribute content).
Influence operations not officially endorsed by governments may be aligned with foreign policy and therefore supported informally, such as by the relaxation of restrictions on accessing international social media sites (Harold, Beauchamp-Mustafaga & Hornung, 2021). These unauthorised efforts may include organised activities such as propaganda networks – groups that engage in deliberate efforts both to shape perceptions and actions through targeted messaging, and to co-opt their targets into actively spreading these messages via their own networks (Wanless & Berk, 2021). This increases the reach of the messages and adds a level of personal endorsement, which may result in the message being more favourably received. These types of campaigns involve groups ranging from the more institutionalised (such as political and civil organisations and religious groups) through to the less formally constituted (such as ethno-nationalist networks) and even to loosely coordinated online networked publics (sometimes called ‘affinity networks’) of individuals connected via shared interests, ideologies and outlooks. Such groups and their activities are difficult to control, even by the members themselves, and may exceed what is considered appropriate and undermine influence efforts elsewhere.
An illustrative example occurred from January 2016, when moderators of the discussion forum Baidu Diba urged over 20 million users to target the Facebook pages of the Taiwanese presidential candidate Tsai Ing-wen, the Taiwanese media outlet Apple Daily, and other supporters of Taiwanese independence. The Diba Expedition struck a chord, trending on social media platforms such as WeChat, QQ and Weibo, where it garnered more than 610 million views (Lui, 2019). This combination of fandom, nationalism and digital communication strategies and modalities (including memetic imagery, shared shibboleths and argots, irony, and performative outrage) constitutes a form of transnational discourse that at least impacts on diplomatic relations, and may be incorporated into national strategic discourses.
Associated with domestic politics
Influence efforts also include campaigns, or loosely organised networked online activities, associated with domestic politics. These are plentiful and commonplace: a routine characteristic of political communication in contemporary democracies, associated with a great range of causes and groups. In many ways, current influence efforts are manifestations of the political public sphere and of civil society’s engagement with institutional government that has evolved over centuries.
Influence efforts, especially those associated with social media, have also been associated with new forms of political activism in politically repressive societies, and responses by political elites in those societies against activist groups and (other) political opponents. A prime example is the 2011 Arab Spring. In its early phases, activist networks used social media platforms, especially Facebook, Twitter and YouTube, to share content, organise, motivate, publicise events, and shape narratives in order to place political pressure on repressive regimes. As events progressed, these same social media platforms became used by these same repressive regimes and their supporters to identify and target activists. One illustrative experience is that of key Egyptian activist (and former Google employee) Wael Ghonim, who was an early advocate for social media as a vector for popular democratic uprising, what he called ‘Revolution 2.0’ (2012), and who soon thereafter became a vocal critic of Facebook’s laissez faire response to the use of its platform as a vector for disinformation campaigns targeting pro-democracy activists.
Domestic influence campaigns can thus range from benign – the cut and thrust of robust debate that characterises democratic politics — to malign, based on deception or base manipulation, or used as tools for surveillance and repression. The distinction between these is sometimes moot, based upon political preferences and attitudes towards free speech and the responsibility for truth and trust in political campaigning. In other cases, it is less equivocal: the most notorious example to date involving social media is probably the case against Facebook (through its parent company Meta) for its role in facilitating genocide by the Myanmar regime against the Rohingya people.
Domestic influence campaigns are significant for understanding foreign influence efforts in a number of ways. Domestic campaigns targeting identity-based or issue-oriented affinity networks, such as those aligned around controversial and socially divisive concerns, highlight how such groups can be exploited by outside actors. Foreign malign interference operations can, and have, infiltrated such groups by adopting false online identities and impersonating normative group behaviours, seeking to influence group attitudes and behaviours.
Also, foreign actors may engage with the same commercial third-party actors used by domestic political actors, such as social media platforms, advertising and political marketing companies, including so-called ‘Black Op’ PR agencies specialising in fake news production (including through automated content creation tools using artificial intelligence), troll and sock puppet accounts, hashtag targeting and other tools of disinformation and political manipulation (Ong & Cabañes, 2018).
Appendix B: From propaganda to persuasion, from mass to social influence
Through centuries of transformation, in important ways much has remained unchanged regarding the role of propaganda, even as media and communications technologies have advanced. For the purposes of understanding the present set of techno-social conditions, the developments of the 20th century merit some attention. Ethno-nationalism and new political ideologies replaced or absorbed religion as the main driver of propaganda narratives, but the requirement to morally justify war remained.
The need to motivate entire populations increased as warfare extended from the battlefield to the whole of society, especially with the advent of long-range bombers targeting cities, industries, infrastructures, and civilian populations with destructive force. The means to motivate societies (and undermine opponents) also increased in range and speed, firstly through the use of aircraft to distribute printed material, and then through the inventions of radio, television, the satellite, the internet, and the mobile smartphone – making communication ubiquitous, instantaneous, and personal.
The methods of motivation also developed through the rise of the advertising and public relations industries and growing scholarly communities examining the psychological effects of mass communication on mass populations. Early media scholar Harold Lasswell (1927) for example defined his research focus as “the management of collective attitudes by the manipulation of significant symbols” (p. 627). The early assumptions were that publics could be directly influenced, en masse, through persuasive messaging, and that this could be of benefit to society as a means to unify and educate mass populations, for the social good.
Attitudes towards such mass influence efforts changed significantly through the interwar years, and propaganda became a pejorative term, for a variety of reasons. The deployment of graphic ‘atrocity stories’ by the patriotic British press against the German adversary, later found to be exaggerated, fuelled a deep distrust of propaganda among a generation scarred by the slaughter of the First World War. Adolf Hitler pointed to the success of British propaganda and, with Goebbels, based the Nazi Party’s propaganda efforts in part on it: in Mein Kampf, he wrote “Propaganda, propaganda, propaganda. All that matters is propaganda” (cited in Taylor, 2003, p. 241).
Propaganda efforts accelerated during World War Two, with many tactics and narratives emerging that may resonate today, such as: the use by Japan of historical experiences of European colonialism in Asia to promote a negative view of interventions by non-Asians; the claims by Germany to be restoring its ‘rightful’ place in the world, and that German actions were a ‘defensive war’ against aggressive attempts at encirclement; and efforts to promote discord between the Allies through narratives that depicted the British as governed by corrupt elites who would, for example, ‘fight to the last Frenchman’ (cited in Taylor, 2003, p. 45).
In the post-war period, propaganda was further discredited as it was blamed for the popular support for the German and Japanese regimes. Campaigns against propaganda were required to redeem these nations and bring them (back) into the fold. Propaganda became associated with totalitarian regimes’ activities aimed at limiting the freedoms of their citizens and contrasted with the free speech and public debate celebrated in democracies.
In the decades that followed, three further developments shaped attitudes toward propaganda.
First, the rise of mass advertising as a driver of consumer culture, especially in the United States, prompted concerns about the role of ‘hidden persuaders’ (Packard, 1957) targeting ordinary people and everyday life:
The use of mass psychoanalysis to guide campaigns of persuasion has become the basis of a multimillion-dollar industry. Professional persuaders have seized upon it in their groping for more effective ways to sell us their wares – whether products, ideas, attitudes, candidates, goals, or states of mind (p. 3).
Second, the experience of the Vietnam War resulted in two lessons: for governments, that coverage of conflict needed to be controlled lest political support diminish (which it did); for citizens, that robust investigative journalism is a requirement for holding governments to account (which it was).
These two developments broadened the concept of propaganda beyond totalitarian regimes, to include the use of media and communications in democratic, free market societies to shape coverage, promote narratives, frame debates and undermine opponents.
A third development provided an alternative, and much less bleak, view of how influence operates by challenging the basic idea that people are easily and directly influenced by propaganda, advertising and the like. Research in the fields of media and communication studies demonstrated instead that people actively engaged with the messages they received, interpreting and interrogating them in complex ways that deny easy analysis.
This ‘limited effects’ theory was summed up as “Some kinds of communication on some kinds of issues, brought to the attention of some kinds of people, under some kinds of conditions, have some kinds of effect” (Berelson, 1959, p. 1).
The concern of media effects researchers then and since, and one of the underlying premises of this report, is that it is difficult, but necessary and to some degree possible, to identify and analyse the ‘some kinds’ of communication, of issues, of people, of conditions.
More recently, contemporary media and communications technology has shifted from the centralised, mass broadcast model to the distributed, digital network formed from the combination of the internet, the World Wide Web, the platforms (such as social media sites), the devices people use, and people themselves.
This techno-social system has several characteristics that distinguish it from earlier times. One is the capability for any online user to create, curate, engage with and distribute content. The large number of active users producing and circulating content results in the added difficulty of monitoring, moderation, and regulation.
In the United States, regulation is further limited by free speech protections including the 1996 Communications Decency Act (Kosseff, 2019), although these protections are challenged elsewhere (O’Hara & Hall, 2021). Together, these characteristics result in content that is created outside of the typical norms and regulations governing the professional knowledge industries such as journalism, academia, government bodies, and other institutions such as think tanks, commercial research organisations and non-government organisations.
The consequence is that content aimed at influence can be created anywhere, by anyone. Attribution is often difficult (without some specialised training) and uncertain, and impersonation is easy.
In addition to increases in unregulated content, the new media system is characterised by increases in connectivity. Where broadcast models of communication emanate from a single point of origin (a television channel, a radio station), internet-based communication is a large, complex and dynamic network comprised of unevenly grouped clusters of connections. It is a complex adaptive system of interdependent actors and, as such, resists forms of centralised control (Bousquet, 2008).
A further important characteristic of contemporary digital networks is their capacity to target specific people and groups, made possible using digital marketing methods. Audiences – or, as they are now more typically called, ‘users’ or ‘publics’ – are identified and targeted with messaging that can be tailored to them based on their identity, their preferences, their affiliations and other attributes. Appendix D goes into further detail about how this occurs.
A consequence for influence efforts is that it has become possible to target individuals and groups based upon, and using messages that align with, pre-existing values, interests, world views and social identities. This process, known as surveillance capitalism (Zuboff, 2019) or platform capitalism (Srnicek, 2017), combines the acquisition, accumulation and analysis of a user’s personalised data with the capacity to target that user online via programmatic advertising, tailored search results, or curated content on social media feeds. This is done using a range of algorithms and databases that are confidential, highly prized assets.
The promise of such digital marketing methods is to get the right message to the right person at the right time: to make influence bespoke. As Aral (2021) notes, however, there is considerable conjecture about how successful digital marketers are at keeping such a promise.
In sum, these characteristics of contemporary information environments result in information excess and attention scarcity, such that the information economy has been renamed, by Davenport and Beck (2001), the ‘attention economy’.
The consequences include, on one hand, the capacity for people to choose which information, and which sources, to pay attention to. This is supported by, and contributes to, a greater percentage of the population now being open to change, contesting accepted beliefs and questioning the authority of experts and institutions.
On the other hand, the logic of the attention economy supports the targeting of people based upon what they are most likely to pay attention to, and ideally to engage with and act upon.
Appendix C: Online social network influence operations: typical campaign elements
Although variations are evident, historical data is limited, and campaign tactics are evolving and dynamic, it is possible to suggest a broadly applicable set of elements that combine to form the basis of online and social media influence campaigns. These are: research, profiling, predicting, targeting (people, message, timing), persona creation, connection, communication, influencing, and reviewing.
What follows here is an ideal type of such a campaign, comprised of these constituent elements and presented as a staged order of activities. Actual case studies of campaigns may vary: some elements may be minimised, de-prioritised, or unobservable because the data is not available; the order of activities may be scrambled; events may be repeated in an iterative process, and so on. In short, campaigns can be considerably more disordered than this idealised outline.
The first stage involves research: the gathering of data on target individuals and groups for the purpose of understanding them psychologically (their identities, values, vulnerabilities, desires) and socially (their connections, affinities, affiliations). This is analogous to Target Audience Analysis.
Research at scale has a long history in the fields of marketing and advertising, typically undertaken through surveys, interviews, focus groups and in situ observations. Referred to as ‘advertising engineering’, this included recommendations that families should be studied weekly for years (Root & Welch, 1948). In 1980s–90s hacking cultures, research included ‘trashing’ – literally going through the garbage outside the offices of, for example, phone companies to find discarded information (old manuals, carbon papers with credit card details, scrap paper with logins or passwords) that could be put to use to penetrate the system (Gehl & Lawson, 2022).
Contemporary, digitalised forms of research include versions of these earlier practices. Hackers search emails and databases. Online surveys, forms, and applications such as quizzes and games are all used by digital marketing agencies to gather data. Digital media corporations store the digital traces of activities undertaken online or using digital devices including web searches, commercial transactions, locations and movements, photographs, social media engagements, fitness data like heart rates and oxygen levels, and so on.
Subsequent stages – profiling, prediction and targeting – follow research. Data, accumulated at scale into identifiable databases, are used to profile individuals and create what Shoshana Zuboff (2019) refers to as ‘prediction products’ – calculations used to target advertising at the people most likely to be persuaded, using the messaging most likely to be persuasive, at the optimum time and via the optimum communications channel. This is the promise and premise of the digital advertising industry – referred to as surveillance capitalism (Crain, 2021; Doctorow, 2020; Zuboff, 2019) and platform capitalism (Srnicek, 2017).
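The logic of a ‘prediction product’ can be sketched in a few lines of code. The example below is illustrative only: the feature names, weights, threshold and the simple logistic model are all invented for the sketch, and real targeting systems use proprietary models over far richer data.

```python
from math import exp

# Toy sketch of a "prediction product": scoring profiled users by
# estimated receptivity to a message. The features, weights and the
# logistic model are hypothetical; real systems are proprietary.

WEIGHTS = {"topic_affinity": 2.0, "past_engagement": 1.5, "network_overlap": 0.8}
BIAS = -2.5

def receptivity(profile):
    """Logistic score in (0, 1): higher means more likely to engage."""
    z = BIAS + sum(WEIGHTS[k] * profile.get(k, 0.0) for k in WEIGHTS)
    return 1 / (1 + exp(-z))

users = {
    "u1": {"topic_affinity": 0.9, "past_engagement": 0.8, "network_overlap": 0.5},
    "u2": {"topic_affinity": 0.1, "past_engagement": 0.2, "network_overlap": 0.1},
}

# Target only users whose predicted receptivity clears a threshold.
targets = [u for u, p in users.items() if receptivity(p) > 0.5]
```

The point of the sketch is the premise, not the mechanics: profiling converts accumulated data into a per-user prediction, and targeting simply selects whoever scores highest.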
The same underlying principle informs the targeting of individuals and groups and the shaping of messages used in online influence campaigns. Additionally, profiling can be used to develop personas and create online profiles and social media pages and groups for the purposes of group identification with the targets.
The advantages of the creation of personas that appeal to group-based identification are twofold. First, appearing to share the group identities, values and motivations of the intended targets increases the likelihood that the messengers will be trusted as reliable and relatable sources of information and opinion, and that messages will be given attention and positive consideration. This basic premise also holds true when the messenger is presented as having expert opinion based on specialised knowledge, as long as this expertise appears to have the interests of the targeted audience at heart, and as long as it aligns with the targets’ aforementioned identities, values and motivations. This is a long-held assumption of influence campaigns, including those that construct deceptive personas, groups, and organisations: an aspect of the public relations industry that has been decried as unethically deceitful and manipulative in seminal texts such as The Hidden Persuaders (Packard, 1957), Toxic Sludge is Good for You (Stauber & Rampton, 1995) and Merchants of Doubt (Oreskes & Conway, 2010).
The second advantage of group-based identification is more closely aligned to the structure and function of online social networks, based on the notion that individuals often connect with like-minded others in ‘social selection networks’ based on ‘homophily’ – shared interests, values and worldviews (Aral, 2020; Prell, 2012) – known also as ‘affinity networks’ (O’Connor and Weatherall, 2019; Gehl and Lawson, 2022). Thus, by appearing to share identities, values and motivations, it becomes possible to create or join online forums, groups and the like where target communities gather and socialise.
In short, performing an online persona generates opportunities for connection via online networks to online social groups, and this connection creates the opportunity for successful communication with these groups, based on shared group interests, values, and worldviews, resulting in the desired influence effects.
Homophilous groups are particularly important for understanding how online influence networks operate. Homophily, as Prell (2012) outlines, occurs in two ways. One is dependent on the organisational settings and focus: actors will be drawn to such organisations and seek membership of them based on those settings and focus. Typical examples include sporting clubs, formal religious organisations like churches, mosques and temples, political organisations and organised social activism like non-government organisations. The other type of homophily arises out of similarity without formal organisation, based on shared backgrounds such as age, education, ethnicity, family ties, cultural identities and the like.
Social media networks provide opportunities for both kinds of homophily to occur: organisations can create groups which members can join; recommendation algorithms suggest ‘friends’ based on shared interests and – principally – shared connections. These friend-recommendation algorithms are a form of ‘induced homophily’, based on triadic closure – the connection of two people who each have a strong relationship with a mutual third person (Aral, 2020; Asikainen et al., 2020; Kossinets & Watts, 2009). They are one of the principal means by which social media platforms seek to create connections between users and encourage ongoing engagement on the platform.
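The triadic-closure logic behind friend recommendation can be sketched as a simple friend-of-friend ranking. This is an illustrative toy (the network and names are invented), not any platform’s actual recommender, which weighs many additional signals:

```python
from collections import Counter

def recommend_friends(graph, user, top_n=3):
    """Rank users not yet connected to `user` by their number of mutual
    friends -- the basic triadic-closure heuristic."""
    candidates = Counter()
    for friend in graph.get(user, set()):
        for fof in graph.get(friend, set()):
            if fof != user and fof not in graph[user]:
                candidates[fof] += 1  # one more shared mutual friend
    return [u for u, _ in candidates.most_common(top_n)]

# Hypothetical network: each key maps to that user's set of friends.
network = {
    "ana": {"ben", "cat"},
    "ben": {"ana", "cat", "dan"},
    "cat": {"ana", "ben", "dan", "eve"},
    "eve": {"cat"},
    "dan": {"ben", "cat"},
}
print(recommend_friends(network, "ana"))  # → ['dan', 'eve']
```

Here ‘dan’ outranks ‘eve’ because he shares two mutual friends with ‘ana’ rather than one – the intuition being that the more closed triads a new tie would complete, the more likely it is to be accepted.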
There is considerable evidence suggesting that these algorithms are successful in creating meaningful connections: since about 2013, for example, romantic relationships formed from connections recommended by algorithms have surpassed those arising out of introductions by friends and family (Rosenfeld et al., 2019). Moreover, homophily in online networks appears to be quite strong. When based on ideology, ethnicity, opinions, gender, age, behaviours and country of origin, it appears at times to be stronger than in offline networks (Aral, 2020; Wimmer & Lewis, 2010).
These selection networks – self-organised groups sharing strong ideational bonds based on expressed beliefs and observed behaviours – and the way they can become influence networks, shaping individual opinions as members conform to group attitudinal and behavioural norms, are crucial for understanding online influence campaigns.
O’Connor and Weatherall (2019), for example, outline how this penchant for agreement within selection networks, including specialist expert groups such as those comprised of members of a scientific community, can lead to conformity of beliefs within that group and polarisation between groups with opposing views. This aligns with social psychological concepts outlined in the micro section of this report.
Along these lines, Jeffrey’s rule (O’Connor & Weatherall, 2019) suggests that the relationship between individuals affects the credence one gives to information, even within scientific communities where evidence is ostensibly considered rationally, or at least free from biases based on interpersonal relationships.
Jeffrey’s rule posits that beliefs depend on an individual’s degree of (un)certainty and are thus subject to motivated reasoning, especially confirmation bias, with the result that information from those with whom one has strong ties, such as fellow members of a selection network, is more influential than information from elsewhere. At scale and over time, this can lead to strongly held beliefs becoming more entrenched within these groups. Extreme versions of this process develop into manias – what Bernstein (2021) refers to as the delusions of crowds.
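Jeffrey conditioning can be sketched numerically. The idea is that when the evidence itself is uncertain – for instance, a claim relayed by a messenger one may or may not trust – the updated belief is a trust-weighted mixture of the two conditional credences. The numbers below are assumed purely for illustration:

```python
def jeffrey_update(p_a_given_e, p_a_given_not_e, new_p_e):
    """Jeffrey conditioning: update belief in proposition A when evidence E
    is itself held with uncertainty. `new_p_e` encodes how much credence the
    receiver gives the report of E -- higher for a trusted in-group tie,
    lower for a stranger."""
    return p_a_given_e * new_p_e + p_a_given_not_e * (1 - new_p_e)

# Same report, different trust in the messenger (illustrative values):
p_a_if_e, p_a_if_not_e = 0.9, 0.2
print(jeffrey_update(p_a_if_e, p_a_if_not_e, 0.95))  # trusted strong tie
print(jeffrey_update(p_a_if_e, p_a_if_not_e, 0.40))  # distrusted outsider
```

The identical message moves the trusted-tie receiver to a credence of 0.865 but the distrusting receiver only to 0.48 – a toy illustration of why in-network sources are disproportionately persuasive.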
The alternative – wherein temporarily reduced connectivity and communication, and greater diversity, within groups can improve reasoning and produce more scientifically accurate results – is known as the ‘Zollman effect’ (O’Connor & Weatherall, 2019, p. 61). This aligns with the principles of the ‘wisdom of crowds’ (Surowiecki, 2004), which specifies the conditions required for such wisdom: “independent individual analysis, diversity of individual experience and expertise, and an effective method for individuals to aggregate their opinions” (Bernstein, 2021).
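The aggregation condition can be illustrated with a toy simulation: when individual estimates are independent and their errors idiosyncratic, a simple average typically lands closer to the truth than most individuals do. The noise parameters and group size here are assumed for illustration only:

```python
import random
import statistics

random.seed(7)  # fixed seed so the illustration is reproducible
true_value = 100.0

# Independent, diverse guesses: each individual is noisy in their own way.
guesses = [true_value + random.gauss(0, 25) for _ in range(500)]

crowd_estimate = statistics.mean(guesses)  # the aggregation method
crowd_error = abs(crowd_estimate - true_value)
individual_errors = [abs(g - true_value) for g in guesses]
beaten = sum(e > crowd_error for e in individual_errors)

print(f"crowd error: {crowd_error:.2f}; "
      f"beats {beaten} of {len(guesses)} individuals")
```

If the independence assumption is broken – as in a tightly connected selection network, where errors become correlated – the averaging advantage collapses, which is the intuition behind the Zollman effect.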
The notion that trust, based on shared social identity, has a pivotal role in shaping influence is prevalent in strategic communications theory and public relations practice. The use of ‘third-party advocates’, in which an advocate with pre-existing standing and a favourable reputation in a community is deployed to endorse a group’s position or product, is a common tactic in commercial marketing and political campaigns (such as candidate endorsements). Similarly, trust based on shared membership of and participation in affinity networks online can be the basis for effective persuasion.
As O’Connor and Weatherall neatly summarise: “one way to influence the opinions of members of a group is to find someone who already agrees with them on other topics and have that person share evidence that supports your preferred position” (p. 138, original emphasis).
Two examples – the first unsuccessful, the second apparently more successful – serve to illustrate the point.
In the first, pro-democracy protests in Hong Kong were subject to an online campaign which utilised a pre-existing marketing spam network comprised of Twitter accounts that had previously tweeted content in a variety of languages and on a range of topics, “from British Football to Indonesian tech support, Korean Boy Bands and pornography” (Uren, Thomas & Wallis, 2019, p.6). Although this ensured that the content reached a large potential audience, there was no evidence that this audience was interested in (or in many cases, one suspects, able to understand) the content, as the content did not come from a recognisable, let alone trusted, source. In other words, the “accounts did not attempt to behave in ways that would have integrated them into – and positioned them to influence – online communities” (p.4).
The second, much more widely documented, example is the Russian Internet Research Agency (IRA), especially its efforts to affect the 2016 Presidential election campaign in the United States (see, inter alia, Dawson & Innes, 2019; Gehl & Lawson, 2022; Howard et al, 2018; Jamieson, 2018; Jankowicz, 2020). One of the defining characteristics of the IRA’s campaign was the use of multiple false online identities through which they were able to infiltrate, or to instigate and develop, social media groups based on existent communities of interest. Using personas, IRA agents developed relationships with American citizens based on apparent, but confected, homophily. Once these relationships were developed, a process taking months or years, based on shared views about, for example, gun control, race relations, immigration, LGBTI rights, or even mundane localised community matters, IRA agents would ‘narrative shift’, “from banal to pro-Russian views but also switched abruptly between different political positions according to current Russian operational priorities, or even just to create confusion” (Dawson & Innes, 2019, 250).
Appendix D: Discussion of select key terms
In this section, select key terms relating to social influence are discussed regarding their common usages, which vary. Ambiguous, imprecise and inconsistent definitions are a common feature of many of these key terms as they appear in research and in defence documents. This is a problem without a clear or easy solution.
Rather than propose yet another set of definitions, this report offers some contextual commentary on these terms to illuminate some of their nuances and variations.
Influence, Interference, and Propaganda
Variations of the use of the term influence occur based on (1) which elements the term is used to describe and (2) the malign or benign intentions of the influence actors.
Regarding the first, definitions of influence can refer to three main elements. They can denote characteristics or capabilities (one has influence, or a ‘sphere of influence’); actions or efforts (influence operations, campaigns and similar attempts to influence); or effects (the resulting change in attitudes or behaviour). All three elements are discussed in the report, as they are all relevant and are not mutually exclusive.
While it is possible to use influence to mean any or all of these three elements, doing so can lead to imprecision, and therefore confusion, in practice.
The second variation in use regards influence as either malign, benign, or neutral. These are mutually exclusive characterisations: influence cannot be both benign and malign. Where it is associated with manipulation, misinformation or coercion, influence becomes a pejorative term describing undesirable, even hostile, actions.
In the literature, it is common to see influence operations used to describe activities (more than actors or effects) that have malign intent or which target opponents.
The RAND Corporation for example defines influence operations as “the collection of tactical information about an adversary as well as the dissemination of propaganda in pursuit of a competitive advantage over an opponent” (RAND, n.d.).
NATO’s Strategic Communications Centre of Excellence has offered a definition of an Information Influence Operation as “the organized attempt to achieve a specific effect among a target audience, often using illegitimate and manipulative behaviour … one or more actors have planned and conducted an operation that serves the interest of, for example, a hostile state” (Pamment & Smith, 2022, p.7).
The Carnegie Endowment for International Peace similarly refers to influence operations as “organized attempts to achieve a specific effect among a target audience”, but includes a wider range of actors and less hostility: “a variety of actors—ranging from advertisers to activists to opportunists—employ a diverse set of tactics, techniques, and procedures to affect the decision making, beliefs, and opinions of a target audience” (Thomas, Thompson & Wanless, 2020, p.1).
Some definitions of ‘legitimate’ influence refer to it in contrast with interference. For example, in his introduction to the National Security Legislation Amendment (Espionage and Foreign Interference) Bill 2017, then Australian Prime Minister Malcolm Turnbull outlined the legislation’s focus on “activities that are in any way covert, coercive or corrupt. That is the line that separates legitimate influence from unacceptable interference.” (Turnbull, 2017, para 9).
Nevertheless, there are indications that the term is becoming more pejorative in its connotations. A RAND report, for example, notes that many “view influence pejoratively, equating it with manipulation, disinformation, or propaganda” (Paul et al., 2023, p. 1). These negative connotations are likely reasons for other terms, such as strategic communications (see below) or engagement, to sometimes be preferred, although these terms are, similar to influence, problematically imprecise.
Grey zone and hybrid warfare
The grey zone is typically understood to refer to activities operating “beyond those associated with routine statecraft and below means associated with direct military conflict between rivals” (Hicks and Friend, 2019, p.4). There are ‘shades’ of grey zone operations – lighter shades blend with forms of coercive diplomacy (see below) and darker shades blur to combine with elements of hybrid warfare.
Hybrid warfare, in earlier definitions, referred to “a range of different modes of warfare, including conventional capabilities, irregular tactics and formations, terrorist acts including violence and coercion, and criminal disorder” (Hoffman, 2007, p.14). Its meaning has expanded to include a much broader view: “the blending of conventional and non-conventional methods to achieve political-military objectives by both state and non-state actors” (Aoi, Futamura & Patalano, 2019, p. 701).
Complexities around the meaning/s of grey zone and hybrid warfare have arisen due to use in different contexts, such as adoption of the terms from Russia-Ukraine conflicts for use in describing activities designed to affirm sovereignty in contested areas of the Indo-Pacific. Terms such as ‘non-War military operations’ and ‘quasi-warfare operations’ as used in Chinese military planning documents, contain elements of both grey zone and hybrid warfare (Insisa, 2023).
It is no surprise that the terms grey zone and hybrid warfare are characterised by ambiguity and amalgamations – such is their nature.
In this report, for the sake of clarity, grey zone refers to that which remains below the threshold for conflict, including coercive statecraft to shape strategic environments and deter hostile actions, while hybrid warfare refers to the combination of military with other means of conflict, including cyber and, especially, influence operations. As the report focusses mostly on situations short of conflict, most of it is concerned with the grey zone and strategic competition.
Strategic communication/s
Strategic communication has various meanings in defence and security discourses, and a related but broader meaning in the communication industries and related scholarly fields.
For the latter, strategic communication is related to the fields of public relations and advertising. Its emphasis is usually on being institutionalised, organised, and targeted.
Institutionalisation refers to strategic communication being typically undertaken by a large organisation such as a company, a government department, or a civil society organisation such as a non-government organisation, political party or similar. Being organised refers to the process of strategic communication, which is structured and planned along prescribed lines. Being targeted refers to part of this planning, which identifies and seeks to understand the significant individuals and groups upon which campaign success depends.
A typical outline of strategic communication planning, by Botan (2021), suggests it includes:
“two minimum characteristics. First, research is conducted about the environment and the situation in which a campaign is to be carried out. This research has to assess, again at a minimum, the current opinions of the significant publics including an assessment of how the purpose, or goals, of a proposed campaign comport with the reality on the ground. Second, a plan is developed encompassing available resources, timing, sequencing of steps, and assignments that takes into account both the goals of the organization and the feelings, needs, and attitudes of the publics. This plan is the actual strategy.” (p.7)
(Some further notes on how social media influence campaigns incorporate research into campaign planning are included in Appendix D.)
In military parlance, the term strategic communication is similarly used to denote communication activities that are coordinated and planned. Some of the variation in terminology arises from the question of what is, or is not, strategic communication.
In some definitions, strategic communication supports other activities, including defence operations but also public affairs and diplomacy. Strategic communication in this sense has a supporting role.
Recent definitions of strategic communication from the United Kingdom and NATO (which uses the plural term, strategic communications) characterise it differently. Rather than being in support of other activities, the UK and NATO define strategic communication as “using all means of communication – comprising actions, images and words – to appropriately inform and influence an audience’s attitudes and behaviours through a narrative-led approach in pursuit of the desired end state” (NATO, 2023, p.3).
The crucial distinction is that the former sees strategic communication as supporting other defence activities, whereas the latter sees all defence activities as being means of strategic communication. Additionally, the latter, more extensive, conceptualisation of strategic communication includes those activities that may not be undertaken as communicative acts in the first instance, that have other reasons for occurring but nevertheless have an impact:
“Everything NATO and its partners say and do, or omit to say and do, has intended and unintended consequences. Every action, word and image sends a message, and every member of the military is a messenger, from the individual soldier in the field to the theatre commander” (NATO, 2023, p.20).