Influence Operations Overview
Influence operations, particularly in the digital age, involve strategic efforts to shape perceptions, beliefs, and behaviors through various means, including social media and online platforms. These operations aim to manipulate information to achieve specific goals, such as influencing public opinion, disrupting societal harmony, or advancing political agendas.[1]
Key elements of influence operations include content creation, dissemination, and amplification. Content-based features play a crucial role in predicting and identifying these operations[2].
Such operations often involve the spread of disinformation and propaganda to deceive or mislead target audiences. Cyber influence operations, in particular, leverage digital platforms, online tools, and targeting technologies to reach specific audiences and achieve desired outcomes[3][4][5].
Effective influence operations rest on a framework that integrates information warfare, psychological operations, and strategic communication. Such a framework is also central to military organizations seeking to counter adversarial influence efforts[4].
History of Influence Operations
Influence operations have long been used by national governments and other entities to advance security, economic, or political interests, with examples ranging from Genghis Khan's disinformation campaigns in the 13th century to propaganda efforts during World Wars I and II, the Cold War, and more recent conflicts in Iraq, Afghanistan, and Syria[32].
General Tactics of Influence Operations
Influence operations involve a coordinated, integrated, and synchronized application of various capabilities to shape the attitudes, behaviors, or decisions of foreign target audiences in alignment with specific interests and objectives. These operations encompass diplomatic, informational, military, economic, and other means to influence without relying excessively on force. They aim to foster favorable conditions for advancing national interests and policies through strategic communications, public diplomacy, clandestine actions, economic development, and military capabilities. The tactics involve reinforcing communications with real-world capabilities and can target specific leaders, decision-making groups, military organizations, population subgroups, or mass publics[36][37]. Influence operations can involve techniques such as psychological operations (PSYOPS), military deception (MILDEC), public affairs strategies, and military-civilian relations to affect adversaries' will, behavior, and morale[32].
Do Influence Operations Use Trolls?
Yes, influence operations routinely use trolls as part of their strategies. Trolls are deployed in influence campaigns to spread disinformation, manipulate public opinion, and create chaos on social media platforms. They can be automated accounts (bots) or human-operated accounts that engage in coordinated activity toward specific goals[50]. These operations combine tactics such as bots, trolls, spamming, disinformation, and fake reviews to sway public sentiment and behavior[50]. Trolls play a significant role by amplifying chosen narratives, spreading misinformation, and shaping online discourse toward desired outcomes.
What Tactics Do Trolls Use in Influence Operations?
Trolls use tactics like amplification through social media, hashtag hijacking, emotional manipulation, astroturfing, targeting influential individuals, creating memes, and exploiting existing divisions to achieve their goals[61].
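One concrete signature of the amplification tactic above is "copypasta": many distinct accounts posting near-identical text within a short window. The sketch below is an illustrative heuristic only, not any platform's actual detection logic; the accounts, posts, and thresholds are all invented for demonstration.

```python
from collections import defaultdict
from datetime import datetime, timedelta

# Hypothetical post records: (account, text, timestamp).
posts = [
    ("acct_a", "Join the movement! #CityBudget", datetime(2024, 1, 5, 12, 0)),
    ("acct_b", "Join the movement! #CityBudget", datetime(2024, 1, 5, 12, 2)),
    ("acct_c", "Join the movement! #CityBudget", datetime(2024, 1, 5, 12, 3)),
    ("acct_d", "Unrelated post about lunch", datetime(2024, 1, 5, 12, 4)),
]

def find_copypasta(posts, window=timedelta(minutes=10), min_accounts=3):
    """Flag identical texts posted by many distinct accounts within a short
    window -- one crude indicator of coordinated amplification."""
    by_text = defaultdict(list)
    for account, text, ts in posts:
        by_text[text.strip().lower()].append((account, ts))
    flagged = []
    for text, hits in by_text.items():
        hits.sort(key=lambda h: h[1])
        accounts = {a for a, _ in hits}
        span = hits[-1][1] - hits[0][1]
        if len(accounts) >= min_accounts and span <= window:
            flagged.append((text, sorted(accounts)))
    return flagged

print(find_copypasta(posts))
# Flags the three accounts posting the identical #CityBudget message.
```

Real coordination is of course messier (paraphrased text, staggered timing), so production systems use fuzzier similarity measures than exact string matching.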
What is Astroturfing?
Astroturfing is the practice of creating a false impression of a grassroots movement by concealing the sponsors of a message or organization to make it appear as though it originates from and is supported by genuine grassroots participants. This deceptive strategy is commonly employed by corporations, political entities, and other organizations to influence public opinion without revealing their true motives or affiliations[62][63][64][65].
How do Trolls Target Influential Individuals?
Trolls working in influence operations may target influential individuals by launching a campaign of repeated insults, creating memes, and exploiting existing societal divisions[67][69]. These tactics are commonly employed to provoke strong reactions, manipulate social media algorithms, and spread disinformation. By targeting influential figures, trolls aim to amplify their messages, gain attention, and sow discord within online communities.
Emotive Tagging: A Technique of Influence Operations
Influence operations use emotive tagging to suppress information by attaching emotional triggers to content. Emotive tagging associates specific emotions with a piece of content to influence how it is perceived and shared; by strategically deploying emotive language or imagery, operators can shape public opinion, discredit opposing viewpoints, and limit the spread of certain information.
The aim is to evoke strong emotional responses in the audience, such as fear, anger, or excitement, that steer its reactions and behaviors. This tactic can be particularly effective in swaying opinions, polarizing audiences, and diverting attention from critical information: by attaching powerful emotions to specific content, manipulators distort reality, create confusion, and hinder the dissemination of accurate information.
Emotive tagging exploits human psychology and cognitive biases to control narratives and suppress dissent. It shapes public discourse, skews decision-making, and ultimately undermines the integrity of information spaces and the free flow of information essential to a healthy democratic society.
By understanding how emotive tagging is employed in influence operations, individuals can become more discerning consumers of information, better equipped to identify and counteract attempts to manipulate emotions for deceptive purposes.
Evidence of Effectiveness on Social Media
The evidence from the sources indicates that influence operations on social media can have significant effects on individuals and societies. These operations can lead to shifts in political beliefs, increased xenophobic or discriminatory sentiments, and heightened skepticism around vaccines and medical information. Social media activities by prominent actors like political parties can even impact racially motivated violence in specific areas[11]. Research highlights that long-term exposure through traditional mass media and short-term exposure via social media can influence people’s beliefs and behaviors[13]. Furthermore, content-based features derived from social media activity can effectively predict influence operations, distinguishing them from organic content[13]. Influence operations often involve spreading misinformation or purposefully skewing perceptions to manipulate how people view the world, with the aim of swaying public opinion or behavior[14].
In summary, sources provide substantial evidence of the impact of influence operations on social media, showcasing how these operations can shape beliefs, behaviors, and even political outcomes.
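As a rough illustration of what "content-based features" can mean in practice, the sketch below computes a few simple per-account signals (duplicate-message ratio, link density, single-domain concentration) of the general kind such research draws on. The specific features, sample data, and any resulting thresholds are invented for demonstration and do not reproduce the cited models.

```python
import re

def content_features(texts):
    """Compute a few per-account content features from a list of posts."""
    # Extract the domain of each URL (text padded with "/" so the
    # non-greedy match always terminates).
    urls = [re.findall(r"https?://(\S+?)/", t + "/") for t in texts]
    domains = [d for found in urls for d in found]
    dup_ratio = 1 - len(set(texts)) / len(texts)          # repeated messages
    link_ratio = sum(bool(u) for u in urls) / len(texts)  # posts containing links
    top_domain_share = (max(domains.count(d) for d in set(domains)) / len(domains)
                        if domains else 0.0)              # single-source reliance
    return {"dup_ratio": dup_ratio, "link_ratio": link_ratio,
            "top_domain_share": top_domain_share}

organic = ["saw a great film", "coffee time", "my cat again", "rainy day"]
coordinated = ["Read this: https://example-news.site/a"] * 3 + \
              ["Read this: https://example-news.site/b"]

print(content_features(organic))      # low duplication, no links
print(content_features(coordinated))  # high duplication, one dominant domain
```

In the research this summarizes, features like these feed a supervised classifier trained on posts from known takedowns; the toy version above only shows why such features separate coordinated activity from organic chatter.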
Specific Examples of Influence Operations
1. Facebook: Meta took down a sprawling network of fake accounts linked to Chinese law enforcement, the largest Chinese influence operation the company has ever removed[40].
2. Twitter: Twitter took down nearly 300,000 terrorist accounts and has been a platform used for various influence operations by nation-states[41].
3. LinkedIn Fake Accounts: Researchers identified 1,003 fake LinkedIn accounts with AI-generated profile pictures, shedding light on the deceptive use of AI-generated images in the economic sphere[23][25].
4. Doppelgänger: Doppelgänger is a Russia-linked influence campaign that spreads disinformation and propaganda through a vast network of social media accounts and fake websites. In the U.S., the operation promoted hostile articles criticizing the LGBTQ+ movement and raised doubts about military competence, particularly ahead of the 2024 U.S. election[38].
5. Twitter and the Pentagon: Reports revealed that Twitter assisted the Pentagon in a covert online propaganda campaign, allowing U.S. military accounts to run foreign influence operations[26]. Despite Twitter's public stance against state-backed disinformation, it whitelisted accounts at the government's request, including some affiliated with U.S. Central Command (CENTCOM). These accounts promoted narratives supporting U.S. military activities in the Middle East, such as criticizing Iran, backing the war in Yemen, and touting the accuracy of U.S. drone strikes. The Pentagon concealed its ownership of the accounts, sometimes using fake profiles to appear civilian-operated. Twitter executives were aware of this covert activity but did not shut the accounts down, in contrast to their actions against foreign state-backed propaganda efforts[27][28][29][30].
6. TikTok concerns: The U.S. intelligence community has described TikTok as a potential threat, but there is no concrete public evidence of coordination between TikTok and the Chinese government on influence operations[39]. Critics nonetheless regard ByteDance as carrying elements of a Chinese influence operation, citing its close ties to the Chinese government, its control over TikTok's operations, its access to user data, and its potential to influence or surveil users in alignment with Chinese interests.
7. Instagram: A pro-U.S. influence operation promoting U.S. foreign policy interests abroad was removed from Meta's platforms and Twitter after running for almost five years[42]. The operation was uncovered by researchers at the Stanford Internet Observatory and Graphika, marking the first time an influence campaign supporting U.S. interests was identified and taken down from social media platforms. The campaign involved accounts that posed as news outlets or fictitious personas, posting content in multiple languages including Russian, Arabic, and Urdu[43][44][45][46].
Detecting Influence Operations
1. Misinformation or False Information: If you come across information that seems too good to be true, overly sensational, or contradicts established facts, it could be a sign of an influence operation[56][59].
2. Unusual Calls to Action: Be cautious of messages urging you to take extreme actions, spread information widely without verification, or incite hatred or violence[59].
3. Lack of Transparency: Influence operations often involve hidden agendas or undisclosed sources behind the information being presented[59].
4. Quality of Content: Assess the accuracy, completeness, and coherence of the information being shared. Misleading or false content is a red flag[59].
5. Targeted Messaging: If you notice that certain information is being pushed towards specific groups or communities with the intent to sway opinions, it could be part of an influence operation[59].
6. Disguised Origins: Pay attention to whether the source of the information is clear and transparent. Operations that hide their origins are often suspect[59].
7. Influence on Decision-Making: If you feel pressured or manipulated into making decisions based on certain information, it might be a tactic used in influence operations[59].
Being aware of these signs and critically evaluating the information you encounter can help you identify and protect yourself from potential influence operations aimed at manipulating your beliefs or actions.
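For illustration only, the seven signs above can be folded into a simple triage rubric. The sign names and equal weighting below are invented for demonstration, not a validated detection method.

```python
# The seven warning signs from the checklist above, as boolean observations.
SIGNS = [
    "contradicts_established_facts",   # 1. misinformation or false information
    "unusual_call_to_action",          # 2. urging extreme or viral action
    "undisclosed_sponsor",             # 3. lack of transparency
    "misleading_content",              # 4. poor quality or misleading content
    "narrowly_targeted_messaging",     # 5. targeted messaging
    "disguised_origin",                # 6. disguised origins
    "pressure_on_decisions",           # 7. pressure on decision-making
]

def triage_score(observations: dict) -> float:
    """Fraction of the seven warning signs present (0.0 to 1.0)."""
    return sum(bool(observations.get(s)) for s in SIGNS) / len(SIGNS)

example = {"undisclosed_sponsor": True, "disguised_origin": True,
           "misleading_content": True}
print(round(triage_score(example), 2))  # 3 of the 7 signs present
```

A higher score does not prove an influence operation; it only suggests the content deserves closer scrutiny before sharing or acting on it.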
Countermeasures to Influence Operations
1. Disruption: Disruption impedes harmful behavior by raising the cost or effort required of bad actors, thereby reducing the frequency, intensity, or scale of their activities[17]. It can be a cost-efficient intervention when the volume of harmful activity outstrips the resources available to counter it[17].
2. Displacement: Displacement aims to redirect bad actors’ activities to less harmful areas, effectively moving their efforts away from critical targets or objectives[17]. By displacing influence operations, the impact on vulnerable populations can be minimized.
3. Deterrence: Deterrence strategies dissuade bad actors from engaging in harmful activity through means such as deplatforming, takedowns, sanctions, indictments, or public attribution[19]. By raising the cost of mounting influence operations, such measures can reduce their effectiveness[19].
These countermeasures are essential components in combating influence operations and safeguarding against the dissemination of false information and propaganda.
Conclusion
In conclusion, influence operations are sophisticated strategies that exploit digital mediums to manipulate perceptions and behaviors. By understanding the mechanisms behind these operations and fostering collaboration across sectors, it becomes possible to develop effective countermeasures to safeguard against their harmful effects.
Citations:
[1] https://carnegieendowment.org/2020/06/25/collaborative-models-for-understanding-influence-operations-lessons-from-defense-research-pub-82150
[2] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7439640/
[3] https://css.ethz.ch/content/dam/ethz/special-interest/gess/cis/center-for-securities-studies/pdfs/Cyber-Reports-2019-10-CyberInfluence.pdf
[4] https://www.rand.org/content/dam/rand/pubs/monographs/2009/RAND_MG654.pdf
[5] https://ndupress.ndu.edu/Media/News/News-Article-View/Article/2404329/7-social-media-and-influence-operations-technologies-implications-for-great-pow/
[6] https://lageneralista.com/whats-working-and-what-isnt-in-researching-influence-operations/
[7] https://carnegieendowment.org/2021/06/28/measuring-effects-of-influence-operations-key-findings-and-gaps-from-empirical-research-pub-84824
[8] https://www.intelligence.senate.gov/sites/default/files/hearings/CHRG-115shrg30959.pdf
[9] https://www.europarl.europa.eu/RegData/etudes/STUD/2021/653635/EXPO_STU%282021%29653635_EN.pdf
[10] https://www.16af.af.mil/Newsroom/Article/2389118/the-evolution-of-authoritarian-digital-influence-grappling-with-the-new-normal/
[11] https://carnegieendowment.org/2021/06/28/measuring-effects-of-influence-operations-key-findings-and-gaps-from-empirical-research-pub-84824
[12] https://www.jstor.org/stable/27033648
[13] https://esoc.princeton.edu/publications/content-based-features-predict-social-media-influence-operations
[14] https://www.thebureauinvestigates.com/stories/2023-07-27/what-are-influence-operations-and-why-are-we-investigating-them/
[15] https://www.brookings.edu/articles/the-breakout-scale-measuring-the-impact-of-influence-operations/
[16] https://www.jstor.org/stable/26894685
[17] https://carnegieendowment.org/2020/10/28/using-criminology-to-counter-influence-operations-disrupt-displace-and-deter-pub-83058
[18] https://foreignpolicy.com/2019/08/12/8-ways-to-stay-ahead-of-influence-operations/
[19] https://carnegieendowment.org/2021/09/21/measuring-efficacy-of-influence-operations-countermeasures-key-findings-and-gaps-from-empirical-research-pub-85389
[20] https://misinforeview.hks.harvard.edu/article/review-of-social-science-research-on-the-impact-of-countermeasures-against-influence-operations/
[21] https://misinforeview.hks.harvard.edu/article/research-note-this-salesperson-does-not-exist-how-tactics-from-political-influence-operations-on-social-media-are-deployed-for-commercial-lead-generation/
[22] https://www.linkedin.com/pulse/china-russia-involved-largest-known-covert-influence-ops-mihir-bagwe
[23] https://www.npr.org/2022/03/27/1088140809/fake-linkedin-profiles
[24] https://www.linkedin.com/posts/josephmenn_a-russian-agency-has-improved-its-online-activity-7053423848603455488-CwrD?trk=public_profile_like_view
[25] https://www.linkedin.com/blog/member/product/an-update-on-how-were-fighting-fake-accounts
[26] https://carnegieendowment.org/2021/06/28/measuring-effects-of-influence-operations-key-findings-and-gaps-from-empirical-research-pub-84824
[27] https://theintercept.com/2022/12/20/twitter-dod-us-military-accounts/
[28] https://www.rollingstone.com/politics/politics-news/twitter-helped-pentagon-foreign-propaganda-campaign-1234650938/
[29] https://www.washingtonpost.com/national-security/2022/09/19/pentagon-psychological-operations-facebook-twitter/
[30] https://www.newsnationnow.com/business/tech/twitter-aided-pentagon-influence-operations-report/
[31] https://www.aljazeera.com/economy/2022/12/21/twitter-secretly-boosted-us-military-propaganda-investigation
[32] https://css.ethz.ch/content/dam/ethz/special-interest/gess/cis/center-for-securities-studies/pdfs/Cyber-Reports-2019-10-CyberInfluence.pdf
[33] https://ndupress.ndu.edu/Portals/68/Documents/Books/CTBSP-Exports/Cyberpower/Cyberpower-I-Chap-15.pdf?ver=2017-06-16-115054-210
[34] https://carnegieendowment.org/2022/02/09/global-perspectives-on-influence-operations-investigations-shared-challenges-unequal-resources-pub-86396
[35] https://www.linkedin.com/pulse/influence-operations-in-depth-study-psychological-niels-groeneveld
[36] https://www.rand.org/content/dam/rand/pubs/monographs/2009/RAND_MG654.pdf
[37] https://apps.dtic.mil/sti/citations/ADA503375
[38] https://therecord.media/doppelganger-influence-operation-new-activity
[39] https://theintercept.com/2024/03/16/tiktok-china-security-threat/
[40] https://time.com/6310040/chinese-influence-operation-meta/
[41] https://ndupress.ndu.edu/Media/News/News-Article-View/Article/2404329/7-social-media-and-influence-operations-technologies-implications-for-great-pow/
[42] A pro-U.S. influence operation promoting U.S. foreign policy interests abroad was removed from these platforms after running for almost five years
[43] https://www.nytimes.com/2022/08/24/technology/facebook-twitter-influence-campaign.html
[44] https://www.voanews.com/a/for-first-time-facebook-twitter-take-down-pro-us-influence-operation-/6717461.html
[45] https://www.aljazeera.com/economy/2022/8/25/facebook-twitter-disrupt-pro-us-influence-operation-report
[46] https://www.euronews.com/my-europe/2022/09/01/first-major-covert-pro-us-propaganda-campaign-taken-down-by-social-media-giants
[47] https://carnegieendowment.org/2021/06/28/measuring-effects-of-influence-operations-key-findings-and-gaps-from-empirical-research-pub-84824
[48] https://www.ncbi.nlm.nih.gov/pmc/articles/PMC7439640/
[49] https://cyber.fsi.stanford.edu/io/publication/house-vs-outsourced-trolls-how-digital-mercenaries-shape-state-influence-strategies
[50] https://bipartisanpolicy.org/blog/coordinated-influence-operations/
[51] https://www.trails.umd.edu/news/are-influence-campaigns-trolling-your-social-media-feeds
[52] https://www.nature.com/articles/s41598-023-49676-z
[53] https://www.ll.mit.edu/r-d/projects/reconnaissance-influence-operations
[54] https://arxiv.org/abs/2305.16544
[55] https://www.lawfaremedia.org/article/finding-language-models-in-influence-operations
[56] https://www.thebureauinvestigates.com/stories/2023-07-27/what-are-influence-operations-and-why-are-we-investigating-them/
[57] https://quizlet.com/761489126/influence-awareness-test-right-or-wrong-doesnt-matter-flash-cards/
[58] https://www.indeed.com/career-advice/interviewing/tell-me-about-when-you-influenced-someone
[59] https://carnegieendowment.org/2023/08/07/what-makes-influence-operation-malign-pub-90323
[60] https://foreignpolicy.com/2019/08/12/8-ways-to-stay-ahead-of-influence-operations/
[61] https://gijn.org/resource/investigating-digital-threats-trolling-campaigns/
[62] https://en.wikipedia.org/wiki/Astroturfing
[63] https://www.merriam-webster.com/dictionary/astroturfing
[64] https://www.bigcommerce.com/glossary/astroturfing/
[65] https://dictionary.cambridge.org/us/dictionary/english/astroturfing
[66] https://www.fastcompany.com/90540452/we-analyzed-1-8-million-images-on-twitter-to-learn-how-russian-trolls-operate
[67] https://gijn.org/resource/investigating-digital-threats-trolling-campaigns/
[68] https://theconversation.com/political-trolls-adapt-create-material-to-deceive-and-confuse-the-public-135177
[69] https://www.aspistrategist.org.au/how-memes-are-becoming-the-new-frontier-of-information-warfare/
[70] https://www.newyorker.com/news/our-columnists/why-the-russian-influence-campaign-remains-so-hard-to-understand