2024 Elections: Déjà Vu With a Twist
Synthesizing Information Operations Targeting 2024 Elections - Actor/Behavior/Content Trends. Beyond Narratives: Threat Actors' Vulnerability.
Hey there. Two months into the 2024 Election cycle, let’s take a moment to pause and reflect on what we have observed so far regarding Election-related Information Operations.
This time, I've chosen not to use the now notorious FIMI acronym. It seems to me that focusing solely on Foreign Information Manipulation and Interference activities limits our understanding of the motives, infrastructure, and strategies behind online threat actors. There have also been more cases of domestic interference than foreign interference lately, displaying several interesting tactics and narrative patterns that could be employed by foreign threat actors in the future.
What to expect:
2024 Elections: Where Do We Stand?
2024 Elections - Actors
2024 Elections - Behaviors
2024 Elections - Content
Your Press Corner
Narrative Vulnerabilities: Responding With Indirect Targeting
2024 Elections: Where Do We Stand?
Since the beginning of 2024, more than a dozen elections have already taken place worldwide. Elections occurred in Taiwan, Finland, Tuvalu, Indonesia, El Salvador, Costa Rica, Azerbaijan, Pakistan, Iran, Belarus, Cambodia, Belize, Jamaica and Portugal. Additionally, ongoing primaries are being held in the U.S. One election was postponed in Senegal, triggering civil unrest and demands for new elections before the end of the President’s term.
Cases of FIMI were reported in:
Taiwan: Chinese cognitive warfare was widely documented, and detailed here and in this newsletter. It includes polarizing local audiences over existing issues, manipulating surveys, and attacking candidates. All these tactics are supported by an increased use of video formats and a growing hyper-personalization of content to target specific Taiwanese audiences. They were accompanied by other hybrid threats, including economic, diplomatic and military pressure.
Tuvalu: Chinese tactics to co-opt a local newspaper were disclosed.
Cases of unattributed information manipulation activities were reported in:
Finland: While no foreign interference was observed, many cases of information manipulation activities were reported by Faktabaari, including electoral fraud narratives. Only time will tell whether these were purely domestic cases or whether a foreign actor such as Russia or China was involved.
Indonesia: An information operation related to Rohingya refugees, along with thousands of political ‘hoaxes’, some generated with AI and spread mainly on YouTube, Facebook and TikTok, remain unattributed. Russia, China and extremist movements are all credible suspects.
United States: during the New Hampshire primary election, GenAI was used to impersonate Biden in deepfake robocalls.
Cases of domestic interference - by governments themselves - were reported in:
Azerbaijan: independent journalists were jailed to silence dissenting voices.
Belarus: opposition parties could not participate in the parliamentary election, and independent media were targeted as well.
El Salvador: Reporters Without Borders decried the decline of press freedom after observing more than 80 press freedom violations during the election.
Pakistan: opposition candidate and former PM Imran Khan was sentenced to 10 years in prison just before the election. Mobile phone services were suspended on election day, and X was shut down for weeks during and after the election to prevent access to information.
We can also add Russia to the list, as disclosed in the latest leaks that I discussed in last week’s newsletter.
2024 Elections - Actors
Using one of my favorite frameworks, ABC - Actor, Behavior, Content - developed by French researcher Camille François, I have come up with a few takeaways that I hope can contribute to the ongoing assessment of upcoming attacks.
Regarding actors, considering both threat actors and audiences, we often attempt to depict our current world as divided between democracies - which supposedly do not conduct offensive information operations to destabilize other countries - and authoritarian regimes - the bad guys blamed for all the problems of the alleged democracies.
As I examine cases of information manipulation activities, both foreign and domestic, I believe a more relevant distinction can be made between:
Actors who control their internet infrastructure, such as Russia, China, Pakistan.
Actors who defend Internet freedoms while trying to institute some regulations to safeguard privacy and limit online harms, such as the EU and Taiwan.
Actors who supposedly defend a total freedom of the internet, such as the U.S.
Digital authoritarianism versus digital libertarianism, with a data privacy-focused approach in the middle.
You might ask, why does it matter? Because this categorization provides the opportunity to describe the unequal cyberspace in which information operations take place and to take a first measurement of the strategic advantage held by threat actors. Take a look.
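Roughly, the dynamics described in the next paragraphs look like this (a sketch, not an exhaustive mapping):

Actor type             | Controls own infrastructure | Can exploit others' infrastructure | Vulnerability of own infrastructure
Digital Authoritarians | Yes                         | Yes                                | Low
Data Privacy Defenders | Partially (via regulation)  | Not without offensive operations   | Medium
Digital Libertarians   | Minimal                     | Not without offensive operations   | High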
This table is an oversimplified representation of states’ ability to control or exploit their own or foreign infrastructure. It demonstrates the clear strategic advantage that Digital Authoritarians hold over cyberspace: they can control their own infrastructure and online assets while exploiting the infrastructure and assets of both Digital Libertarians and Data Privacy Defenders.
This has been observed since the beginning of the war in Ukraine. On one hand, in the case of the Doppelganger/RRN campaign, Russia is able to create many domain names impersonating Western traditional media. On the other hand, it has been challenging for Digital Libertarians and Data Privacy Defenders to limit the spread of propaganda narratives by Russian state media, despite collective sanctions and EU regulations.
Digital Libertarians are clearly the ones competing with the greatest handicaps. Their cyberspace is completely vulnerable to threat actors and offers many opportunities, such as the political and social ads ecosystem, collective suspicion towards platforms’ moderation policies, and the lack of safeguards regarding private data.
Data Privacy Defenders are in between, with legislation in place that limits, to some extent, the exploitation of their infrastructure by external threat actors. On the other hand, they still offer numerous openings for exploitation. This comparison leads to two conclusions:
On one hand, the strategic advantage can be narrowed by increasing control over Digital Libertarians and Data Privacy Defenders’ infrastructure through legislation, education and funding transparency. Decreasing their infrastructure vulnerabilities can limit exploitation by threat actors.
On the other hand, Digital Libertarians and Data Privacy Defenders cannot take over Digital Authoritarian Infrastructure without engaging in offensive operations, including economic sanctions, cyberwarfare, and counter-interference measures. This can only be done by increasing internal offensive capabilities.
Now, where should we put emphasis?
2024 Elections - Behaviors
From Taiwan to Pakistan, one recurring tactic in nearly all elections has been the use of Generative Artificial Intelligence (GenAI). However, depending on whether the election took place in a cyberspace where plurality of expression, representation and information were restricted or permitted, GenAI held two different meanings:
In cyberspaces with a free flow of information and plurality of expression, GenAI content, including deepfake audio, video and avatar creation, is perceived as a threat to election integrity. This was evident in the U.S. primary election in New Hampshire and the Taiwanese presidential election.
Conversely, in cyberspaces where access to information is limited or banned, GenAI content contributes to restoring election integrity, by providing, for example in Belarus and Pakistan, a GenAI candidate or GenAI speeches.
Both strategies center on impersonation. However, impersonation can be positive or negative, depending on whether the election lacks plurality. In some cases, the picture is even blurrier.
Introducing diverse voices by impersonating deceased figures does not directly appear as a restoration of election integrity and raises ethical questions. To what extent can we manipulate the face and voice of a real human? Should we limit creations to fictitious avatars? Is there a right to maintain a unique human identity, despite the technologies available in 2024?
While these questions may seem distant from information operations, they directly relate to a crucial issue: who is part of the national public debate that threat actors attempt to manipulate?
These questions surrounding the status of elections and democracy in our tech environment have recently been addressed by researcher Asma Mhalla in her new book, “Technopolitique: comment la technologie fait de nous des soldats” (Technopolitics: how technology turns us into soldiers). She describes our current geopolitical landscape as reshaped by Big Tech and identifies two major trends:
we have transitioned from a system of mass democracy to a system of mass hyper-personalization, in which every human is microtargeted.
our technologies are all dual-use; there is no longer a distinction between civilian and military technologies. These technologies are designed by Big Tech companies carrying ideological and political worldviews.
These two trends align with the tactics observed this year. The use of GenAI, micro-influencers, and political ads in recent elections demonstrates the intent to microtarget voters. The recently revealed Russian Portal Kombat operation illustrates how news is micro-selected based on audience location.
Platforms and technologies developed by Big Tech are, per se, neutral but constantly weaponized to disseminate content. For instance, the recently exposed Chinese Paperwall campaign demonstrates how website creation is used to disseminate targeted attacks and conspiracy theories.
The weaponization of internet infrastructure and microtargeting are tactics that will likely prevail this year, as in the U.S. election, where Trump supporters targeted Black voters with inauthentic AI content.
These two trends are also highlighted in the latest DFRLab report on Russian information warfare across the world, in which the DFRLab team states that “Russia shifted toward more targeted and tailored influence operations” while developing new messages and techniques.
These tactics can be countered, but few efforts are being made in this direction. Potential countermeasures include: 1) increasing transparency bulwarks such as identity verification or money tracking, and 2) increasing education through awareness-raising activities and digital citizenship courses.
2024 Elections - Content
When we examine the narratives disseminated in recent information operations, one trend stands out: the recurring pattern of leveraging existing fears of migrants and refugees to sow discord within countries.
Asia: in Indonesia, a hate campaign was conducted against Rohingya refugees, in an attempt to turn Indonesians against the Rohingya.
Americas: At the border between Mexico and Texas, fears of an ‘invasion’ have been leveraged by domestic extremist and conspiracy groups. Additionally, two information campaigns have been conducted by Russia and China to present the US as facing a ‘civil war’.
Africa and the Middle East: As the Israeli-Palestinian conflict continues, UNRWA, the United Nations Relief and Works Agency for Palestinian Refugees, is being targeted by an Israeli information campaign.
Europe: For the last two years, the war in Ukraine has brought several Russian information manipulation activities aimed at stoking fears of Ukrainian refugees to diminish support for Ukraine. Beyond the already well-publicized campaigns, the latest Russian leaks have revealed that the International Federation of the Red Cross is integrated into Putin’s plan to disseminate his propaganda in Ukraine. In Finland, Russia has used and continues to use migrants to create internal tensions.
This is not a new phenomenon. It was already the case in 2015, when I was monitoring Europeans’ reactions to the so-called ‘migrant crisis’ in Europe. Security discourses and extreme views suddenly increased, revealing the fears and anxieties of European society. You may wonder how this is related to elections…
Firstly, it is rare during elections for topics directly related to a candidate’s program to be manipulated by a foreign threat actor. These are technical topics and do not capture the public’s attention. Divisive issues with a high potential to trigger online debate are more likely to be leveraged.
Secondly, as Russia remains one of the main threat actors, it is also likely, as the DFRLab states, that: “2024 is an election year in dozens of countries where Russia may try to meddle in an effort to push support toward its allies or, at minimum, away from pro-Ukrainian parties. In the least friendly countries, Russia will likely continue to push the idea—through more covert means—that aid to Ukraine is a net loss to those residing in those countries.” And decreasing that aid means supporting candidates and parties who often use fears of refugees and migration, and promises of economic security, as populist measures to get elected, as experts told the Washington Post last week.
Beyond refugees and migrants, it appears that the question of self-identity and the sense of belonging to a broader community is more widely targeted, not only during hybrid attacks targeting foreign audiences but also during internal operations to control domestic audiences.
What the latest leaks about Russia have revealed is that public opinion still matters to Digital Authoritarians.
In Russia, the Kremlin has deployed measures called pre-rigging tactics to ensure as little discrepancy as possible between Russian voters’ ballots and the proclaimed results.
This is also true elsewhere. For instance, pro-Pakistan media outlets have manipulated international survey results to present a positive image of the election, despite international recognition that the elections were rigged.
For authoritarian leaders, alleged defenders of ‘state sovereignty’ and ‘territorial integrity’, it is vital to create and continuously amplify a narrative that tells a story of reclaiming something to which they are entitled. This narrative allows them to defend themselves against ‘double-standard’ criticism and circumvent accusations of interference. And to do so, they need to create a sense of belonging and gain international recognition.
The sense of belonging can be achieved through all the grey-zone tactics already used by threat actors during the Ukraine war, the Israeli-Palestinian conflict, and the Taiwan Strait tensions.
International recognition can be achieved by electing the right leaders abroad. When elections appear unlikely to favor like-minded leaders, information operations can be leveraged to create rejection of current leaders and pave the way for a coup d’état, as has been the case in West Africa in recent years.
These content strategies, inherently linked to the ideological and political survival of Digital Authoritarian states, nonetheless present one vulnerability.
Let’s discuss it after the Press Corner.
Your Press Corner
Here are the weekly readings to keep you connected to the conversation on global elections and information operations:
Russian disinformation is about immigration. The real aim is to undercut Ukraine aid - The Washington Post - According to the company Logically, Russian threat actors started 2024 with a focus on the U.S., likely in an attempt to undercut Ukraine aid.
Russia Seeks to Exploit Western "War Fatigue" to Win in Ukraine (recordedfuture.com) - New research from Recorded Future’s Insikt Group examines Russia's strategic approach towards the Ukraine conflict and how it interacts with Western perceptions and policies.
Undermining Ukraine: How Russia widened its global information war in 2023 - Atlantic Council - Important Report from the DFRLab on where Russian information operations stand in 2024.
Ukraine Braced for Russian Disinformation Attacks on Zelenskiy - Bloomberg - Kremlin campaign to focus on legitimacy as vote scrapped.
Another Ukraine: a disinformation platform run by an exiled Ukrainian oligarch in Russia (france24.com) - The project is officially led by Viktor Medvedchuk, a leading figure pushing pro-Kremlin interests, but it is orchestrated behind the scenes by Ilya Gambashidze’s Social Design Agency, a Russian IT company linked to the Kremlin.
Putin has new cyber-tools that threaten democracy, Ukraine warns (thetimes.co.uk) - Oleksiy Danilov, Ukraine’s national security adviser, warns the UK and US that Vladimir Putin will meddle in 2024 elections.
French intelligence services investigate pro-Russian campaign ahead of EU elections (lemonde.fr) - The DGSI is looking into a campaign led by a former far-right French MEP in the European elections in June, with links to several pro-Russian figures.
Germany accuses Moscow of ‘disinformation attack’ in leaking senior officers’ call – POLITICO - Russia obtained a call by German officers discussing a hypothetical export of Taurus cruise missiles to Ukraine.
Moldova warns against Russian meddling as it gears up for EU referendum and presidential election | AP News - The Intelligence and Security Service has gathered data indicating Russia plans to launch vast hybrid attacks against Moldova through 2024-2025 to try to bring the former Soviet republic back under Moscow’s influence.
Vladimir Putin hardly needs to interfere in American democracy (economist.com) - Domestic politicians are happy to spread dysfunction on their own.
Trump supporters target black voters with faked AI images - BBC News - Donald Trump supporters have been creating and sharing AI-generated fake images of black voters to encourage African Americans to vote Republican.
Magician says a Democratic operative paid him to make the fake Biden robocall that spread in New Hampshire (nbcnews.com) - Creating the fake audio took less than 20 minutes and cost only $1, for which he was paid $150, according to Venmo payments.
South Korean Police Develops Deepfake Detection Tool - Infosecurity Magazine (infosecurity-magazine.com) - The Korean National Police Agency announced that its National Office of Investigation (NOI) will deploy new software designed to detect whether video clips or image files have been manipulated using deepfake techniques.
Russian scam network circulates Maria Ressa deepfake through Facebook, Microsoft's Bing (rappler.com) - A deepfake video attempts to push a scam, impersonating Rappler and CNN Philippines to discredit Maria Ressa.
Deepfakes and Elections: The Risk to Women’s Political Participation | TechPolicy.Press - The impact of online violence against women will have a silencing effect on the political ambitions and engagement of women and girls, decreasing their presence and voice in politics and public life.
With elections looming worldwide, here’s how to identify and investigate AI audio deepfakes | Nieman Journalism Lab (niemanlab.org) - Here’s a step-by-step process for analyzing potential audio deepfakes.
Safeguarding EU elections amidst cybersecurity challenges — ENISA (europa.eu) - Update of the compendium
Facts not Fakes: Tackling Disinformation, Strengthening Information Integrity - OECD - A new report exploring how to respond to disinformation and reinforce democracy.
Newsguard debuts new automation tools for tracking election-related misinformation - Digiday - NewsGuard, the news rating service, is adding more automation tools as it works to track misinformation efforts ahead of the 2024 elections.
Who Voters Trust for Election Information in 2024 | Bipartisan Policy Center - I forgot to include it last week; a very informative study for the U.S. election!
The Urgency of Social Media Data Access for Electoral Integrity | TechPolicy.Press - Over the past year, however, there’s been a marked shift in several of these companies’ data-sharing practices.
What Is Rate Bait? Why Influencers Are Making People Mad On Purpose (rollingstone.com) - The fall of Twitter and the rise of TikTok have created the perfect breeding ground for rage-bait influencers to take over feeds.
Narrative Vulnerabilities: Responding With Indirect Targeting
A recent CNN article titled “The dangerous parallels between Putin’s ambitions in Ukraine and Xi’s claims on Taiwan” reminded me that China was not entirely enthusiastic about Putin’s idea of invading Ukraine in February 2022. There were even a few hours on the morning of February 24 when Chinese commentators shared confusing headlines, as it appeared that they had not yet received clear instructions on how to address the event.
For the last two years, Western commentators have been comparing the situations in Taiwan and Ukraine, suggesting that China might be tempted to attack Taiwan militarily. China has pushed back against these comparisons, emphasizing that only a handful of countries recognize the island’s sovereignty.
These comparisons indeed undermine China’s narrative of being a peaceful actor, a strategic narrative dating back to 2003 that draws on its vision of China’s history and of its future role in the world.
And this is why it is important to keep in mind these long-term strategic narratives. As Digital Authoritarian actors such as Russia and China seek to use information warfare to promote their narrative while securing their respective domestic audiences’ opinions, they become vulnerable to each other’s narratives. These narratives could be mutually counter-productive and undermine their own influence and credibility.
Furthermore, as they aim to increase their authenticity and credibility, Digital Authoritarians outsource content, but they may not be able to oversee its details. The more local and hyper-specialized proxies Digital Authoritarians are tempted to use, the more likely they are to lose control over the shaping of narratives and over potential negative secondary effects.
Therefore, in our modern narrative warfare, it is crucial to compare threat actors’ narratives to underline their inconsistencies from one actor to another. Just as each domestic event is an opportunity for a threat actor to leverage existing fears, each narrative a threat actor builds upon that event is an opportunity for targeted audiences to discredit that actor.
Indirect targeting as a response to direct targeting can be a way to restore our strategic advantage.
Mimicking the way Russia seeks support on every continent, we then need to take up our pilgrim’s staff and amplify these inconsistencies to vulnerable audiences.
I don’t know what carrots our fragile democracies can offer in return, but the strategic reflection has just begun.