The Case of African Elections: Why This Title Is Misleading
New book on Africa-related disinformation - The cases of "Russian interference" in previous elections in Zimbabwe, South Africa and the Democratic Republic of Congo
Hey there,
In the next months, several elections are coming on the African continent. Togo election just happened, while Chad, South Africa, and Madagascar are next in May. As expected, the English-speaking news cycle is less vocal about these elections than when it comes to America, Europe, and Asia.
However, the African continent has hosted several elections in the past that were targeted by Foreign Information Manipulation and Interference (FIMI) operations from different threat actors. But here again, expertise has focused on FIMI deployed in the Sahel region, with the infamous Wagner/Prigozhin operations.
As I was looking for resources to know more about FIMI activities targeting the rest of the continent, I was glad to see that a new book was just published: “Digital Disinformation in Africa,” edited by Tony Roberts and George Hamandishe Karekwaivanane and composed of many contributions discussing the situation in ten African countries: Nigeria, Cameroon, Angola, Mozambique, Egypt, Ethiopia, Zimbabwe, Kenya, Uganda, and the Democratic Republic of Congo.
Today’s newsletter will be dedicated to summarizing one part of this new resource, hoping it will help this publication be discovered and read. And since Africa can’t be covered in one article, a few other chapters will follow.
Warning: the book's main argument is pretty clear: there is not ONE case of African election-related information manipulation activities, but a plurality of cases connected to a diversity of contexts, actors, and audiences. Summarizing parts of this book is also a contribution to supporting the plurality of experiences in our field, helping us grow stronger and more resilient in the context of information manipulation and interference.
What to expect:
What Do We Mean By “Digital Disinformation in Africa”?
The book “Digital Disinformation in Africa” was developed through the contribution of the African Digital Rights Network, which comprises activists, academics and analysts who are experts in digital rights across Africa. It is the first attempt to gather case studies from different regions of Africa into a single publication.
Its aim is to cover the broader concept of “disinformation”, defined in the book as "the intentional deployment of lies to manipulate people’s beliefs and behavior in order to further political interests". The focus is primarily on disinformation strategies, goals, and effects related to African governments across several African countries. While it also addresses the question of FIMI, its main concern lies in the interrelation between "power, politics, and propaganda", which can manifest in various forms across African countries.
The authors break down disinformation into three elements: the deployment of intentional falsehoods, the manipulation of beliefs and behaviors, and the furthering of power interests. These components serve one purpose: advancing someone’s political interests. This definition and breakdown of disinformation are crucial to the rationale of the book.
To understand these three elements better, the authors also identified four aspects of disinformation, which will structure the analysis of each case study:
Dimensions of disinformation: Who are the actors involved, what is the scale, and what differentiates this type of digital disinformation?
Dynamics of disinformation: What are the tools, techniques or tactics used?
Drivers of disinformation: What are the motives? What effect does it seek to have on beliefs and behaviours?
Directions of disinformation: What are the future trends and scenarios as well as lessons learnt?
The categorization of disinformation into four aspects (dimensions, dynamics, drivers, and directions) is not a well-known framework like ABC or ABCD, but rather a remix of these concepts. Nonetheless, it proves quite effective, especially the third category.
The third category achieves what most other frameworks have failed to do: linking motives to the targeted audience. This forces analysts to pause and reflect on the nature of the content and see beyond actors and tactics. It is inherently political, particularly when considering its connection to "power interests." It also suggests that other metrics could be developed to evaluate behavioral and cognitive changes without delving into troubled waters.
This category gains even more relevance when we understand the book's angle. Disinformation is considered by the authors as intrinsically linked to power relationships. They highlight a lack of research in this field, noting that commonly known theories such as "hard power," "soft power," and "smart power" mostly apply to geopolitical power between different states. What the authors allude to here is the use of digital disinformation to assert power in a domestic context between the state and the people, as well as within various levels of society.
The book aims to fill a void in research, which has long focused on Western countries despite the increasing spread of digital disinformation on the continent. Disinformation itself is not new on the continent; there are examples dating back to ancient times, such as in Ancient Egypt, and it was also deployed during the colonial period. Colonial powers established newspapers and radio media to spread manipulated information to support their interests. The book provides examples such as the coverage of the Anglo-Boer war at the end of the 19th century and South Africa's apartheid-related disinformation, both illustrating the use of disinformation and propaganda to legitimize state actions.
According to the authors, this historical approach demonstrates that digital technologies are not suddenly causing disinformation in Africa. Focusing on these new technologies should instead support the understanding of what facilitates digital disinformation today compared to previous forms of disinformation. Furthermore, the book underscores that the origin of disinformation is not necessarily only associated with Russian information operations but can also be found in the context of Western colonialism.
It follows an empirical, contextual, and situated approach to the study of digital disinformation in Africa. The authors acknowledge the differences in the use of social media and internet access in Africa compared to other geographic regions. It is an effort to identify the unique nature of digital disinformation in Africa without applying a mostly Western perspective.
The organization of the case studies is guided by three themes:
the role of digital disinformation in creating and maintaining social and political hierarchies,
the relationship between digital disinformation and conflict,
the role of digital disinformation in recent elections across the continent.
While all three themes are significant, I will only summarize the last theme to stay focused on our common topic of interest: election integrity.
Are Claims of Russian Disinformation in Africa Founded?
The title of this chapter, focusing on Russian-related disinformation in Zimbabwe, South Africa, and the Democratic Republic of the Congo, is quite absorbing. Behind the word "claims" lies the difficult question of attribution.
While there have been past attributions of electoral interference to Russian-related threat actors, such as Prigozhin/Wagner operations or Archimedes activities, the author of this chapter, Seyoung Jeon, asks if the claims concerning the three mentioned election cases can be validated based on existing evidence.
Without denying the question of Russian disinformation activities in certain African countries, the author shows that the existing data on Twitter does not provide any evidence of Russian disinformation in the 2018 presidential election in Zimbabwe, the 2019 South African general election, and the 2018 Democratic Republic of the Congo general election.
To reach this conclusion, the author analyzed a Twitter dataset containing more than 15 million tweets and conducted several interviews to understand the context of each election. The dataset is composed of accounts affiliated with the IRA, Prigozhin, and the GRU, both in Africa and elsewhere.
The rationale behind the use of Twitter is further explained. The platform's data-sharing policy supported the analyst's goal of having the most comprehensive look; Meta's platforms, such as Facebook and WhatsApp, did not provide the same scope. This is important to keep in mind as we reach the author's conclusions, because Twitter may also not be the most likely platform used by Russian-linked FIMI actors.
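To make the methodology concrete, here is a minimal sketch of what such a dataset analysis could look like. Everything here is my own illustrative assumption, not the author's actual pipeline: the record fields (`text`, `account`, `label`), the keyword list, and the summary metrics are hypothetical stand-ins for whatever filtering and labeling the chapter actually used.

```python
from collections import Counter

# Hypothetical election-related keywords; the real study would use a
# vetted, country-specific list in English and local languages.
ELECTION_KEYWORDS = {"zanupf", "mdc", "mnangagwa", "chamisa", "zimelections"}

def is_election_related(tweet: dict) -> bool:
    """Crude keyword match against the tweet text (assumed field: 'text')."""
    text = tweet["text"].lower()
    return any(kw in text for kw in ELECTION_KEYWORDS)

def summarize(tweets: list[dict]) -> dict:
    """Filter a flagged-account dataset to election-related tweets, then
    compute the share labeled as disinformation and the most active accounts."""
    related = [t for t in tweets if is_election_related(t)]
    by_account = Counter(t["account"] for t in related)
    flagged = sum(1 for t in related if t.get("label") == "disinformation")
    share = flagged / len(related) if related else 0.0
    return {
        "total": len(tweets),
        "election_related": len(related),
        "disinfo_share": share,
        "most_active": by_account.most_common(3),
    }
```

The point of the sketch is the shape of the analysis the chapter describes: start from the full flagged-account dataset rather than a sample, narrow to election-related activity, and only then measure how much of it qualifies as disinformation versus merely hyper-partisan content.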
The author started with a literature review of documented Russian information operations in the context of African elections, which is quite helpful to ensure you didn’t miss any significant reports on the topic! The author identified three common characteristics of this existing literature, which I strongly agree with:
Most of the publications emphasize the intentions of the Russian threat actor, not necessarily its accomplishments. This is an important distinction in our field of open-source intelligence, where the claims we make must be based on solid evidence. Observables, not inferences.
Reports often look at samples of activity, not the whole dataset, which can give a misrepresentation of the actual significance of the accounts compared to the whole observed activity.
The specificity of each African country is often overlooked, which diminishes the investigation into country-by-country manifestations of disinformation operations.
This last point is particularly true when we think, for example, about the propagation of the "anti-Western" narrative in the Sahel region. I rarely read about it with a comparative approach between Mali, Burkina Faso, Niger, Senegal, or Chad. Besides local fact-checking organizations dedicated to specific countries and the project "All Eyes on Wagner" dedicated to a specific threat actor, there is no particular open-source effort, to my knowledge, to identify country-specific disinformation cases.
So, what are the findings?
Zimbabwe 2018 Presidential Election
Zimbabwe’s presidential election was significant as it marked the first time since 1980 that long-serving President Robert Mugabe was not running as a candidate. The two running candidates, Mnangagwa (ZANU-PF, pro-Mugabe) and Chamisa (MDC, opposition), appeared very close in the polls. The election results were delayed, and political violence followed the announcement of the official results. When Mnangagwa was announced as the winner, Chamisa accused Russia of manipulating the election.
According to the dataset analysis, most of the tweets related to the Zimbabwe election were either pro-ZANU-PF or anti-opposition MDC. Only 4% of the tweets were identified as cases of disinformation, while the vast majority of the tweets were hyper-partisan. Three accounts flagged by Twitter as Russian-linked accounts, namely "EMarshall," "DavidKangwa," and "Tavonga21," were among the most active users. They portrayed themselves as local users.
The activity of these accounts shows that they were responsive to local events: the proportion and nature of the content they posted varied as local events unfolded.
According to the author, these results confirm some patterns of Russian information operations but also revealed other patterns. For example, Russian information manipulation activities were not solely dedicated to polarizing opinions on both sides of the spectrum but focused on promoting one candidate or criticizing another.
I want to pause here before moving to the next election case because this remark, apparently trivial, is in fact the manifestation of a significant source of divergence between experts regarding the patterns of Russian operations in elections.
On May 7th, I will be presenting, thanks to the EU Disinfo Lab and their invitation to their webinar series, my research regarding FIMI and elections. Spoiler: it includes the DISARM framework and constructing databases…
When I started looking at the case of Russian information operations in the specific context of elections, comparing them cross-countries, cross-audiences, I was surprised by the different types of discourses that came with the strategic goals attributed to these campaigns by governments, platforms, companies, and researchers. Some claimed that Russia was solely aiming to sow discord and chaos, polarizing on both sides of the political spectrum, while others claimed that Russia was promoting candidates linked to its geopolitical interests.
I feel a bit relieved to read that someone else noted this discrepancy when assessing Russian strategic goals. And that leads me to my following thought for all of you: Can we realistically assess strategies and goals based solely on open-source information? Or should we refrain from drawing such ambitious assumptions about FIMI threat actors in our reporting, recognizing that we may lack other types of information to adequately assess the goals and strategies? I feel we may sometimes let ourselves be intoxicated by the illusion of power given by OSINT tools and forget that the information we gather, even if it becomes intelligence, is only one small piece of the puzzle.
South Africa 2019 General Election
The South Africa 2019 General Election occurred during a period of political tension, with African National Congress incumbent leader Jacob Zuma ending his second and final term amid calls for his resignation and street protests. The general election became a contest between pro- and anti-Zuma factions, ultimately won narrowly by the anti-Zuma faction led by Ramaphosa.
Claims of Russian election interference in South Africa mainly stemmed from the Dossier Centre, which published documents about plans from AFRIC, a Russia-linked organization, to interfere in favor of the Zuma side.
Before the election, South Africa had already been targeted by a domestic disinformation campaign in 2017: the Zuma side hired a UK public relations firm named Bell Pottinger to spread pro-Zuma narratives and influence the domestic audience. However, the tweets were primarily hyper-partisan and not necessarily labeled as disinformation.
In the 2019 election, according to the author, the tweets were also primarily hyper-partisan. Those that could be labeled as disinformation mostly mentioned the "white replacement conspiracy theory." Two accounts, "DominantUS" and "PeytonCash," were the most active in these disinformation-related tweets. However, the study of their timeline showed that they were not covering only South Africa.
The author concludes that there was a very low volume of Russia-linked disinformation. Meanwhile, domestic actors appeared to be much more significant in spreading disinformation targeting the country, as revealed by the UK firm's operation.
Democratic Republic of the Congo General Election
The 2018 general election in the Democratic Republic of the Congo (DRC) took place in a highly tense environment, triggered by the postponement of the election by then-President Joseph Kabila, who was attempting to extend his term. Two years later, the DRC election was finally held with three main candidates: Shadary (Kabila’s successor, FCC), Tshisekedi (UDPS, opposition), and Fayulu (Lamuka alliance, opposition).
Several observers claimed that the election was manipulated in favor of Tshisekedi, without linking it, however, to foreign interference. Furthermore, there wasn’t much data available in the Twitter dataset to study the DRC case, even for French-language tweets. Despite the precautions taken, such as conducting the search over a three-year period or adding other thematic key terms related to the security situation in the DRC, there was not a big volume of tweets related to Russian-linked actors concerning this election.
The overall findings showed that there was no specific content that could be classified as disinformation, no specific engagement around certain events, and the only account particularly active, 'SansTravailFixe,' did not appear to specifically target the DRC.
From the three case studies, the author drew the conclusion that there was "little to no Russian disinformation targeting these three countries." Furthermore, the observed disinformation content did not generate engagement from the targeted audience.
While it is not technically wrong given the author’s findings in the dataset, I nevertheless find it hard to draw such an authoritative conclusion without adding the nuance "on Twitter".
While it is understandable that less data may be provided by Meta on Facebook and WhatsApp, it is difficult to provide such an assessment based solely on an investigation on Twitter. A simple search on Google shows that WhatsApp appears to be the favorite platform in the three countries, followed by Facebook. I find it really difficult then to overlook the importance of these platforms in the development of Russian operations, even though it may be true there was no particular case of Russian "interference" in these elections.
I put the word "interference" in quotation marks because this is another thing that I wish to address. The author switches constantly between "disinformation" and "interference," as if the first term necessarily caused the appearance of the second term. However, allegations of "interference" have rarely been associated with disinformation and information operations but rather with cybersecurity operations. The term "influence" is most likely to be used in the case of a disinformation campaign.
This might seem trivial and purely aesthetic, but when the U.S. has constantly repeated since 2018 that there was no case of Russian, Iranian, or Chinese interference in U.S. elections, despite the many reported information operations, it means something. It means there was no hack and leak, no electoral infrastructure targeted. While the author makes an important case about the significance of words, the chapter may have missed defining the term "interference" in the context of these case studies.
Despite these areas of concern on my side, I feel the author has a point regarding the importance of looking at local contexts and refraining from applying general findings, even though it is very tempting…
Your Press Corner
Here are the weekly readings to keep you connected to all the conversations on global elections and information operations:
EDMO Launches ‘Be Election Smart’ Campaign ahead of 2024 European Parliament Elections – EDMO - This campaign will run for six weeks, aiming to support and empower European citizens in navigating the information landscape surrounding the European Parliament elections.
Albania Parliament’s Disinformation Committee: Welcome Vigilance – or Threat to Free Expression? | Balkan Insight - While addressing disinformation is a crucial task, the concern is that Albania's new parliamentary committee could be just another tool to silence critical voices.
Russian disinformation network “Pravda” grew bigger in the EU, even after its uncovering – EDMO - A few weeks after the publication of the VIGINUM report, EDMO can confirm that the campaign has expanded significantly in Europe, and in particular in the EU.
InfoEpi Lab – Doppelgänger Responds to U.S. Funding Vote - The Doppelgänger Operation has responded to the passage of aid to Israel and Ukraine by posting content to undermine trust in democratic alliances. Emphasizing unreliability and the resultant vulnerability to potential aggression, the content casts doubt on the U.S.’s commitment to allied nations.
Conspiracy theorists have turned from COVID to climate. How will it impact the EU elections? | Euronews - Euronews tracked nearly 4,000 conspiracist-leaning Telegram channels and groups across more than 20 European languages - here’s what they found.
EU to investigate Meta over election misinformation before June polls | Meta | The Guardian - Brussels to act later this week against Facebook and Instagram owner over policies on deceptive advertising and political content, reports say.
What U.S. Policymakers Can Learn from the European Union’s Probe of Meta (justsecurity.org) - One would like to think that U.S. politicians and policymakers are taking notes. Unfortunately, that’s probably a fanciful hope.
New Free Press Analysis: Latest Pledges from Tech Platforms Fail to Sufficiently Protect Elections in 2024 | Free Press - On Thursday, Free Press released Democracy Deferred: Social-Media Companies’ Meager Commitments to Election Integrity in 2024, an analysis of 12 major technology companies’ readiness to address political disinformation, manipulation and hate on their networks.
Elon Musk’s Malign Influence in Brazil | TechPolicy.Press - Even when he plays the fool, Musk and his ilk should be considered with deadly seriousness.
Surging Violence and Far-Right Extremism: Unpacking Social Media’s Role in the 2024 Election – Georgetown Security Studies Review - Underpinning the alarming shifts in extremist activities is the rapid evolution of the Internet and social media technologies, which have become the tools to facilitate extremist communications and radicalization endeavors across the United States.
Russian propaganda trying to manipulate topic of presidential elections in Ukraine - CCD (ukrinform.net) - Manipulative messages about presidential elections in Ukraine are circulating on the internet.
Ukrainian military intelligence claims attack on website of Russia’s ruling party (therecord.media) - The attack targeted United Russia’s servers, websites and domains, rendering the party’s digital platforms "partially inaccessible." The agency didn’t provide any further details about the operation.
Sweden provides €120,000 to counter disinformation related to elections | IPN - Sweden offers Moldova new support, in the amount of about €120,000, to counter disinformation related to elections.
The United States of America and Côte d’Ivoire Sign Memorandum of Understanding to Expand Collaboration on Countering Foreign State Information Manipulation - United States Department of State - and one more. From Europe to Africa? Let’s see if the U.S. starts covering the other regions of Africa or identifies one partner on each continent besides Europe.
Why China Is So Bad at Disinformation | WIRED - is it really?
China's efforts to sway the U.S. are bigger than TikTok : NPR - While much of the discussion about foreign interference in elections has focused on Russia since 2016, China presents a growing threat, according to the intelligence community, tech companies, and independent researchers.
Top GOP ‘election integrity’ lawyer charged in Arizona fake elector scheme | Nebraska Examiner - Less than a week after the Republican National Committee unveiled a “historic” new program to monitor the polls for fraud, a top lawyer with the committee was among those indicted for an alleged scheme to use false fraud claims to overturn the results of Arizona’s presidential election.
Poll of Election Officials Finds Concerns About Safety, Political Interference | Brennan Center for Justice - The vast majority of local election officials have taken new precautions to secure the 2024 election.
Senate pursues legislation to ban AI deepfakes in election materials - The Washington Post - Perspective.
We must target the root cause of misinformation. We cannot fact check our way out of this | Samantha Floreani and Lizzie O'Shea | The Guardian - One of the best tools we have to clean up this mess is already in our hands.