Reading the Past: From Cameroon to Uganda, Unveiling African Elections TTPs.
Cameroon 2018 Presidential Election - Uganda 2021 Presidential Election - Your Press Corner
Hey there,
Thank you to everyone who attended this week's EU Disinfo Lab webinar where I had the chance to present the findings of my research on Foreign Information Manipulation and Interference (FIMI) threat patterns in the context of elections.
If you didn’t watch it but are curious to hear how my voice sounds (spoiler: heavy French accent), you can watch it here. I am also curious to hear your comments, remarks, and feedback, so feel free to let me know!
Back to elections. This week, I continue discussing the African specificities of information manipulation and “disinformation”, summarizing the findings of the recent publication “Digital Disinformation in Africa,” edited by Tony Roberts and George Hamandishe Karekwaivanane and composed of many contributions discussing the situation in ten African countries: Nigeria, Cameroon, Angola, Mozambique, Egypt, Ethiopia, Zimbabwe, Kenya, Uganda, and the Democratic Republic of Congo.
Today’s newsletter is dedicated to summarizing the cases of “disinformation” related to the 2018 Cameroonian and 2021 Ugandan presidential elections. Next week, we will finish this African tour with two more countries, Angola and Kenya, as well as a few insights of my own.
What to expect:
Information Manipulation and Disinformation in Cameroon’s 2018 presidential election
Information Manipulation and Disinformation in Uganda’s 2021 Election
Information Manipulation and Disinformation in Cameroon’s 2018 presidential election
In “Digital Disinformation in Africa”, Simone Toussi discusses the “unprecedented levels of public concern raised about the influence of digital disinformation on the election process” in the context of the first social media election in Cameroon.
She first provides a concise but detailed contextual background on the situation in Cameroon. I recommend it to anyone who wants to catch up on Cameroon’s political history. For readers in a hurry, let’s say that she highlights several vulnerabilities:
Internal divisions and tensions caused by the colonial interests of Germany, France and the United Kingdom. France and the United Kingdom created artificial borders that ignored the plurality of ethnic groups and languages; today, the country counts 240 ethnic groups and languages.
An increasing terrorist threat from the Boko Haram group, making Cameroon “the second-largest target of the Islamist group after Nigeria”, displacing thousands of people and generating an influx of Nigerian refugees.
A marginalized community, the Anglophones, who represent 20 percent of the population and call for the independence of the Anglophone regions of Cameroon despite violent military repression.
The weaponization of laws related to online rights and freedoms, jeopardizing access to information and freedom of expression.
A weak media landscape, made up of unreliable media outlets and affected by government censorship and intimidation campaigns.
Simone Toussi then moves to the information manipulation and disinformation cases in Cameroon. “Disinformation” is not a new phenomenon in the country and can be traced back at least to the beginning of the post-colonial era. For example, the United Kingdom attempted to influence Southern Cameroons’ calls for independence from Nigeria. In the 2004 presidential election, President Paul Biya is said to have hired American public relations firms to influence the results. However, research on the use of social media to disseminate information manipulation and disinformation remains scarce. What, therefore, was the extent of these tactics in the 2018 presidential election, the first social media election in the country?
The author first describes the trends that were highlighted in the reporting available:
According to a report by the Belgian journalist Arne Gillis, President Biya manipulated the online environment to improve his image and reputation after the Anglophone and Boko Haram crises. He relied on the American firms Mercury Public Affairs, Glover Park Group and Clout Public Affairs. At the same time, accounts disseminating the regime’s propaganda, such as @CameroonTruth and @AgenceCamPresse, were observed.
According to two reports, one by Swedish researcher Christian Tatchou Nounkeu and one by Cameroonian researchers Kingsley Lyonga Ngange and Moki Stephen Mokondo, “citizen journalists”, “bloggers” and “activists” played a crucial role in the dissemination of disinformation during the Anglophone crisis in 2018.
According to CIPESA’s report, three factors are responsible for the disinformation in the country: the security situation related to the Anglophone crisis, the terrorist threat and the political instability due to the nature of the authoritarian regime. These factors appear closely linked to the concept of “post-election” crisis developed by the NGO ADISI Cameroun, which refers to a period of “protests, claims of electoral fraud and violent government suppression”.
Once these trends are presented, the author details 10 cases of election-related information manipulation and disinformation that occurred during and beyond the election cycle. Despite the lack of academic literature on the matter, the author was able to gather a unique compilation of cases, drawing on existing civil society reports and media commentary.
I tried to highlight the main Tactics, Techniques and Procedures (TTPs):
public denial of accurate claims: the government denied the Internet shutdown that occurred during the election period as well as the authenticity of a video showing Cameroonian soldiers torturing and killing civilian women and children.
impersonating legitimate entities: two individuals posed as independent election observers working for Transparency International. They gave an interview to spread the claim that the election was free and fair, before being denounced by Transparency International as impostors. The Cameroonian Minister of Employment and Vocational Training was also impersonated by inauthentic accounts, which disseminated several allegations in his name. The PCRN member Nourane Fotsing also claimed her identity was stolen to falsely advertise products for financial gain.
spreading false claims: the opposition party claimed it had won before the official election results, which led the presidential party, without any further evidence, to accuse its adversary of disinformation and to respond with broad repression. Another false claim, spread through the website camerounweb(.)com, announced the resignation of President Biya. It was based on a screenshot of an inauthentic article from the German newspaper Die Zeit and was followed by the rumor that the President had died. Yet another claim alleged that the NGO Doctors Without Borders was helping separatists in the Anglophone region, leading the organization to cease its activities there.
censoring journalistic sources: several journalists were arrested for “spreading fake news and cybercrime”. While we don’t know whether these charges were founded, the author notes this as a recurring government TTP to generate self-censorship within the media ecosystem.
intimidation campaign: an inauthentic video disseminated the claim that the government authority in an Anglophone region was threatened by the people of Lebialem, a town in Southwest Cameroon.
Deepfake video: perhaps the first case of a deepfake video in Africa? In June 2020, a deepfake video circulating on Facebook and WhatsApp depicted the French Ambassador making inauthentic claims. The video played on France’s controversial colonial past in the country.
Building upon these cases, the author further explains the role played by “disinformation”, which is analyzed through three angles:
disinformation as a way to take power, used by both the government and the opposition party throughout the election cycle.
disinformation as a strategy to close civic spaces. This strategy is linked to the censoring of journalists and media to diminish the civic space for democratic debate.
disinformation as a power-maintenance strategy by the government, built on the denial of government abuses, developed through the use of foreign firms, and strengthened by gaps in national regulation.
To conclude, the author makes the following recommendations: the government should produce new legislation to protect free expression online for all stakeholders; researchers should continue documenting these disinformation cases; the media should respect its code of ethics and deontology; civil society organisations have a responsibility to protect the integrity of elections and should form effective alliances with key stakeholders.
Information Manipulation and Disinformation in Uganda’s 2021 Election
This chapter, written by Tony Roberts and George Hamandishe Karekwaivanane, focuses on the 2021 presidential election in Uganda, which was targeted by many cases of information manipulation and disinformation.
The authors start by presenting the contextual background surrounding Ugandan politics. They underscore the specific historical relationship between media technology and power since the country gained independence in 1962. Media technology, populism and radical ideologies are the three ingredients the authoritarian regime uses to hold onto power.
Disinformation is not new here either: it was already used to hide the government’s crimes in 1977. Uganda was classified as a “hybrid regime” by The Economist in 2022, and “digital repression” appears to be the main characteristic of Uganda’s 2021 election, relying on two primary strategies: “technological and legal content regulation”. The TTPs of digital repression have been summarized by the researcher Feldstein.
These TTPs did not suddenly appear in 2021; they have existed since the first multiparty election in 2006. While the internet was new in the country, the government already saw it as a cutting-edge technology that could threaten its interests. The internet started to be controlled: the website Radio Katwe was blocked on the claim that it was publishing “malicious and false information against the party and its presidential candidate”. This claim would become the government’s leitmotiv for the years to come.
It is worth pausing a moment here to look at the date: 2006. The more we learn about how the internet was leveraged by democratic and authoritarian states, the more it seems to me that authoritarian states understood very early on the power this new technology could bring if they controlled it. Democratic countries, by contrast, seem to have been rather late to acknowledge the possibilities the internet offered if integrated into their organisations.
For example, only in 2008 did the United States appear to recognize the role that online spaces can play in supporting its public diplomacy efforts. In the statement “Public Diplomacy 2.0: A New Approach to Global Engagement”, James K. Glassman, Under Secretary for Public Diplomacy and Public Affairs, describes the power of Facebook groups in Colombia to drive protests against the FARC. He presents a specific project, part of the new U.S. approach to public diplomacy, in which he sees the internet as an enabler against violence and extremism in many countries across Latin America, Africa, the Middle East and Asia. At the time, Google, MTV, AT&T, Howcast.com, Access360 Media, Columbia University and Facebook were designated as partners in this project to produce an online hub and a “giant global conversation about how young people can oppose violence and extremism”.
Reading this in 2024, in the context of the worldwide student protests so highly publicized in the U.S., this speech reads like fiction.
Furthermore, as James Glassman lays out his new approach to the “war of ideas”, one can wonder whether anyone has ever taken the time to step back and assess whether the principles he developed, which have likely guided the U.S. approach for the last 20 years, actually worked.
Back to Uganda. While the U.S. was implementing its idealistic vision of public diplomacy in a new digital era, Uganda was facing the harsh reality of the misuses of the internet. The 2010 Arab Spring only reinforced the crackdown on the internet, for example through the blocking of SMS messages containing words related to the Arab Spring: a keyword search used not to discover online communities but to close them. Facebook and Twitter were also blocked for “national security” reasons.
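To make the mechanism concrete, here is a minimal sketch of keyword-based message filtering of the kind reportedly applied to SMS traffic. The blocklist and messages are invented for illustration; real deployments operate at the carrier level and are far more opaque.

```python
# Minimal sketch of keyword-based message blocking (illustrative only).
# BLOCKED_TERMS and the sample messages are hypothetical, not the actual
# terms the Ugandan government filtered.
BLOCKED_TERMS = {"egypt", "mubarak", "dictator", "people power"}

def is_blocked(message: str) -> bool:
    """Return True if the message contains any blocked term (case-insensitive)."""
    text = message.lower()
    return any(term in text for term in BLOCKED_TERMS)

print(is_blocked("Meet at the market at noon"))        # False
print(is_blocked("They cannot silence People Power"))  # True
```

The crudeness is the point: a simple substring match suppresses legitimate speech along with whatever it was aimed at, which is why such filters double as tools of censorship.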
Starting in 2016, the President began using proxies in two different ways, the first somewhat folkloric, the second closer to online information manipulation as we know it today. Local musicians were paid to compose songs supporting the President. Meanwhile, “a bot army” was activated to promote his image and narratives on social networks. A U.S. firm may have been recruited for this second tactic, but this was never verified.
The ability of the Ugandan President to innovate in digital repression is quite stunning. In 2018, he put forward a “social media tax”, allegedly to reduce gossip and disinformation on WhatsApp, Facebook, Skype, and Viber (do you remember Viber?). It was later replaced by a 12% direct tax on internet data. Well, at least that measure is more objective than the previous one; whether it is fair and necessary is another debate. Meanwhile, the crackdown on journalists and media continued at scale…
How did information manipulation and disinformation spread during the 2021 Ugandan election?
In 2021, the Covid-19 pandemic was already harming the democratic stage: physical rallies were banned, pushing candidates and the public to campaign in an already censored and weakened online space. The authors sought to identify the trends of disinformation through an analysis of Twitter, complemented by Facebook and YouTube. The choice of Twitter is justified for two reasons: it was the platform most used by politicians and journalists, and the one where content was freely available at the time of the research. They analyzed a sample of tweets using NodeXL in November 2020. Here are the main TTPs they found:
Developing inauthentic images and videos to spread false and misleading allegations about the candidates. These images and videos were taken out of their initial context to support these allegations, sometimes this content was even imported from other countries.
Integrating audiences’ vulnerabilities into the narratives, such as fear of police violence, fear of foreign interference, and hatred of minorities such as the LGBTQ+ community.
Denial of authentic reporting on domestic abuses.
Hoaxes and plain false information, as documented by the fact-checking platform PesaCheck.
Exploitation of influential figures such as U.S. President Biden and former President Barack Obama, allegedly supporting one candidate.
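The kind of retweet-network counting that tools like NodeXL automate can be sketched in a few lines. This is an illustration with invented data, not the authors’ actual pipeline; all handles are hypothetical.

```python
from collections import Counter

# Invented sample of (retweeter, original_author) pairs, standing in for the
# kind of Twitter sample the authors analysed with NodeXL.
retweets = [
    ("user_a", "candidate_x"), ("user_b", "candidate_x"),
    ("user_c", "candidate_x"), ("user_a", "candidate_x"),
    ("user_b", "candidate_y"), ("user_d", "candidate_y"),
]

# In-degree in the retweet graph: how often each author is amplified.
# Unusually concentrated amplification is one signal of coordinated activity.
amplification = Counter(author for _, author in retweets)

for author, count in amplification.most_common():
    print(author, count)
```

Real analyses layer on timing, account age, and content similarity, but the core idea, mapping who amplifies whom and how unevenly, is this simple.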
The government also took several measures to limit free speech, such as requesting that Google take down YouTube accounts of the opposition (a request Google likely refused). Facebook also suspended government-linked accounts that the DFRLab characterized as coordinated inauthentic behaviour (“CIB”) under the platform’s policy. Interestingly, the investigation revealed that two Ugandan private companies, Kampala Times and Robusto Communications, were behind the inauthentic accounts.
And eventually on 12 January, 48 hours before polling, the internet was shut down.
In the Ugandan election, the authors show that both the state and opposition groups manipulated online information to spread false claims against each other. The Ugandan state nevertheless demonstrated its superiority in this battle of disinformation narratives, relying on greater resources to implement its strategy.
The authors point out that the platforms’ refusal to block opposition disinformation, while blocking disinformation produced by the state, may further push Uganda’s government toward repressive legislative measures. In 2022, a new law, the Computer Misuse Act, was added to the pile of repressive measures banning “the misuse of social media”. It was immediately enforced, suppressing bloggers’ online speech.
The authors further develop the argument that platforms, because of their lack of understanding of local contexts, can take decisions that have a negative effect on the country instead of leveraging their power to bring positive change.
The authors recommend further research into and documentation of the dynamics of this digital disinformation, as well as looking for ways to legislate and regulate that protect citizens’ digital rights.
That’s it for today’s edition! Next week, we will finish the book review, and our brief flight over the African continent, with the chapters on Angola’s and Kenya’s elections, looking at “meme-fication” tactics and the role of “mercenaries”.
Your Press Corner
Here are the weekly readings to keep you connected to the conversation on global elections and information operations:
Don’t Hype the Disinformation Threat | Foreign Affairs - Downplaying the Risk Helps Foreign Propagandists—but So Does Exaggerating It.
The vocabulary of disinformation (economist.com) - because the challenge of taxonomy is real.
Guarding the Ballot: Addressing Foreign Disinformation and Election Interference | Council on Foreign Relations (cfr.org) - for those who missed the conference organized by CFR.
Truth and reality with Chinese characteristics | Australian Strategic Policy Institute | ASPI - The building blocks of the propaganda system enabling CCP information campaigns.
China’s Chilling Cognitive Warfare Plans – The Diplomat - War is entering a new, and very frightening, domain.
Europe’s election campaigns are under the constant threat of foreign interference (france24.com) - about Chinese interference in Europe.
Foreign interference may have 'impacted' 2021 result in one riding, inquiry finds (citynews.ca) - Foreign meddling attempts didn’t change who won the last two federal elections in Canada, but may have changed the result in one riding in 2021, a public inquiry concluded Friday.
Russia and China co-ordinate on disinformation in Solomon Islands elections | The Strategist (aspistrategist.org.au) - Moscow and Beijing likely worked together to sow disinformation globally that was propagated locally by political parties in the lead-up to Solomon Islands’ national and provincial elections on 17 April 2024.
Elections are battlefields for the Kremlin: go after the leaders - EUvsDisinfo - Technique No. 1: go after the leaders.
Pro-Russia disinformation floods Slovakia ahead of crucial parliamentary elections (msn.com) - "The disinformation ecosystem in Slovakia is now reaching its peak," said Peter Duboczi, editor-in-chief of Infosecurity.sk, noting the upcoming election is the first in recent years to reflect the "full potential" of its effects.
Sowing division — Russian disinformation becoming more sophisticated | Yle News | Yle - Meanwhile in Finland.
The Bulgarian Factor in North Macedonia’s Elections and EU Prospects | German Marshall Fund of the United States (gmfus.org) - The May 8 elections will be crucial for North Macedonia’s EU membership negotiations, with the campaign dominated by complicated relations with Bulgaria and a pro-Russia party leading in the polls.
DeSmog Launches Investigation Into Food and Farming Misinformation Ahead of EU Elections - DeSmog - WhatsApp and other closed messaging platforms have proven to be a popular channel to circulate disinformation and hate speech with a view to gaining electoral advantage.
Far right newspaper promotes climate disinformation on Meta | Global Witness - A Global Witness investigation has found that The Epoch Times is targeting people in the UK with adverts on Facebook and Instagram that deny the existence of climate change and question its severity.
US election disinformation targets non-citizen voting (france24.com) - Illegal immigration on the US southern border is a top talking point among Republican politicians, but some are taking it a step further by promoting disinformation about non-citizens voting in the presidential election.
Extremist Militias Are Coordinating in More Than 100 Facebook Groups | WIRED - After lying low for years in the aftermath of January 6, exclusive reporting shows, militia extremist groups and profiles have been quietly reorganizing and ramping up recruitment and rhetoric on Facebook.
Trump campaign debates joining TikTok, seeing opportunity, fearing backlash - The Washington Post
India’s TikTok ban didn’t even slow disinformation. We need a better solution. | The Hill - Comparison between American and Indian contexts.
TTP - Facebook Black Market for Ad Accounts Looms Over India Election (techtransparencyproject.org) - Facebook hosts a thriving black market for fake and stolen accounts. Some sellers are offering accounts that can run political ads in India, raising election interference fears.
Indian politicians spend big on AI election content - Rest of World - As India’s political parties pay for AI ads, startups say they have set their own “ethical” boundaries around deepfakes.
OpenAI Releases ‘Deepfake’ Detector to Disinformation Researchers - The New York Times (nytimes.com) - The prominent A.I. start-up is also joining an industrywide effort to spot content made with artificial intelligence.
Facebook’s AI Spam Isn’t the ‘Dead Internet’: It’s the Zombie Internet (404media.co) - Facebook is the zombie internet, where a mix of bots, humans, and accounts that were once humans but aren’t anymore mix together to form a disastrous website where there is little social connection at all.
Chatbots recommend disinformation and fear mongering, tech companies tighten restrictions (nos.nl) - Google and Microsoft are limiting the answers their AI chatbots provide in response to queries about the European elections. Their move follows an investigation by Nieuwsuur, which found that the chatbots provided answers violating their own policies and promises.
Fact checking around the elections starts now - Namibia Fact Check - Questionable claims and statements are already starting to dominate as political rhetoric ramps up ahead of the elections later this year.
How journalists can combat political disinformation in a world of echo chambers and deepfakes - PEN America - The National Press Club Journalism Institute hosted a conversation on May 1 between journalists and experts who laid out the scope of the disinformation problem – and how reporters can combat it.
Why the U.S. Intelligence Community Needs an OSINT Agency | Lawfare (lawfaremedia.org) - The establishment of a dedicated OSINT agency would be a step towards reconfiguring the IC for the challenges of the information age.