Overt / Covert Cooperations & Collaborations
The U.S. 'Civil War' Narrative - Big Tech Likes to Talk Big - Revival of the Weimar Triangle - 'Portal Kombat' Infrastructure and Tips to Uncover It - DISARM Certified Courses Announcement
Hey there.
This week, instead of discussing current elections, I think it’s time to mention actors and the cooperation they appear to be developing to respond to this year’s multiple elections.
Both threat actors and targeted audiences are attempting to coordinate their efforts, aiming either to attack or to secure electoral processes. Will these attempts yield results? It’s hard to tell, but if they do, these collaborative efforts might have a long-lasting impact on shaping international affairs after 2024.
Let’s dive in!
The Far-Right - Russian - Chinese Connection
Whether or not you live in the U.S., you may have seen images or videos of U.S. citizens driving trucks and campers across the country to gather at Texas’ border with Mexico.
Participants arrive to attend the Take Our Border Back caravan in Texas on February 3, 2024. Credits: Lokman Vural Elibol/ Getty Images.
You may have read or heard words about an ‘invasion’ and a ‘civil war’. You may even have heard the word ‘Texit’.
But who did you see, read, and hear it from?
The U.S. Supreme Court’s decision allowing the federal government to remove the razor wire previously installed at Texas’ border on Governor Abbott’s orders has fueled a lot of anger and distress in the American debate.
Fears of an ‘invasion’ by Mexican migrants have been exploited. Resentment and anger against the federal government and the Biden administration have been channeled into actions contesting that decision, while sowing doubt about U.S. democratic institutions. This instrumentalization of local vulnerabilities has amplified an existing narrative of a struggle between Texas and the federal government, one that could lead to a ‘Texit’.
‘Texit’ refers to Texas secession movements and echoes the secession of Texas from the Union in 1861 to join the Confederacy, which contributed to the start of the American Civil War. The term sparks strong emotions and has been leveraged for years. Nonetheless, it was again given coverage in early 2024 by various groups with one shared interest: shaking U.S. democracy.
According to Devin Burghart, executive director of the Institute for Research and Education on Human Rights, quoted in this Wired investigation: “Data we collected tells us emphatically that the standoff between Texas and the federal government has become a magnet for far-right vigilantism.”
U.S. far-right extremist groups, sometimes labeled the alt-right, such as the Proud Boys, together with conspiracy movements including QAnon, leveraged the Supreme Court’s decision to encourage political action to defend Texas’ border. According to Wired, Telegram was used to coordinate efforts to organize the rally at the Texas-Mexico border. For extremist groups such as the Proud Boys and the Aryan Freedom Network, it is also a way to recruit new members, according to Heidi Beirich, co-founder of the Global Project Against Hate and Extremism.
What is striking about the U.S. extremist and conspiracy movements is how directly they appear to act on American officials’ statements. They do not look like separate movements operating on their own. During these events, they seemed to take their orders straight from influencers and political figures, such as Governor Abbott and the 25 Republican governors who called to defy the federal government.
At the same time, calls for civil war were pushed, this time not by domestic actors but by foreign ones: Russia and China.
Russian actors conducted a mostly overt operation, with a few covert activities, to amplify the ‘civil war’ narrative. According to Wired’s investigation, Russian state officials, state media, influencers and bloggers commented on the events and sought to amplify calls for political violence. For instance, one Sputnik correspondent tweeted a video stating: ‘There’s a big convoy of truck drivers going down there. So, it can very easily get out of hand. It can genuinely lead to an actual civil war, where the US Army is fighting against US citizens’.
Russian actors used all kinds of assets, from state media to bots and trolls, to amplify this narrative and encourage the organization of rallies in the U.S. The tactic is reminiscent of what the Internet Research Agency (IRA) attempted during the 2016 U.S. election. According to Caroline Orr, in her newsletter Weaponized, there was a sudden surge of Russian-speaking Twitter accounts promoting Texit, a tactic already used in 2016 by the IRA. This time, bot accounts linked to the Doppelganger/RRN campaign amplified the narrative as well.
What is striking in this Russian information manipulation campaign is that the Russian actors who infiltrated U.S. far-right and extremist Telegram channels were spotted by the channels’ users, according to Wired’s investigation. Wired reported that one US-based member of a Telegram channel said of the Russians infiltrating the group: “They want a civil war/chaos more than anything. What’s bad for America is great for Russia.”
This is another example of how hard it is to distinguish domestic actors from foreign ones. In this particular case, the domestic actor acknowledges the foreign threat actor’s activities and how those foreign activities and motives might align with its own. Does that make the domestic actor an accomplice of the foreign threat actor?
While Russia was overtly calling for civil war, Chinese international English-speaking media outlets did not explicitly frame their headlines to support this narrative, although they evoked how the standoff “might further split America” or how “nearly 90% believe ‘U.S. against U.S.’ may become the norm”.
Nonetheless, there are signs that Chinese actors conducted a covert online information operation. The BBC reported that in China, the narrative that Texas had ‘officially declared war to secede from the U.S.’ was trending on platforms such as Weibo. Pictures and videos, stripped of their original context, were used to support this narrative. For example, footage of military tanks circulated with the claim that these were American tanks sent to defend the border.
This narrative was also pushed by Chinese influencers, and it was not necessarily corrected at first, neither by Chinese platforms nor by local Chinese citizens who, lacking access to verified sources in a censored environment, could not assess the real situation.
What is worth noting is how this civil war narrative spread from U.S. platforms to Chinese platforms, an online environment that blocks foreign outlets. The spread relied notably on cherry-picking certain U.S. outlets whose headlines and content helped produce a distorted image of the situation in the U.S.
Finally, one can wonder whether these threat actors - the U.S. extremist groups, Russia, and China - collaborated to amplify this ‘civil war’ narrative. While no assertion can be made, it is interesting to watch them amplify it at the same time. It is also interesting to see how the past repeats itself, with one breaking news event triggering one opportunistic narrative that threat actors then leverage.
The only thing new is the number of actors, which grows each time, perhaps mirroring the number of enemies piling up against democracies and following the proverb: “The enemy of my enemy is my friend”.
Big Tech’s Cooperation on AI
Last Friday, the main AI and tech companies signed a “Tech Accord To Combat Deceptive Use of AI in 2024 Elections”. It is a voluntary framework that advances seven goals: prevention, provenance (signals), detection, responsive protection, evaluation, public awareness and resilience.
These goals are supported by a commitment to a series of actions in 2024, including developing and implementing technology to mitigate risks, detecting the distribution of deceptive AI election content, and fostering cross-industry resilience.
This Tech Accord has received a lot of publicity in the last week, but it remains unclear what it will mean for protecting targeted audiences against malign uses of GenAI.
It reads more like a legitimization of Big Tech’s ongoing prevention and detection activities, and a justification of their need to develop resources and receive funding, than like a new framework that could directly ‘combat deceptive use of AI in 2024’.
Furthermore, the definition given of deceptive AI election content remains narrow, as it appears to cover only content directly linked to elections, as in the New Hampshire robocall case. Other types of GenAI content, for instance the Taylor Swift deepfakes, would not fit that definition.
However, most of the tactics used by threat actors do not directly mention an election; instead, they target polarizing domestic issues. So, will this Tech Accord prove relevant?
Let’s wait and see.
Another Week, Another Framework
Last Friday as well (what a productive Friday), Canada, the U.S. and the U.K. jointly endorsed a framework to counter foreign information manipulation threats.
It is not a new framework, but the one published by the U.S. State Department a few weeks ago. It aims to provide a standardized approach to tackling the challenge and building ‘interoperable and complementary systems’.
This endorsement has received positive coverage from some civil society actors who underscored the importance of cooperation with civil society and government transparency.
However, the foreign information manipulation threats targeting Canada, the U.S. and the U.K., the three English-speaking G7 countries, are not the same as those targeting the ‘Global Majority’, as discussed in previous editions of this newsletter. While the framework remains generic, its endorsement by only three countries, which belong to the same club, feels like thumbing their noses at the rest of the world.
Or perhaps it was a reactive move to the other cooperation effort that was showcased two weeks ago: the revival of the Weimar triangle.
The Revival of The Weimar Triangle
On February 12, France, Germany and Poland announced efforts to revive the Weimar Triangle, a format created in 1991 to deepen and unify cooperation between the three states, as well as within the EU and NATO. In 2024, the format’s objectives remain relevant, given the war in Ukraine and the new Polish leadership.
Among these efforts, France, Germany and Poland announced that they would jointly combat information manipulation operations coming from Russia. The announcement stated:
‘We have agreed to set up a Weimar alert and response scheme between France, Germany and Poland on Foreign Information Manipulation and Interference, and we will work towards further EU mobilization in this field, with a view to ensure more effective measures from online platforms.’
This announcement came on the same day that VIGINUM, the French agency tasked with countering foreign information manipulation and interference (FIMI), published a new report uncovering the activities of a structured and coordinated pro-Russian propaganda network, named ‘Portal Kombat’, targeting Ukraine and several Western countries.
‘Portal Kombat’, a Key Investigation Uncovering Websites’ Infrastructure.
‘Portal Kombat’ is a network of 193 information portals belonging to the same digital infrastructure, divided into three ecosystems: one targeting Western countries supporting Ukraine, one targeting Russian-speaking audiences in Ukraine, and one targeting Russian and Ukrainian audiences. These ecosystems were not created at the same time; the one targeting Western countries is the most recent and is linked to the war in Ukraine.
VIGINUM’s report is significant for two reasons: it gives us deep, clear insight into the tactics, techniques and procedures (TTPs) used by Russia, and it provides an effective methodology for investigating websites’ infrastructure. Website infrastructure is often overlooked compared to social networks, but it is key to deriving the technical indicators needed to characterize online activities.
‘Portal Kombat’ Tactics, Techniques and Procedures (TTPs)
VIGINUM observed the following TTPs used by the ‘Portal Kombat’ network:
Search Engine Optimization: the website pravda-fr(.)com and others appear among the first results on various search engines, including Google, when long-tail keywords are used as a query.
Automated Publication of Content: the websites do not produce any original content; instead, they massively copy-paste texts from three categories of sources identified by VIGINUM: social media accounts (Telegram, VK), Russian press agencies, and official institutions and actors.
Segmentation of Audiences: the content distributed varies across ecosystems, and also within each ecosystem. For instance, VIGINUM observed an offensive distribution strategy targeting Russian-speaking audiences in Ukraine, with content chosen according to the targeted locations and their demographic characteristics. VIGINUM mapped 41 towns based on the network’s domain names.
‘Portal Kombat’ Infrastructure
VIGINUM’s report underscored the importance of looking at details to make sense of connections between websites.
The two reports are full of little tools and tips that could be handy for any investigator:
Using the source code, VIGINUM identified the same string, ‘ZOV’, across 41 websites.
It also identified an identical favicon on many of these websites. A favicon is the small website icon displayed in browser tabs and search engine results; investigators can pivot on it to identify other websites using the same favicon.
It observed that the websites of one ecosystem had IP addresses sharing a similar network ID (NET ID) and were hosted on Russian servers. It used the NET ID to determine the websites’ real IP addresses.
It also noticed that two ecosystems had IP addresses belonging to the same autonomous system, a network or set of networks governed by a single entity or organization.
Some IP addresses also shared another common characteristic: an ETag, an HTTP response header used for cache validation; identical ETags across hosts can hint at shared infrastructure.
It consulted archived versions of the websites, using tools such as web.archive.org, to identify previous versions that could give clues about their provenance. In the footer, it identified the logo of a company, TigerWeb, preceded by the words « Разработка и техподдержка », meaning "development and technical support" in Russian.
It also found an email address in the footer of these websites that appears to belong to a certain Yevgeny SHEVCHENKO (Евгений Александрович Шевченко).
Using Russian OSINT resources, it found this person’s tax identification number as well as the associated company.
On the company’s website, it found links redirecting to one of the domain names of one of the ecosystems.
Drawing on its past investigations into the Doppelganger/RRN campaign, VIGINUM observed a connection between some of that campaign’s assets and the Portal Kombat network.
It also uncovered links to the Inforos infrastructure, which the U.S. has attributed to Russian intelligence services. By investigating the background and past activities of TigerWeb’s director, especially his activities in Crimea, and drawing on open-source knowledge such as OpenFacto’s reports on the Inforos network, VIGINUM formed the hypothesis that TigerWeb could be a service provider of the Inforos network.
I hope this extracted methodology proves useful for investigators curious about website infrastructure. And if you are looking for expert commentary on VIGINUM’s investigation and further explanations about website investigations, here’s a great article by Ari in his newsletter Memetic Warfare:
And if you want to know more about the TigerWeb - Inforos’ connection, I encourage you to read the second part of Portal Kombat as well as OpenFacto’s investigations.
Your Press Corner
Here are the weekly readings to keep you connected to the conversation on global elections and information operations:
Russia-linked information campaign aims to ‘sow doubt’ among Ukrainians (therecord.media) - Two years after the invasion of Ukraine, Russian hackers are using dark, satirical emails to convince Ukrainian users that a war with Russia is not worth fighting. The campaign is labeled Operation Texonto.
Kremlin Propaganda Aims to Destabilize Ukraine from Within (kyivpost.com) - disclosure of TTPs used in Kremlin's Maidan-3 propaganda campaign targeting Ukraine.
Manipulating Memory: Rewriting School History Books - EUvsDisinfo - A deep dive on Putin’s new history textbooks.
French Ambassador concerned over Russian disinformation operations ahead of EU elections (thejournal.ie) - Vincent Guérend, French Ambassador in Ireland, talks about the latest Russian information manipulation campaigns targeting France and the EU.
Estonian intelligence warns about Chinese state-linked Tik Tok big data collection | News | ERR - “China is creating an integrated political-technological ecosystem, by exploiting Chinese digital companies, especially Tik Tok, and the big data they collect for developing comprehensive artificial intelligence.” With a special disclosure of TikTok ownership scheme.
Estonia thwarts ‘shadow war’ attack, Prime Minister Kaja Kallas tells CNN | CNN - beyond FIMI, there is a whole realm of Russian hybrid threats taking place and targeting the Baltic states.
Winning the China-US Narrative Competition in Southeast Asia – The Diplomat - what will it take for the US to win the narrative battle in Southeast Asia?
Disinformation, fake images of Prime Minister Kishida spread on social media | NHK WORLD-JAPAN News - GenAI has reached Japan.
Deepfakes swirl in Korea ahead of general elections - The Korea Times - and in South Korea too.
North Korea and Iran using AI for hacking, Microsoft says | Hacking | The Guardian - signals are emerging that AI is being used by foreign state actors to organize offensive cyber operations according to Microsoft.
Sora, OpenAI's new text-to-video tool, is causing excitement and fears. Here's what we know about it | Euronews - Meet Sora - OpenAI's new text-to-video generator.
Seeing isn't always believing: video edition (substack.com) Testing Sora by Conspirador Norteño.
OpenAI shuts down accounts linked to 5 nation-state hacking groups (therecord.media) - OpenAI said that it terminated accounts on its services being used by threat actors linked to China, Russia, Iran and North Korea.
When does something become political? (substack.com) - an excellent analysis by Katie Harbath, drawing upon her previous experience at Meta and reflecting on past and existing debates about Meta and political content.
Meta is cutting funding for fact-checking on WhatsApp ahead of elections - The Hindu - According to the Hindu’s article on February 15, Meta is cutting payments to news organisations that fact-check misinformation on WhatsApp.
MCA’s WhatsApp Helpline: Curbing The Spread of AI-Generated Misinformation In India | Meta (fb.com) - According to Meta’s announcement on February 19, Meta is launching a dedicated fact-checking helpline on WhatsApp with the Misinformation Combat Alliance to combat GenAI misinformation in India. Did they read The Hindu’s article?
Pakistani content moderators are exhausted and stuck - Rest of World - a very distressing issue; it’s worth taking the time to read about it.
People do change their beliefs about conspiracy theories—but not often | Scientific Reports (nature.com) - new research contributing to the understanding of our cognitive parameters.
Announcement: DISARM Certificate Analyst Course
You’ve heard me talk a lot about TTPs, or tactics, techniques and procedures. It is a common term in the threat intelligence and cyber space, and there is a dedicated framework of TTPs for information operations called DISARM.
The DISARM Foundation, which maintains the framework online, and Alliance4Europe are offering a new DISARM Certificate Analyst Course to ensure that we are all able to describe information operations with a shared, concrete understanding of the tactics used. Guess what? I may be one of the trainers in the future!
Check the courses and the dates at Disarm Certification - Alliance4Europe !
Thank you for taking the time to dive into this newsletter and let me know what you thought about it!