China and Russia Leaks. What is in the pipeline?
China Leaks: Delving into I-Soon, a Chinese hacking company - Russia Leaks: How Putin is preparing his re-election - What is the point behind describing and exposing FIMI networks? - Belarus & AI.
Hey there. In the last two weeks, two leaks revealing Russian and Chinese cyber and information activities have been disclosed, shedding light on these countries' manipulative behaviors to control the internet.
Leaks are critical to understanding how threat actors operate, their motives, as well as their vulnerabilities. However, leaks do not always include information about their origin and how they were obtained. They also raise ethical and legal questions for OSINT investigators, notably about the private data they often contain and the extent to which it can be leveraged.
With these thoughts in mind, let’s look at what’s out there.
What to expect (beta test of a table of contents to facilitate navigation):
Chinese Infrastructure: The Role of Private Companies in CCP Cognitive Warfare
A Summary of I-Soon, a Chinese Hacking Company, Tactics, Techniques and Procedures (TTPs)
For What Purpose?
Not One But Two Leaks
For What Purpose (Bis Repetita)
Your Press Corner
Belarus Election: The Case of AI In Authoritarian Regimes
Chinese Infrastructure: The Role of Private Companies in CCP Cognitive Warfare
There’s a lot to process in what has been called “the most significant leak of data linked to a company suspected of providing cyberespionage and targeted intrusion services for the Chinese security services” by the director of strategic and persistent threats at Recorded Future. Although the leak concerns a Chinese hacking company named I-Soon (安洵信息), it is very instructive and informative as to where Chinese information operations could be heading.
But before we delve into the latest disclosed tactics, techniques and procedures (TTPs), I would like to address a recurring question regarding how China is structurally organised to conduct its cognitive warfare, and the links between the CCP and private companies.
When describing the latest PRC TTPs, I often come across expressions suggesting that China is imitating Russia and Iran by employing private companies to conduct its cyber-enabled operations abroad. I find this comparison both a reduction of what Chinese operations are and a misconception of the PRC's internal organization.
It is true that the PRC has been resorting to external private companies to conduct cyber-attacks, primarily to build databases both domestically and abroad. However, these companies are not considered by the CCP as external actors, but as operators expected to implement the Party's objectives.
The PRC's perspective has been clearly explained in Paul Charon and Jean-Baptiste Jeangène Vilmer's work 'Chinese Influence Operations: A Machiavellian Moment'. In 2015, to expand its influence over the private sector, the CCP elevated civil-military fusion to a national strategy, both to facilitate innovation in certain sectors and to convert civilian technological innovations into military gains. Then in 2017, Article 7 of the National Intelligence Law obliged all Chinese companies and citizens to "support, help and cooperate with national intelligence efforts". The authors of the report concluded: "no important company in China can prosper without aligning itself with the Party."
At the time of the report, they focused on two companies, Global Tone Communications Technology Co. LTD (GTCOM) and Shenzhen Zhenhua Data Information Technology Co. Both are illustrative of Chinese private companies collecting and managing huge databases drawn from multiple Western and Chinese platforms and providing direct support to China's national security apparatus, including military intelligence and propaganda, according to Samantha Hoffman's report for ASPI.
These two companies' activities bear a strong resemblance to those of I-Soon, the company at the heart of the recent leak.
A Summary of I-Soon, a Chinese Hacking Company, Tactics, Techniques and Procedures (TTPs)
The leak was discovered by a Taiwanese threat intelligence analyst, @Akazasekai_, who provided a comprehensive review of the documents on X and Mastodon. Articles and experts have also extracted and translated useful information from the leaked documents, which are believed to be authentic according to Google's Mandiant Intelligence. What can we learn from them?
I-Soon started its activities eight years ago, in 2016. It is a security contractor for Chinese state security agencies, including the Ministry of Public Security, and is registered in Chengdu, the capital of the Chinese province of Sichuan.
The start of its activities fits the timeline described above, namely the civil-military fusion strategy. It also coincided with geopolitical and domestic events in 2016 that may have compelled the Chinese government to accelerate its hybrid tactics against other states.
In January 2016, Taiwanese DPP president Tsai Ing-wen was elected for the first time, signaling Taiwan's progressive departure from the 'One China policy'.
In July 2016, the US Democratic National Committee email leak was also revealed; it was later attributed to Russian threat actors affiliated with Fancy Bear, and may have given Xi Jinping ideas about future TTPs.
At the same time, China's President was aiming to be designated 'core leader', a title meant to strengthen his power ahead of the renewal of the Central Committee in 2017. His internal political ambitions might also have pushed for more domestic 'digital authoritarianism'.
That same year, the Trans-Pacific Partnership, the largest regional trade accord in history, collapsed, as neither Democrats nor Republicans supported it during their presidential campaigns, and President Trump signed a memorandum formally abandoning the deal in his first days in office.
All these events might have encouraged the PRC to go further down the road of cyber-enabled operations, giving more power to the MPS, which turned to companies such as I-Soon.
I-Soon is said to have operated a series of TTPs primarily aimed at accessing private data and building databases of personal information via disguised malware, such as Remote Access Trojans (RATs). It would be a lengthy process to list all the TTPs mentioned, but I found these key takeaways from @Akazasekai_ quite useful as a summary (see below).
Other noteworthy TTPs include a platform to collect and analyze email data, a platform to hack into Outlook accounts, and a reconnaissance platform using OSINT data.
According to other experts, these tactics correspond to ‘low-end cyberespionage capabilities’, while ‘high-end cyberespionage’ may still be under the direct control of the CCP.
I-Soon's list of victims is extensive, but the targets are concentrated mostly in East, South and South-East Asia, including Vietnam, Thailand, Malaysia, Cambodia, Indonesia, the Philippines, Hong Kong, Myanmar and India; Central Asia, including Afghanistan, Pakistan, Kazakhstan and Kyrgyzstan; Africa, including Rwanda, Nigeria and Egypt; Turkey; and… France.
Ouch. My previous school, Sciences Po, is part of the victims’ list. You may wonder why, but Sciences Po is home to many international students, including from China. It may have been an attempt to follow students enrolled in the university or to identify specific targets within the institution.
Other potentially targeted victims are mentioned in the employees' chat logs. They include the NATO Secretary General and the United Kingdom government.
For What Purpose?
Listing all these TTPs and victims can make us wonder: for what purpose?
Experts have already mentioned that this leak reveals or confirms several activities, including monitoring activities of ethnic minorities in China, running disinformation campaigns targeting overseas audiences, listening to public sentiment on social media, as well as disrupting Wi-Fi signals.
One part of the leak is particularly concerning with regard to the PRC's ambitions towards Taiwan. The company collected 495 gigabytes of Taiwanese road map files, containing information about Taiwan's roads, bridges and tunnels that could be used in the event of an invasion of Taiwan.
In the context of an election, how can these TTPs be leveraged?
One key strategy that comes to my mind is taking control of government and electoral bodies' websites:
Defacing them, or using them to publish forgeries or real leaks, in order to undermine trust in candidates or in election integrity,
Making them inaccessible on election day to prevent access to election information or to delay the publication of results.
These examples are not taken from my imagination but are supported by evidence of PRC cyber-enabled operations targeting Asian countries over the last ten years. For example, in 2018 Malaysia, Cambodia and Hong Kong were all targeted by Chinese cyber-espionage actors who used malware against government and other institutions' websites to undermine election integrity.
In one of the latest episodes of the Le Collimateur podcast, Alexis Rapin, a researcher specializing in foreign interference tactics, notes that one scenario that has not yet materialized is an attack against electoral infrastructure itself. In 2016, Russian threat actors accessed the electoral repositories of 30 U.S. states but did not modify voters' data, such as ID or driver's license records, to prevent them from voting. Could Chinese threat actors decide to do so in 2024?
Not One But Two Leaks
In a less publicized but equally important leak, given to the Estonian news outlet Delfi, a joint investigation by several European and independent Russian outlets revealed last week an "information war" conducted by the Kremlin against the Russian domestic audience to ensure Putin's election victory.
The documents, gigantic Excel files composed of minutes of work meetings, presentations, and reports, present the state of the information manipulation activities conducted by the Kremlin in Russia. These activities fall into three categories: activities targeting the presidential election, activities related to a broader informational-ideological war, and activities targeting the 'new territories', i.e. the occupied areas of Ukraine.
The leaks reveal the infrastructure created by the Kremlin to dominate the information ecosystem.
An extensive network of non-profit organizations (автономная некоммерческая организация / autonomous non-profit organization) has been created, fully controlled and funded by the Kremlin to carry out state propaganda.
One of the largest non-profit organizations is the Internet Development Institute (Институт развития интернета). It is in charge of content creation, such as mobile games and films, whose products are supposed to 'reinforce civic identity and spiritual and moral values' and to 'support and disseminate governmental content along new thematic lines'.
One of the most influential of these organizations is ANO Dialog. It handles content dissemination through content centers set up in the occupied territories of Ukraine to develop localized content, and it is responsible for 'targeted advertising' and 'essential analytical products in preparation for the elections'.
The dismantling of Ukrainian satellite equipment and the installation of Russian satellite sets (Russki Mir) in the occupied territories of Ukraine are also documented. Installing this new Russian infrastructure is the first step towards shutting down Ukrainian information channels before implementing a plan 'to counter the spread of Ukrainian mass media propaganda materials'.
A dedicated internet-blocking system, the "Automated System of Internet Security" (автоматизированная система безопасности Интернета), blocks and censors online content. It may be accompanied by other AI-supported projects to detect content on social media.
A new media network has reportedly been launched in the occupied territories.
The documents reveal how this infrastructure, namely the Internet, has since 2019 become a 'mandatory part of work' for disseminating the Kremlin's propaganda. For a deeper insight, I recommend researcher Kevin Limonier's very sharp comment on the leaks.
The leaks also reveal some of the TTPs leveraged to influence the Russian elections.
Secret sociological research and polling in the occupied territories of Ukraine, likely to 'test the temperature of society' and develop adequate narratives.
Exit polls for the presidential elections.
Offline events such as ‘cultural and educational events’ in the occupied territories.
The polls and sociological research are reminiscent of tactics used in Africa by Prigozhin to study targeted audiences and discourage people from voting or to influence others to vote for a certain candidate.
The leaks also reveal the targeted audiences of these activities.
Employees of the Ministry of Education and Science, and opinion leaders from within institutions overseen by this ministry, are listed in order to "increase [their] level of socio-political literacy" and to "monitor their political attitudes and voting preference".
Students are invited to contests and clubs to meet teachers, opinion leaders, and experts. The aim is to ensure that they will vote for the main candidate, thereby limiting the need to manipulate the actual number of ballots, a tactic called 'pre-rigging'.
Opinion leaders are created, trained and supported to create, disseminate, and amplify content, as well as to identify other opinion leaders and boost their own image.
In the context of elections, information manipulation activities are used by threat actors not only against foreign audiences but also against domestic audiences. In the case of Russia, like in the case of China, we can observe the necessary development of tools and infrastructure not only to influence, but also to control the information ecosystem in what can be called “digital authoritarianism”.
However, Russia's internet is harder to control than China's, due to the different ways the Internet historically developed in the two countries. According to Kevin Limonier et al., contrary to what happened in China, "the post-Soviet Internet emerged without any strong government regulations during the 1990s and 2000s, with a lot of Autonomous Systems created by private persons or small companies. But on the contrary to what happened back then in Europe or in the U.S., the immensity of the Russian territory, along with the lack of investments in large Internet infrastructure projects did not lead to any kind of “simplification” of the AS network."
The Russian leaks therefore underscore how cyber infrastructure is inherently linked to information manipulation activities, which seek to control the channels through which information is disseminated, across all layers of cyberspace.
This link raises questions about the relevance of the distinction we still maintain between cyber activities and information activities.
A distinction that heavily shapes how threat intelligence teams are organized and how training and academic courses are designed, and therefore our ability to detect and characterize the weaponization of the internet against democracies.
For What Purpose (Bis Repetita)
While foreign threat actors seem to be fairly straightforward about their intentions, the same cannot be said of democracies and the way they have been dealing with Foreign Information Manipulation and Interference (FIMI) activities.
In my recent conversations with other experts in countering FIMI, I have been struggling to answer one of their questions regarding current counter-measures: so what?
What is the purpose of exposing FIMI networks such as ‘Portal Kombat’ and ‘Doppelganger/RRN’ instead of sanctioning or enforcing previous sanctions that are not being correctly implemented or easily circumvented?
What is the purpose of using frameworks to describe FIMI, if not to leverage them in legal cases?
Why are we doing fact-checking when parts of our audiences do not care about facts?
I do not have the answers to these questions, but I have come across bits of answers over the last few years. And I think this time I will borrow an old framing from the author of one of my favorite books, Pierre Bottero: there are two answers to these questions, one from the Scientist and one from the Poet.
The Scientist believes that we need empirical evidence, gathered progressively through exposure, frameworks and fact-checking, to construct an effective response to hybrid threats such as FIMI. This demands an unprecedented collective effort, at a time when we are driven by individualist, narcissistic and capitalist values amplified by social networks.
I think the most advanced, comprehensive and inclusive response is the Institute for Research on the Information Environment (IRIE), which follows the idea of a CERN model for the information environment developed by Alicia Wanless and Jacob Shapiro.
The Poet’s answer is: we are Democracies. But who is ‘we’ and what are ‘Democracies’?
The Poet leaves that question unanswered, for us to write the next verse. To support the Scientist's response and give performative effect to the Poet's statement, it might be worth sitting down together and reflecting on the current state of our own vulnerabilities.
Your Press Corner
Here are the weekly readings to keep you connected to the conversation on global elections and information operations:
Russia’s 2024 election interference has already begun (nbcnews.com) - Russia is already spreading disinformation in advance of the 2024 election, according to former U.S. officials and cyber experts
Senate Intel chair warns US is ‘less prepared’ for election threats than in 2020 - POLITICO - “The NSA, CISA, ODNI, FBI literally have had no communications with any of the social media platforms on elections … since July of last year, and that ought to scare the hell out of all of us,” Senate Intelligence Chair Warner said.
How a Right-Wing Controversy Could Sabotage US Election Security | WIRED - for further context on Senate Intelligence Chair Warner’s claims, a closer look at the claims made by Secretary of State Mac Warner.
Doppelgänger | Russia-Aligned Influence Operation Targets Germany - SentinelOne - Doppelgänger aka RRN continues, targeting the upcoming EU Parliament Elections
Russia's tactics: Blockchain advocacy or AI disinformation in Kenya's elections? | Nation - A dire warning from Kenya on Russia’s blockchain advocacy ahead of Kenya’s elections.
“We can’t do this alone”: Nigerian fact-checkers teamed up to debunk politicians’ false claims at this year’s election | Reuters Institute for the Study of Journalism (ox.ac.uk) - Lessons learnt from Nigerian 2023 election.
Is democracy dying in Africa? Senegal’s slide into chaos bodes ill in a year of key elections | Global development | The Guardian - President Macky Sall’s decision to cling to power by postponing voting without offering a new date has thrust the country into chaos.
Beijing’s Post-Election Plan for Taiwan and Lai (foreignpolicy.com) - FIMI is not limited to election day.
PCG warns of surge in pro-China trolls on social media (tribune.net.ph) - “Expect an increase in pro-China trolls and influencers on social media who are undermining the country’s assertiveness for transparency initiatives in the West Philippine Sea, a Philippine Coast Guard official warned on Monday”.
Silenced voices: The X and VPN ban after Pakistan’s elections · Global Voices - Continued shutdown of X and VPN in Pakistan, two weeks after the election.
Rising internet shutdowns in India spark fears of authoritarianism before election | South China Morning Post (scmp.com) - while rising internet shutdowns in India are not sending positive signals regarding the upcoming election.
How Meta Is Preparing for the EU’s 2024 Parliament Elections | Meta (fb.com) - Meta will notably activate an Elections Operations Center to identify potential threats and put mitigations in place in real time. Let’s hope Meta will effectively coordinate with European stakeholders.
Cognitive ability mattered in the UK’s vote for Brexit, research shows (bath.ac.uk) - Increasing our knowledge about our current ways of voting.
And for the geeks of frameworks, the latest piece by Anais Meunier on revising the Diamond Model for Influence Campaigns - it is available in English, you just need to click EN on the right top of the page!
Belarus Election: The Case of AI In Authoritarian Regimes
You may remember we discussed how AI could be used in elections in authoritarian regimes, such as in Pakistan, to enable banned candidates to participate in the election.
Last weekend, Belarus held its parliamentary election and, unsurprisingly, put all measures in place to ensure the election of the four approved parties that have shown loyalty to Belarus' leader, President Lukashenko. These measures included banning opposition parties, making voting mandatory, excluding citizens living abroad, deploying street patrols and detaining political prisoners.
There would be much to say about Belarus, but I wanted to highlight one thing regarding the use of AI. Ms. Tikhanovskaya, who ran against President Lukashenko in the 2020 election, and by many accounts won, before escaping to Lithuania, fielded a GenAI candidate called Yas during this year's election.
Yas Gaspadar is an AI chatbot that describes itself as a 35-year-old from Minsk and was created by the Belarusian opposition a few weeks before the election.
I can't help but wonder: in our democracies, we fear malign uses of AI that could disrupt the electoral process and undermine election integrity, while in authoritarian regimes AI can be used to give hope of restoring election integrity.