A longtime expert on disinformation, Felix Kartte is a senior fellow at the Germany-based Mercator Foundation and a senior adviser at Reset, an NGO that researches social media's impact on democracy.
In the past, Kartte worked at StratCom, the EU diplomatic service's strategic communications division. While there, he helped craft EU policy on countering digital threats.
In an interview with RFE/RL's Tatar-Bashkir Service, Kartte talks about the tactics authoritarian regimes such as Russia use to spread disinformation globally, highlighting recent malign efforts in Moldova during its presidential election and referendum on integration into the European Union.
RFE/RL: You've detailed how suspected Russian actors allegedly bribed politicians, bought votes, and spread deepfake videos targeting Moldovan President Maia Sandu during the recent vote and referendum. Sandu herself called it an "unprecedented attack on democracy." Could you discuss some of the specifics of this Kremlin-backed campaign?
Felix Kartte: There's plenty of reporting in international media detailing these tactics. For instance, in the weeks before the elections, Moldovan authorities found large sums of cash on passengers returning from Moscow via indirect flights, amounts that raised suspicion due to their frequency and scale. In a single day, authorities seized $1.5 million, suspecting it was intended for political influence.
Moldova's chief of police told the BBC that, as of early October, nearly 130,000 Moldovan voters had received payments linked to this scheme, representing around 10 percent of Moldova's active electorate. The country's chief anti-corruption officer also traced suspicious cash flows linked to fugitive oligarch Ilan Shor, who is now living in Russia. She noted Shor had been directing funds to influence voters to oppose Moldova's pro-EU referendum and its path toward European integration.
Sources linked to the Kremlin also infiltrated social networks in Moldova. For instance, they spread deepfakes -- artificially generated videos that are now cheap and easy to produce.
In one such video, Sandu was shown mocking the poverty of the country's citizens. In another fake video, which Sandu had to publicly debunk in her New Year's speech, she was falsely shown banning Moldovans from drinking berry-infused tea, a drink that many Moldovans love.
These are standard cases of disinformation -- tactics designed to sow distrust and fuel anger against democratic governments.
RFE/RL: You've written about the "mirror tactic" that authoritarian regimes like Russia employ in their disinformation campaigns. Could you explain how that works?
Kartte: Sure, "mirror politics" is a tactic where a propagandist accuses opponents of the very behaviors or strategies they themselves are engaging in. The Kremlin frequently uses this tactic to confuse audiences and deflect blame. For instance, Kremlin propaganda often accuses Ukraine and the West of engaging in "Nazi" behavior and attempting to destabilize Russia, when it is, of course, the Kremlin itself that tries to wipe out the democratic governments of foreign countries and to destabilize and subdue their societies.
Similarly, the Kremlin keeps accusing "the West" or particular groups, such as LGBTIQ people, of "interfering" in Russia, when it is obviously Russia that is meddling in other countries' elections and media. This is mirror politics in action -- the aim is to blur the lines between true and false, right and wrong, and ultimately to make people indifferent to who governs them, because they can no longer see a difference between democracy and autocracy.
This tactic is effective, more in some countries than in others. In Slovakia, for instance, public support for NATO membership has decreased from 72 percent to 58 percent. According to the [Slovak-based] think tank GLOBSEC, the majority [of people] in Slovakia no longer attributes primary responsibility for the war in Ukraine to Russia.
RFE/RL: How does Russia exploit its war against Ukraine in its disinformation campaigns?
Kartte: As I mentioned, Russia's full-scale [2022 invasion] of Ukraine was both preceded and accompanied by a comprehensive disinformation campaign aimed at misportraying Ukraine as a "Nazi regime" and creating false pretenses for the invasion. A key aim of Russian disinformation has been to undermine public support for Ukraine in Western countries like the U.S. or Germany.
For instance, the so-called "Doppelganger" campaign spreads pro-Russian narratives through fake websites and social media accounts, even spoofing credible news outlets to gain traction. Key themes include questioning the effectiveness of sanctions on Russia, amplifying negative sentiments toward Ukrainian leaders, and stoking fears about Ukrainian refugees in Europe.
RFE/RL: How does conservative rhetoric fit into Russian disinformation efforts?
Kartte: Russian propaganda often leverages conservative rhetoric, focusing on themes like "woke-ism," family values, and LGBTIQ rights to resonate with conservative citizens in Europe who may feel societal changes threaten traditional norms. A prominent example is the "Gayrope" narrative, framing Europe's tolerance as moral decay and a threat to family values.
In this context, [Russian President] Vladimir Putin is portrayed as a strongman who can "save" Europe from what is labeled as the "woke virus," echoing the portrayals of figures like [President-elect] Donald Trump in the United States.
Of course, the Kremlin's interest in these conservative values is purely strategic. The Russian government and its illiberal allies across Europe, far from genuinely defending traditional values, pose a threat to core conservative pillars such as the European peace order, free markets, and even Christian values such as honesty, compassion, and humility.
The Kremlin's strategy aims to manipulate socially conservative audiences and appeals to those who feel overwhelmed by societal changes, while simultaneously blurring the lines between traditional center-right parties and far-right extremists. It invokes common conservative concerns to capture center-right audiences, and then feeds them with content that fuels anxiety and hostility.
Russia's aim here is ultimately to strengthen extremist political parties that have a pro-Russian agenda -- at the expense of traditional conservative parties.
RFE/RL: How crucial is technology, namely artificial intelligence, in disinformation campaigns today?
Kartte: I believe we often overestimate the role of new technologies like AI in disinformation. Consider the U.S. election: According to U.S. authorities, Russia interfered in the electoral process by sponsoring domestic U.S. YouTube influencers to spread pro-Russian content, notably supporting Trump.
While these influence operations are costly and effective, they don't primarily rely on AI tools like ChatGPT. The real harm stems not from AI itself but from Russia's bad intentions and the significant resources it devotes to disinformation.
RFE/RL: How prepared is Europe to counter disinformation campaigns? What could be done better?
Kartte: The EU has taken substantial steps to combat disinformation, including establishing a dedicated task force at its diplomatic service over a decade ago, supporting independent media and researchers, and regulating social media platforms.
Yet, Russian propaganda continues to stay a step ahead by masking its origins through intermediaries and commercial entities, complicating attribution. Additionally, some political parties within the EU actively propagate pro-Russian narratives, making effective countermeasures more challenging and highlighting the complexity of tackling disinformation within the EU's own borders.