Online disinformation is evolving fast, and Europe is facing growing threats from foreign interference and false information. Debunk.org is a disinformation analysis centre that monitors online false narratives, fact-checks claims and exposes coordinated manipulation efforts. Viktoras Daukšas, head of the organisation, talked to us about their work in tracking these campaigns. He explained the techniques and patterns used in modern disinformation efforts, the challenges of identifying manipulation across different platforms, and the steps that institutions, civil society and individuals can take to protect democratic debate in Europe.
Based on your recent analyses (of pro-Kremlin narratives from the Telegram news channel Belarusian Silovik surging on X, of deepfake scams, and of a campaign to discredit Ursula von der Leyen), what are the most prominent disinformation patterns you are currently observing in Europe? How do fakes work? Which techniques are proving most effective today, and why?
The Debunk.org team observed a prominent pattern: cross-platform content laundering, which is also a new way to circumvent EU sanctions on Kremlin media. Content from sanctioned websites is rewritten with AI and spread through newly created disinformation outlets, such as the Pravda network, or shared on social media channels. Narratives often originate in closed or semi-closed spaces such as Telegram, where state-linked or proxy actors operate with limited oversight, and are then repackaged for platforms like X. In our investigation into Belarusian silovik-linked content, we saw a small set of inauthentic or impersonating accounts repeatedly amplifying the same source material, using AI to localise it and tailor it to different audiences to increase reach.
A second pattern is event-driven political discreditation, where formal democratic procedures are deliberately instrumentalised to launch disinformation campaigns. In the case of the no-confidence vote targeting European Commission President Ursula von der Leyen, the vote itself became a vehicle for a coordinated discreditation campaign. Although it was evident in advance that there were insufficient votes to remove the President, the process was still pursued in order to amplify pre-existing narratives of corruption and illegitimacy.
Finally, we increasingly see disinformation converging with fraud, particularly through paid advertising and deepfake-enabled scams. Here, the goal is not only political manipulation but also direct financial harm, often achieved by impersonating trusted media outlets, public figures or institutions. These campaigns exploit platform advertising systems to buy reach, and the resulting ads are difficult for users to distinguish from legitimate content.
Can you walk us through your debunking process? What signals or evidence typically allow you to identify coordinated or manipulated content?
Our debunking process combines narrative analysis, behavioural signals and technical verification, often in collaboration with partner organisations.
We start by identifying the upstream source of a claim and mapping how it spreads across platforms. Coordination signals include unusually high posting frequency, synchronised amplification around key events, repeated linking to the same origin, and accounts that impersonate real people or organisations. At the content level, we look for recurring phrasing and templated storytelling.
We also assess scale and velocity: large volumes of content appearing in short timeframes can suggest automation or coordinated behaviour. While tools and techniques vary by case, the key is triangulating multiple indicators of manipulation rather than relying on a single signal.
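To make these coordination signals concrete, here is a minimal sketch in Python, assuming hypothetical sample data, field names and thresholds. It illustrates one indicator only, synchronised amplification of the same source by several distinct accounts within a narrow time window, and is not Debunk.org's actual tooling.

# Illustrative sketch of one coordination signal: several distinct
# accounts amplifying the same source link within a short time window.
# All data, thresholds and names below are hypothetical.
from collections import defaultdict
from datetime import datetime, timedelta

posts = [
    # (account, UTC timestamp, linked source)
    ("acct_a", datetime(2024, 6, 1, 12, 0), "example-outlet.net/story1"),
    ("acct_b", datetime(2024, 6, 1, 12, 2), "example-outlet.net/story1"),
    ("acct_c", datetime(2024, 6, 1, 12, 3), "example-outlet.net/story1"),
    ("acct_a", datetime(2024, 6, 1, 12, 5), "example-outlet.net/story2"),
]

WINDOW = timedelta(minutes=10)  # "synchronised" = same link within 10 minutes
MIN_ACCOUNTS = 3                # distinct accounts needed to flag a burst

# Group posts by linked source, then scan each source for bursts of
# distinct accounts posting it within the window.
by_source = defaultdict(list)
for account, ts, source in posts:
    by_source[source].append((ts, account))

for source, items in by_source.items():
    items.sort()
    for start, _ in items:
        in_window = {acct for t, acct in items if start <= t <= start + WINDOW}
        if len(in_window) >= MIN_ACCOUNTS:
            print(f"possible coordinated amplification of {source}: "
                  f"{sorted(in_window)} within {WINDOW}")
            break

On the sample data this flags story1, posted by three accounts within three minutes. In practice, such a signal would be only one input, to be triangulated with the behavioural and content-level indicators described above.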
Foreign information manipulation and interference (FIMI) analysis today is rarely a solo effort. Effective responses increasingly depend on information-sharing networks like FIMI-ISAC that connect researchers, journalists, civil society and public institutions, allowing threats to be identified and contextualised quickly.
Your analysis of disinformation targeting Ursula von der Leyen highlights broader trends in political discreditation. What does this case reveal about how EU-level institutions or figures are framed online today?
This reflects a wider FIMI strategy: eroding trust and weakening EU institutional authority so that official information, media, fact-checking or policy responses are more easily dismissed. By repeating pre-existing fabricated narratives and accusations across Kremlin-aligned outlets, channels and influencers, these actors attempt to build large-scale campaigns that manipulate public opinion.
Based on your research, what skills do citizens most urgently need today to protect themselves from disinformation and deepfake content? Where do you see the biggest gaps in public digital/media literacy?
The most urgent skills today relate less to fact memorisation and more to recognising manipulation techniques.
Citizens need strong source-checking habits, including lateral reading, which involves leaving a website to check what other credible sources say, and basic verification of who is behind a claim. There is also a growing need for synthetic-media awareness: understanding that a convincing video or audio can be fabricated, and that visual realism is no longer proof of authenticity.
To help address these gaps, Debunk.org developed InfoShield, a free 45-minute online course that has already been completed by more than 5 000 citizens. The course focuses on practical, everyday skills for recognising manipulation, emotional framing and deceptive content in digital environments.
At the professional level, there is also a growing need for trained specialists who can systematically identify and respond to foreign information manipulation and interference. For this purpose, we offer FIMI101, a professional e-learning course designed to certify analysts working in this field. Participation in these professional courses directly supports Debunk.org’s continued research, monitoring and public-interest activities.
Viktoras Daukšas has been at the helm of the independent technology think tank and NGO Debunk.org for eight years. Debunk.org analyses foreign information manipulation and interference (FIMI) and coordinated inauthentic behaviour (CIB). Together with its partners, it researches disinformation in 22 countries by combining expert knowledge with AI-driven technologies. Debunk.org also runs educational media literacy campaigns that teach people how to spot fakes online and shield themselves from disinformation.