Minutes of the 247th meeting of the INT section

The Sakharov Prize for Freedom of Thought is the EU’s highest distinction for human rights work. Awarded annually by the European Parliament since 1988, it pays tribute to individuals and organisations defending freedom of thought and fundamental rights. In 2025, it was awarded to two brave journalists who have paid a terrible price for defending human dignity and democracy in their countries: Polish-Belarusian journalist Andrzej Poczobut and Georgian journalist Mzia Amaglobeli, both still in prison and denied almost all contact with the outside world.

On 16 December in Strasbourg, the European Parliament held the Sakharov Prize award ceremony in the presence of representatives of the two laureates. The other two finalists also attended: representatives of journalists and humanitarian aid workers in Palestine and in all conflict zones, and Serbian students who have been protesting against corruption in their country for over a year.

‘This House stands in solidarity with Andrzej and Mzia in their struggle. We call for their immediate release, along with every person wrongfully imprisoned. We will keep up the pressure until everyone is free… Democracy takes work; it takes dedication; it takes the courage to act, even when the cost is unimaginably high. This is what this year’s laureates teach us’, said EP President Roberta Metsola.

Receiving the prize on behalf of her father, Jana Poczobut said: ‘You show that even when a person is taken away, their principles cannot be taken away. And even when someone is silenced, their voice continues to speak through others.’

Andrzej Poczobut has spent years reporting on the oppressive regime in Belarus and advocating for its Polish minority. Following a crackdown on the Union of Poles, he has been serving an eight‑year sentence in a penal colony since 2021. He is in solitary confinement in a concrete cell. His family has not spoken to or seen him for five years.

‘Every day we choose hope, because hope is the only thing that has not been taken from us. And there are many families whose stories echo ours… Your recognition cannot change the past, but it gives us something extremely precious for the future: the belief that justice and humanity still have a place in this world’, Ms Poczobut said in her emotional speech before the Parliament.

Mzia Amaglobeli, Georgian journalist and co‑founder and director of the independent media outlets Batumelebi and Netgazeti, received a two‑year prison sentence in January 2025 on charges related to an alleged confrontation with police during anti‑government protests. She became the first female journalist imprisoned for political reasons in independent Georgia.

In a speech read at the ceremony by her friend, Georgian journalist Irma Dimitradze, Ms Amaglobeli said she accepted the award on behalf of all political prisoners unjustly imprisoned and convicted for fighting for Georgia’s European future.

Ms Amaglobeli said Georgia had been captured by a regime that serves Russian interests and is destroying independent journalism, abolishing opposition political parties and dismantling NGOs by labelling them as foreign agents. It is also ruthlessly beating, fining, arresting and blackmailing those who have been protesting on Georgian streets since the announcement that their country’s EU accession process would be suspended.

‘It is my wish that you stand with Georgian society, its democracy and its European aspiration in the same way you stand for the freedom of your own countries... It must now be unmistakably clear that the force behind the horrors in Belarus, Ukraine and Georgia is moving closer to the heart of Europe. It is heading towards your homes, and we are merely in its way’, Ms Amaglobeli warned.

If Ukraine and Georgia were to be left alone in the face of Russia’s aggression, this would be an irreparable historical mistake for which we would all pay a heavy price.

‘The fate of our struggle does not depend on us alone, because our struggle is not only about us. We need your solidarity and support… Fight with us, fight for us. Fight as you would fight for the freedom of your own countries. Use every mechanism at your disposal, and do so before it’s too late’, Ms Amaglobeli concluded.

In Belarus alone, over a thousand people remain imprisoned for political reasons. In a recent article in EESC Info, exiled Belarusian journalist Hanna Liubakova, sentenced to 10 years in prison in absentia, wrote that the EU and the international community must not ease pressure until all are freed and systemic repression ends.

In July 2025, the EESC signed a memorandum of understanding with Belarusian democratic forces represented by Sviatlana Tsikhanouskaya, reaffirming its unwavering support for a democratic Belarus. The memorandum formalises a new phase of structured collaboration to support Belarusian civil society and its European aspirations.

EESC President Séamus Boland congratulated the European Parliament on its decision to award the Sakharov Prize to Mr Poczobut and Ms Amaglobeli. The EESC expressed its solidarity with the laureates and their fight for freedom and democracy.

‘I would like to emphasise with all my strength that there is no democracy without independent journalists, because freedom of the press is the pillar of democracy’, Mr Boland said in the EESC video dedicated to the Sakharov Prize. (ll)

New research from the Balkan Free Media Initiative (BFMI) raises the alarm about Europe’s preparedness to counter platform manipulation and election interference. In its latest report, Tackling TikTokcracy: A blueprint for fighting algorithmic manipulation in Europe, BFMI shows how TikTok and other platforms have been heavily exploited across recent elections in the Balkans, including through large-scale networks of fake accounts and cross-platform amplification. The report documents tactics such as hashtag hijacking and blended influencer-bot networks, revealing shared structural weaknesses that leave democratic processes vulnerable. For EESC Info, BFMI outlines concrete recommendations aimed at strengthening democracy and security ahead of future elections.

By the Balkan Free Media Initiative (BFMI)

The latest report by BFMI, developed in partnership with analytics firm Sensika, uncovered powerful networks of digital interference in Romania, Bulgaria and Kosovo which use sophisticated hybrid tactics to mimic genuine online engagement and artificially boost political messaging. These strategies are not confined to the Balkans, the report warns, and are spreading across Europe faster than institutions, platforms and citizens are currently responding.

The authors call for a rapid reimagining of Europe’s democratic defence architecture, one that makes platforms more transparent, proactively detects fake online activity across borders and builds citizens’ resilience to online influence. Without taking action, Europe risks becoming a ‘TikTokcracy’, in which algorithms – not citizens – decide its future.

 

Algorithmic influence exploits cracks in media systems

Across the Balkans, BFMI researchers found that networks of automated accounts, paid influencers and misinformed supporters were taking advantage of algorithmic incentives and regulatory gaps. Through sophisticated strategies like high-volume posting, hashtag engineering and the blending of political and entertainment content, those involved in these networks were, both knowingly and unknowingly, amplifying disinformation and manipulating public opinion.

The annulment of Romania’s presidential election in 2024 first revealed the scale of this threat, when such networks directly undermined electoral processes. Romanian intelligence exposed a large-scale operation which coordinated over 25,000 automated TikTok accounts and a network of micro-influencers to artificially boost one candidate’s content to users and drive them to the polls in his favour. Much of this coordination took place via Telegram, where locally resonant narratives and strategic hashtags were distributed en masse.

Similar dynamics were unearthed in Bulgaria, where politically charged content disseminated from bogus websites and monetised through non-transparent advertising contributed to the country’s four years of perpetual election cycles. The problem is perhaps even more worrying here, as the report uncovered a cross-platform amplification model that is economically embedded in Bulgaria’s captured media landscape, constantly adapting and ready to be activated at any time.

In Kosovo, such tactics helped to create an especially tense campaign environment in 2025 with the potential to inflame pre-existing ethnic tensions at a particularly sensitive time. Once again, the shared strategies included the ‘Fire Hose’ tactic of mass posting and commenting, synchronised engagement, targeted hashtag use and the blending of unlabelled political advertising with entertainment or sport content.

However, the historic anti-corruption protests in Bulgaria at the end of 2025, mobilised largely via TikTok, have illustrated that these platforms can be a double-edged sword for democracy, capable of both energising civic participation and undermining political stability.

A core finding of BFMI’s report is that the manipulation of platform algorithms thrives where and when media ecosystems are fragmented, non-transparent and captured by political or business interests. While these vulnerabilities are indeed prevalent in the Balkans, under-regulated platforms, poor transparency standards, fragile media infrastructure and limited cross-border cooperation are shared weaknesses across Europe. Without a concerted response spearheaded in Brussels, all Member States remain at risk of ‘TikTokcracy’.

 

Strengthening European defences against ‘TikTokcracy’

The report moves beyond diagnosis to outline a clear and immediate European policy framework which not only involves policymaking bodies but also includes efforts to mobilise society as a whole, from platforms to national institutions to European citizens. This includes: 

  • creating forensic and monitoring tools under the European Democracy Shield that feed into an EU-wide early-warning and rapid-response system;

  • aggressively enforcing existing legislation such as the Digital Services Act (DSA) and the European Media Freedom Act (EMFA), and creating additional binding platform guidelines for political content, advertising transparency and election protection;

  • providing new financing, training and technical support for credible, independent media to assist in the fight against disinformation;

  • launching digital literacy initiatives for young voters and public awareness campaigns to protect citizens from these threats and rebuild societal trust.

Together, these measures would significantly strengthen Europe’s capacity to safeguard free and fair democratic debate and ensure that political agency remains with citizens rather than with platforms.

As BFMI’s findings make clear, algorithmic manipulation is evolving faster than Europe’s current defences. Countering it requires more than incremental adjustments or stricter regulatory enforcement: it demands a harmonised response that matches the speed and scale of these threats. Europe must either modernise its democratic defences for the digital age or risk allowing algorithmic visibility and manufactured popularity to erode public trust, distort political choice and weaken democratic societies.

The Balkan Free Media Initiative (BFMI) is a Brussels-based organisation that promotes media freedom and safeguards journalists’ rights in the Balkans. BFMI focuses on promoting transparency, accountability and ethical journalism, while countering disinformation, hybrid threats and other obstacles that undermine Euro-Atlantic values. By supporting collaboration among media professionals and civil society, BFMI helps reinforce democratic resilience. Thanks to its comprehensive approach, the initiative plays a crucial role in empowering independent media, ensuring that diverse voices are heard across the Balkans.

Clever – ECIT Joint Conference
-
VMA 22

21-22 January 2026

European Parliament, Paul-Henri Spaak building (Grand hemicycle – PHSHEM), 1047 Brussels

The AI Act has set the rules for artificial intelligence in Europe. The challenge now is applying them effectively in businesses, public services and key sectors. EESC member Rudolf Kolbe, rapporteur of the opinion Apply AI Strategy – strengthening the AI continent, gives the EESC’s take on how Europe can turn its AI rules into practical results, all while keeping people and fundamental rights at the centre. 

The right to safe abortion is still not evenly protected across the EU, with some countries restricting or even criminalising access, while others leave women unable to practically exercise this fundamental right. We asked EESC member José Antonio Moreno Díaz about the European citizens’ initiative My Voice, My Choice, which has collected over one million verified signatures and has already been backed by the European Parliament. With its opinion, the EESC gives full support to the initiative and calls on the European Commission to act to make sure that all women in the EU can access abortion safely and without barriers.

Online disinformation is evolving fast, and Europe is facing growing threats from foreign interference and false information. Debunk.org is a disinformation analysis centre that monitors online false narratives, fact-checks claims and exposes coordinated manipulation efforts. Viktoras Daukšas, head of the organisation, talked to us about their work in tracking these campaigns. He explained the techniques and patterns used in modern disinformation efforts, the challenges they face in identifying manipulation across different platforms, and the steps that institutions, civil society and individuals can take to protect democratic debate in Europe.

 

Based on your recent analyses (on surging pro-Kremlin narratives on X by the Telegram news channel Belarusian Silovik, deepfake scams, and a campaign to discredit Ursula von der Leyen), what are the most prominent disinformation patterns you are currently observing in Europe? How do fakes work? Which techniques are proving most effective today, and why?

The Debunk.org team observed a prominent pattern: cross-platform content laundering, which is also a new way to circumvent EU sanctions on Kremlin media. Content from sanctioned websites is rewritten with AI and shared across newly created disinformation media outlets, for example the Pravda network, or posted on social media channels. Narratives often originate in closed or semi-closed spaces such as Telegram, where state-linked or proxy actors operate with limited oversight, and are then repackaged for platforms like X. In our investigation into Belarusian silovik-linked content, we saw a small set of inauthentic or impersonating accounts repeatedly amplifying the same source material, using AI to localise it and tailor it to different audiences to increase reach.

A second pattern is event-driven political discreditation, where formal democratic procedures are deliberately instrumentalised to launch disinformation campaigns. In the case of the no-confidence vote targeting European Commission President Ursula von der Leyen, the vote itself became a vehicle for a coordinated discreditation campaign. Although it was evident in advance that there were insufficient votes to remove the President, the process was still pursued in order to amplify pre-existing narratives of corruption and illegitimacy.

Finally, we increasingly see disinformation converging with fraud, particularly through paid advertising and deepfake-enabled scams. Here, the goal is not only political manipulation but also direct financial harm, often achieved by impersonating trusted media outlets, public figures or institutions. These campaigns benefit from platform advertising systems and are difficult for users to distinguish from legitimate content.

 

Can you walk us through your debunking process? What signals or evidence typically allow you to identify coordinated or manipulated content?

Our debunking process combines narrative analysis, behavioural signals and technical verification, often in collaboration with partner organisations.

We start by identifying the upstream source of a claim and mapping how it spreads across platforms. Coordination signals include unusually high posting frequency, synchronised amplification around key events, repeated linking to the same origin, and accounts that impersonate real people or organisations. At the content level, we look for recurring phrasing and templated storytelling.

We assess scale and velocity. When large volumes of content appear in short timeframes, this might suggest automation or coordinated behaviour. While tools and techniques vary by case, the key is triangulating multiple indicators of manipulation rather than relying on a single signal.
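The triangulation principle described above, combining several weak indicators rather than trusting any single signal, can be sketched in a few lines of Python. The signal names, the normalisation cap and the thresholds below are purely illustrative assumptions for this sketch, not Debunk.org’s actual metrics or tooling:

```python
from dataclasses import dataclass

@dataclass
class AccountActivity:
    posts_per_hour: float       # posting frequency
    shared_link_ratio: float    # share of posts linking to the same upstream source (0-1)
    template_similarity: float  # average text similarity to other suspect posts (0-1)
    burst_sync: float           # overlap of posting bursts with other accounts (0-1)

def _signals(a: AccountActivity) -> list[float]:
    # Normalise posting frequency against an illustrative cap of 20 posts/hour.
    return [min(a.posts_per_hour / 20.0, 1.0),
            a.shared_link_ratio,
            a.template_similarity,
            a.burst_sync]

def coordination_score(a: AccountActivity) -> float:
    """Average several weak indicators; no single signal decides on its own."""
    s = _signals(a)
    return sum(s) / len(s)

def looks_coordinated(a: AccountActivity, threshold: float = 0.6) -> bool:
    # Flag only when the combined score is high AND at least two individual
    # signals are strong, so one anomalous metric alone never triggers a flag.
    strong = sum(v > 0.7 for v in _signals(a))
    return coordination_score(a) >= threshold and strong >= 2
```

On this toy scale, an account posting 40 times an hour with highly templated text and synchronised bursts is flagged, while ordinary organic activity is not, which mirrors the point that high volume in a short timeframe only suggests coordination when corroborated by other indicators.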

Foreign information manipulation and interference (FIMI) analysis today is rarely a solo effort. Effective responses increasingly depend on information-sharing networks like FIMI-ISAC that connect researchers, journalists, civil society and public institutions, allowing threats to be identified and contextualised quickly.

Your analysis of disinformation targeting Ursula von der Leyen highlights broader trends in political discreditation. What does this case reveal about how EU-level institutions or figures are framed online today?

This reflects a wider FIMI strategy: eroding trust and weakening EU institutional authority so that official information, media, fact-checking or policy responses are more easily dismissed. The repetition of pre-existing fabricated narratives and accusations across Kremlin-aligned outlets, channels and influencers is an attempt to create large-scale campaigns that manipulate public opinion.

Based on your research, what skills do citizens most urgently need today to protect themselves from disinformation and deepfake content? Where do you see the biggest gaps in public digital/media literacy?

The most urgent skills today relate less to fact memorisation and more to recognising manipulation techniques.

Citizens need strong source-checking habits, including lateral reading, which involves leaving a website to check what other credible sources say, and basic verification of who is behind a claim. There is also a growing need for synthetic-media awareness: understanding that a convincing video or audio can be fabricated, and that visual realism is no longer proof of authenticity.

To help address these gaps, Debunk.org developed InfoShield, a free 45-minute online course that has already been completed by more than 5 000 citizens. The course focuses on practical, everyday skills for recognising manipulation, emotional framing and deceptive content in digital environments.

At the professional level, there is also a growing need for trained specialists who can systematically identify and respond to foreign information manipulation and interference. For this purpose, we offer FIMI101, a professional e-learning course designed to certify analysts working in this field. Participation in these professional courses directly supports Debunk.org’s continued research, monitoring and public-interest activities.

Viktoras Daukšas has been at the helm of the independent technology think tank and NGO Debunk.org for eight years. Debunk.org carries out analyses of foreign information manipulation and interference (FIMI) and coordinated inauthentic behaviour (CIB). Together with its partners, it researches disinformation in 22 countries by combining expert knowledge with AI-driven technologies. Debunk.org also runs educational media literacy campaigns to teach people how to spot fakes online and shield themselves from disinformation.

 

Published in
Study
263 pages

The study assesses how a prospective EU Just Transition Directive (JTD) could shape the social and employment outcomes of the European Green Deal. Drawing on a literature review, stakeholder interviews, foresight-based PESTEL scenario building and partial-equilibrium projections, it develops business-as-usual, weak-JTD and strong-JTD scenarios to 2045. The scenarios focus on seven Member States representing diverse welfare and production regimes.

Bilateral road transport agreement between Austria and Switzerland

Document Type
PAC