By Tatjana Babrauskienė

In today’s digital era, AI algorithms are the unseen gatekeepers of information, wielding immense power over what we see and what we don't. Designed to maximise engagement, these algorithms often amplify sensational and divisive content, a trend that disproportionately affects Eastern European voices. The European Economic and Social Committee (EESC), in its report, has starkly revealed how this digital ecosystem suppresses independent journalism while enabling foreign disinformation to flourish.

This algorithmic bias creates a damaging feedback loop. Platforms, driven by their relentless pursuit of likes and shares, drown out nuanced, fact-based reporting. The result is a distorted reality in which stereotypes are reinforced and 'filter bubbles' hinder public discourse. This is acutely felt in Central and Eastern Europe (CEE), where foreign influence operations - notably from Russia and China - cleverly exploit local languages and cultural contexts to sow discord.

The case of Belarus is a powerful example of this dynamic. The oppressive regime has used digital suppression and algorithmic bias to systematically marginalise independent media. Social media platforms often default to Russian-language content, silencing independent Belarusian voices and reinforcing state propaganda. This problem is further compounded by platforms like Telegram and VKontakte, which operate outside EU regulatory frameworks, creating dangerous blind spots in content moderation.

To reclaim the digital narrative, a multi-pronged strategy is essential. It must begin with algorithmic transparency: platforms should publicly disclose how content is prioritised, so they can be held accountable for giving space to diverse voices. This is a key principle of the Digital Services Act (DSA). A standardised framework for algorithmic audits is equally crucial. The EESC is pushing for a dedicated European Oversight Authority for Digital Information - an autonomous body that would enforce the DSA and the AI Act, specifically targeting systemic biases that threaten media pluralism and fundamental rights.

The EU must also financially support independent CEE media. A dedicated fund could provide journalists with the resources needed for investigative reporting and media literacy programmes, empowering them to counter disinformation effectively. Combating foreign information manipulation is another vital front: the EU must impose stricter penalties on platforms that fail to address manipulative content and require them to proactively detect foreign interference, supported by partnerships with cybersecurity firms and data access for researchers.

Finally, language sensitivity is paramount. Platforms must employ moderators with deep regional and socio-political knowledge to ensure fair treatment of local languages and dialects. The EESC emphasises that it is time to move beyond reductive labels like 'post-Soviet' and embrace narrative frameworks that honour the unique histories and identities of Eastern European nations.

In conclusion, the dual nature of AI algorithms presents both a challenge and an opportunity. By prioritising transparency and accountability, platforms can become allies in the fight for a balanced digital space. At the same time, the EU must actively champion independent media and the rich narratives of Eastern European nations. Only through these concerted efforts can we build a resilient and inclusive digital landscape that reflects the complexities and realities of the region.