European Economic and Social Committee
AI 'made in Europe' – possible but needs work
By Sandra Parthie
The AI Act is the world's first comprehensive legal framework regulating artificial intelligence.
The use of AI is expanding and affects many aspects of our daily lives. For instance, it influences the information people see online through targeted advertisements. But more importantly, it is now used in the health sector to help diagnose and treat diseases such as cancer. To do so, AI applications rely on general-purpose AI (GPAI) models, which need to be trained. They need to be fed many images of, for example, cancerous cells to eventually recognise them independently.
Successful training relies on data – enormous amounts of data. The way the training is done influences the quality of the outcome of the trained model or AI application. If it is fed the wrong data or images, it will misidentify healthy cells as cancerous ones.
Improving medical and health care is a compelling example of why the EU needs the capacity and infrastructure to develop the underlying general-purpose AI models. It will, quite simply, help save lives.
Beyond that, GPAI is a game changer in production processes and also for businesses. For Europe’s economy to remain competitive, we need to provide the space for innovation within the EU, and encourage entrepreneurs and start-ups to develop their ideas.
Of course, there are risks connected with AI and GPAI – ranging from flaws in the models and bugs in the applications to the outright criminal use of the technology. The EU must therefore have the expertise to counter malicious attacks and cyberthreats, and it must be able to rely on EU-based infrastructure to ensure that, to put it simply, "the lights stay on".
All of the above shows the importance of having the right regulation – one focusing on the quality of the training data, the training methods and, ultimately, the final product. It needs to be based on European values, such as transparency, sustainability, data protection and respect for the rule of law. Unfortunately, many of the major GPAI developments are being spearheaded by actors outside the EU's jurisdiction. The EU must therefore develop the capacity to enforce compliance with its regulatory provisions and European values vis-à-vis both EU and non-EU actors active in our market.
The EU must reduce the market dominance of large, often non-European, digital companies, including by mobilising the tools of competition policy. Competition authorities in the EU need to use their powers to the full and ensure that hyperscalers do not abuse their B2B or B2G market position.
Public authorities can support European providers of GPAI and AI applications by procuring their products, thereby demonstrating their trustworthiness to other users and clients. The EU does have the talent, the technological know-how and the entrepreneurial spirit needed for "AI made in Europe". But a lack of investment, a lack of the relevant IT infrastructure, and the continuing fragmentation of the internal market, which hinders scaling up, all impede the competitiveness of Europe's AI actors.