Artificial Intelligence: EU law should set safe boundaries for high-risk applications

Biometric recognition for tracking, surveillance and detecting emotions should have no place in Europe's human-centric Artificial Intelligence (AI), says the EESC in its response to the European Commission's White Paper on AI, adopted by the EESC plenary in July.

The European Commission has proposed that an AI application should be considered high-risk only if it involves both a high-risk sector (healthcare, transport, energy and parts of the public sector) and a high-risk use, with a few exceptions still to be defined. Only when both conditions are met would an application count as high-risk AI and fall under specific regulations and governance structures.

The EESC believes that this definition risks creating dangerous loopholes.

"Take Facebook's political advertising", argues opinion rapporteur Catelijne Muller. "Advertising is a low-risk sector and Facebook's news aggregation function can be regarded as a low-risk use. However, we have seen during election campaigns that the spread across Facebook of fake news and deepfakes generated with the help of AI can have many negative effects and influence how people vote, with interference even from outside Europe."

The EESC believes it would be better to draw up a list of common characteristics that would make an AI application high-risk, regardless of the sector in which it is used.
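To make the contrast between the two approaches concrete, the sketch below compares the Commission's two-condition test with a characteristics-based list, using the Facebook political-advertising example from the opinion. It is purely illustrative: the sector and characteristic lists, the function names and the classification logic are hypothetical assumptions for this sketch, not definitions taken from the White Paper or the EESC opinion.

```python
# Illustrative sketch only: neither the White Paper nor the EESC opinion defines
# code; the lists, names and logic below are hypothetical.

HIGH_RISK_SECTORS = {"healthcare", "transport", "energy", "public sector"}

def commission_test(sector: str, high_risk_use: bool) -> bool:
    """Commission proposal: high-risk only if BOTH the sector and the use are high-risk."""
    return sector in HIGH_RISK_SECTORS and high_risk_use

# Hypothetical cross-sector characteristics along the lines the EESC suggests.
HIGH_RISK_CHARACTERISTICS = {"biometric recognition", "emotion detection",
                             "voter influencing", "behaviour tracking"}

def characteristics_test(characteristics: set) -> bool:
    """EESC suggestion: high-risk whenever any listed characteristic is present,
    regardless of the sector."""
    return bool(characteristics & HIGH_RISK_CHARACTERISTICS)

# The Facebook political-advertising example from the opinion:
sector, use_is_high_risk = "advertising", False   # low-risk sector, low-risk use
characteristics = {"voter influencing"}           # but a high-risk characteristic

print(commission_test(sector, use_is_high_risk))  # False -> slips through the loophole
print(characteristics_test(characteristics))      # True  -> caught by the common list
```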

AI-driven biometric recognition for surveillance, or to track, assess or categorise human behaviour or emotions, should also be banned, the EESC insists. This is all the more important since there is no scientific evidence that a person's emotions can be discerned from their biometric data, Ms Muller stresses.

Additionally, the EESC warns against an uncontrolled surge in tracking and tracing technology finding its way into our society in a bid to fight the coronavirus outbreak.

"AI techniques and approaches to fight the pandemic should be just as robust, effective, transparent and explainable as any other AI technique in any other situation," says Ms Muller. "They should uphold human rights, ethical principles and legislation. They should also be voluntary, because whether we like it or not, many techniques introduced during the crisis will become permanent". (dm)