Is the EU AI Act leaving a backdoor for emotion recognition?
As the European Union prepares to adopt the Artificial Intelligence Act, the landmark legislation is still attracting criticism, this time for allowing the use of emotion recognition. An opinion piece published in the EU Observer argues that the AI Act still leaves the door open for the technology's use by law enforcement and migration authorities, which could lead to rights abuses.
The anonymous writer, who describes themselves as an EU civil servant, claims that this leaves space for its use in law enforcement and migration control, pointing in particular to the controversial EU-funded iBorderCtrl project, an AI lie detector built on facial and emotion recognition technologies and designed for use during migrant interrogations. In 2018, the bloc announced it would fund the project and conduct pilots in Hungary, Greece and Latvia.
iBorderCtrl has been a target of heavy scrutiny among rights groups and lawmakers. Despite efforts to shed light on its use, the Court of Justice of the European Union (CJEU) ruled in September 2023 to deny access to documentation related to the project, citing the protection of commercial interests.
Facial emotion recognition is the process of identifying human sentiment using face biometrics, a relatively new field of AI that has been met with mixed results. The AI Act itself highlights the technology’s shortcomings and its potential for “discriminatory outcomes” and intrusiveness to rights and freedoms.
“There are serious concerns about the scientific basis of AI systems aiming to identify or infer emotions, particularly as expression of emotions vary considerably across cultures and situations, and even within a single individual,” the document states.
Despite these concerns, emotion recognition has not been included in Article 5 of the AI Act, which defines Prohibited Artificial Intelligence Practices. Instead, it is classified as a high-risk AI system, and its use is prohibited in the workplace and in educational institutions, with exceptions for safety and medical reasons. Deployers of emotion recognition systems are not required to inform people about the operation of such systems if the system is used to "detect, prevent and investigate criminal offenses," according to the legislation.
The language of the AI Act adds to the confusion over the legality of emotion recognition: the law states that classifying an AI system as high-risk "should not be interpreted as indicating that the use of the system is lawful under other acts of Union law or under national law compatible with Union law."
At the beginning of February, European lawmakers reached an agreement on the technical details of the AI Act, opening the path to approval by European Parliament committees in April. Although significant opposition is not expected, lawmakers may still introduce changes that could slow the timeline of its implementation.