Council of Europe warns against biometrics-based emotion recognition
Face biometrics should not be used for emotion recognition-based evaluations of employees, the Council of Europe has warned, according to the Financial Times. The organization also says private companies must obtain specific consent from people before using their facial recognition data.
The Council of Europe is a regional rights watchdog, and is made up of 47 countries including EU members, Turkey, Russia and the UK.
The plans represent the continent’s “most extensive proposals” yet, the Times writes, but the paper also notes that the document, drafted by an expert committee within the Council, is non-binding.
“Forty years ago, the Council of Europe introduced the first binding international legal standards for data protection,” states Council Secretary-General Marija Pejčinović Burić. “Today we are tasked with ensuring that facial recognition technology also respects the rights to which we are all entitled by law.”
Applications where the use of face biometrics could lead to discrimination include staff analysis, insurance access, education and policing, and each should be prohibited, according to the Council. The group also wants biometrics-based estimates of gender, age, health and other characteristics to be banned, barring the enactment of legal safeguards that would prevent discrimination.
Emotion recognition has long been identified by researchers as an unreliable technology.
The European Commission, meanwhile, is working on its own legislative proposal for artificial intelligence regulation, and is being urged to enact tough measures.
An amendment recently proposed within the EC would ban all biometrics use by law enforcement.
Marketing of emotion analysis tech ramps up in China
In China, marketing of biometrics-based emotion analysis technology is ramping up, according to a report from UK-based human rights group Article 19 analyzed by Reuters. The increase in the market’s supply-side comes despite persistent concerns about how accurate the systems are, and the implications they have on human rights.
A senior program officer at Article 19 described emotion recognition systems as having “racist foundations and fundamental incompatibility with human rights.”
Systems that purport to indicate criminality based on biometric analysis are particularly flawed and prone to bias, researchers say.
The concerns are also spilling over into applications for monitoring students and drivers, such as those attempting to detect fatigue or distraction. Herta launched an emotion recognition solution based on face biometrics this month.
Lenovo, the world’s largest PC-maker, offers smart education solutions with speech and gesture recognition features and face-based emotion recognition. Article 19 says Lenovo has provided educational technology to a dozen provinces in China, though it is not clear that all of the systems have been deployed.
Article 19 counts 30 companies selling emotion recognition technology.
accuracy | AI | biometrics | China | data protection | emotion recognition | EU | facial recognition | gender recognition | regulation