
Deepfake detection in cochlear implants could reduce fraud risk for hearing impaired

Study finds CI users have a harder time recognizing synthetic voices

It is increasingly difficult for the human ear to recognize audio deepfakes. For the hearing-impaired who rely on cochlear implants (CI), it’s even harder – unless the implants have built-in deepfake detection.

“Deaf and hard-of-hearing populations, especially cochlear implant users, perceive audio in very different ways from hearing persons,” says Kevin Butler, a professor at the University of Florida’s Department of Computer and Information Science and Engineering (CISE), and director of the Florida Institute for Cybersecurity Research (FICS).

CIs assist hearing by converting sound into electrical signals that stimulate the auditory nerve. They prioritize speech-relevant frequencies and compress sound, which reduces nuances in aspects like pitch – meaning their users could be more vulnerable to deepfake voice fraud.
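The "CI simulated audio" referenced in the study's title is typically produced with a channel vocoder, which keeps per-band loudness envelopes but discards the fine structure that carries pitch. As a rough illustration only – the channel count, band edges and envelope cutoff below are assumptions, not the parameters used in the paper – a minimal noise-vocoder sketch in Python looks like this:

# Illustrative noise-vocoder CI simulation (not the authors' exact pipeline).
# Assumes a mono waveform and a sample rate of at least 16 kHz.
import numpy as np
from scipy.signal import butter, sosfiltfilt

def ci_simulate(audio, fs, n_channels=8, f_lo=100.0, f_hi=7000.0):
    """Approximate CI hearing: per-channel envelopes imposed on noise carriers."""
    # Log-spaced band edges across the speech-relevant range
    edges = np.geomspace(f_lo, f_hi, n_channels + 1)
    out = np.zeros(len(audio), dtype=float)
    for lo, hi in zip(edges[:-1], edges[1:]):
        # Band-limit the input to this analysis channel
        band_sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
        band = sosfiltfilt(band_sos, audio)
        # Envelope extraction: rectify, low-pass; pitch fine structure is discarded here
        env_sos = butter(2, 160.0, btype="lowpass", fs=fs, output="sos")
        envelope = np.maximum(sosfiltfilt(env_sos, np.abs(band)), 0.0)
        # Replace the carrier with band-limited noise modulated by that envelope
        noise = sosfiltfilt(band_sos, np.random.randn(len(audio)))
        out += envelope * noise
    # Match the original RMS level
    return out * (np.sqrt(np.mean(audio**2)) / (np.sqrt(np.mean(out**2)) + 1e-12))

Because only coarse envelopes survive this kind of processing, pitch and timbre cues that often betray a synthetic voice are largely unavailable to the listener.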

The issue prompted a CISE student, Magdalena Pasternak, to research whether cochlear implants presented an increased risk of AI deepfake fraud for their users. The short answer, recently published in the paper “Characterizing the Impact of Audio Deepfakes in the Presence of Cochlear Implant Simulated Audio”, is yes.

A release from UF says “the study indicates the need for enhanced deepfake detection systems in implants for the hearing impaired.” Participants without CIs were able to identify deepfakes with 78 percent accuracy, while CI users achieved only 67 percent accuracy. CI users were twice as likely to misclassify deepfakes as real speech.

According to Butler, the CISE faculty lead on Pasternak’s study, the solution could be to build certain alert mechanisms directly into assistive devices.

Pasternak researches security in large language models (LLMs), deepfake detection and machine learning, and focuses on how the security community addresses the specific cybersecurity needs of vulnerable populations. She says “deepfake technology has advanced so rapidly in such a short span of time that many people are simply unaware of how sophisticated modern deepfakes have become. While education and awareness programs can play a huge role in protecting users, we must also develop technological solutions that accommodate diverse auditory processing abilities. By doing so, we can ensure that all users, especially those with auditory implants, have the necessary defenses against ever-evolving threats.”
