Deepfakes: Some progress in video detection, but it’s back to the basics for faked audio


A pair of developments are being reported in efforts to thwart deepfake video and audio scams. Unfortunately, in the case of digitally mimicked voice attacks, the advice is old school.

An open-access paper published by SPIE, an international professional association for optics and photonics, reports on a new algorithm that reportedly scored a precision rate of 99.62 percent in detecting deepfake video and was accurate 98.21 percent of the time.
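
Precision and accuracy are different measures, which is why the paper quotes two figures. As a minimal sketch of the distinction, the snippet below computes both from a confusion matrix; the counts used are hypothetical, since the paper's raw numbers are not given here.

```python
# Minimal sketch: how precision and accuracy differ.
# The confusion-matrix counts below are made up for illustration only.

def precision(tp, fp):
    """Of all clips flagged as fake, what share really were fake?"""
    return tp / (tp + fp)

def accuracy(tp, tn, fp, fn):
    """Of all clips, what share were labeled correctly (fake or real)?"""
    return (tp + tn) / (tp + tn + fp + fn)

tp, tn, fp, fn = 980, 985, 4, 31   # hypothetical counts
print(f"precision: {precision(tp, fp):.4f}")         # high: few false alarms
print(f"accuracy:  {accuracy(tp, tn, fp, fn):.4f}")  # a bit lower: some fakes slip through
```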

It has been three years since the threat of deepfakes broke big in the global media, and during that time, the fakes have quickly grown more sophisticated.

Fear about misuse (beyond simply grafting the faces of celebrities on those of porn actors) has sometimes been breathless, with some observers warning that a key military figure in a nuclear-armed nation could appear to issue emergency orders to launch missiles.

The paper’s authors, two from Thapar Institute of Engineering and Technology and the third from Indraprastha Institute of Information Technology (both in India), claim a research milestone.

They say they are the first to make publicly available a database of deepfakes, manipulated with generative adversarial networks, that features famous politicians. The database consists of 100 source videos and 100 destination videos.

What is more, they claim to be the first with an algorithm that can spot deepfakes of politicians within two seconds of the start of a clip. The team says it used temporal sequential frames taken from the clips to pull off the feat.
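
The paper's exact architecture is not detailed in this article, but the "temporal sequential frames" description suggests a model that looks at short runs of frames rather than single images. Below is a minimal, hypothetical sketch of that general approach in PyTorch, using a CNN backbone to embed each frame and an LSTM to classify the sequence; the layer choices and sizes are assumptions, not the authors' design.

```python
# Illustrative only: one common way to classify a short clip (roughly two
# seconds of frames) as real or fake from a temporal sequence of frames.
# Layer choices and sizes are assumptions, not the paper's architecture.
import torch
import torch.nn as nn
from torchvision import models

class FrameSequenceClassifier(nn.Module):
    def __init__(self, hidden_size=128):
        super().__init__()
        backbone = models.resnet18(weights=None)   # per-frame feature extractor
        feat_dim = backbone.fc.in_features         # 512 for resnet18
        backbone.fc = nn.Identity()                # keep features, drop classifier
        self.backbone = backbone
        self.lstm = nn.LSTM(feat_dim, hidden_size, batch_first=True)
        self.head = nn.Linear(hidden_size, 2)      # logits: real vs. fake

    def forward(self, clips):
        # clips: (batch, frames, 3, height, width)
        b, t, c, h, w = clips.shape
        feats = self.backbone(clips.view(b * t, c, h, w)).view(b, t, -1)
        _, (h_n, _) = self.lstm(feats)             # last hidden state summarizes the clip
        return self.head(h_n[-1])

if __name__ == "__main__":
    model = FrameSequenceClassifier()
    dummy = torch.randn(2, 16, 3, 224, 224)        # two clips of 16 frames each
    print(model(dummy).shape)                      # torch.Size([2, 2])
```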

Biometrics providers ID R&D and NtechLab finished among the leaders in a recent video Deepfake Detection Challenge.

Voice fraud detection efforts continue apace, too.

Until the pandemic, when people from all walks of life began routinely participating in video calls, deepfake audio attacks looked like the more menacing threat over the medium term.

Comparing the two threats, it just seemed more likely that a convincing faked call could rattle a key mid-level staff member into helping the boss out in an emergency. The odds have evened a bit.

A white paper published by cybersecurity firm Nisos sketches five incidents involving deepfake audio attacks.

Nisos writes in the marketing document that it actually investigated one such attack — including the original synthetic audio. It was the faked voice of a company’s CEO “asking an employee to call back to ‘finalize an urgent business deal.’ ”

Wisely, the employee immediately called the legal department. The number the would-be victim had been asked to call turned out to be a burner on a VoIP service.

Nisos engineers studied the recording with Spectrum3d, a spectrogram tool. That analysis, along with simply listening to the message and comparing it to a known human voice, yielded some data but, apparently, no smoking gun.
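
For readers unfamiliar with this kind of analysis: a spectrogram plots a recording's frequency content over time, and comparing a suspect clip against a known genuine sample of the same voice is the basic workflow. The sketch below approximates that idea in Python with librosa and matplotlib rather than Spectrum3d, and the file names are hypothetical.

```python
# Rough stand-in for a spectrogram comparison (Nisos used Spectrum3d; this
# sketch uses librosa + matplotlib instead). File names are hypothetical.
import librosa
import librosa.display
import matplotlib.pyplot as plt
import numpy as np

def plot_spectrogram(path, ax, title):
    y, sr = librosa.load(path, sr=None, mono=True)
    spec_db = librosa.amplitude_to_db(np.abs(librosa.stft(y)), ref=np.max)
    librosa.display.specshow(spec_db, sr=sr, x_axis="time", y_axis="log", ax=ax)
    ax.set_title(title)

fig, axes = plt.subplots(2, 1, figsize=(10, 6), sharex=True)
plot_spectrogram("suspect_voicemail.wav", axes[0], "Suspect recording")
plot_spectrogram("known_genuine_voice.wav", axes[1], "Known genuine sample")
plt.tight_layout()
plt.show()   # look for missing high-frequency detail, flat pitch, abrupt seams
```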

Ultimately, the best advice that Nisos or anyone else in the industry can offer is to stress common sense: if something about a call smells fishy, call legal.
