
Deepfakes were scoffed at last year; FBI issues a warning this year



Last spring, NATO gave itself a pep talk about the threat posed by deepfakes. They are no big deal, private-sector experts said in a panel discussion.

A year later, the FBI’s cyber division is warning that “malicious actors almost certainly will leverage synthetic content for cyber and foreign influence operations.”

In fact, the FBI says a serious deepfake misinformation campaign, carried out by another nation, an independent propaganda cell or a home-grown criminal operation, could occur within the next 12 to 18 months.

Deepfakes are AI applications capable of creating lifelike still and video images of people who do not exist. They also can use biometric data to create believable images and footage of real people in fictional scenarios.

FBI agents called out unnamed Russian, Chinese and Chinese-speaking actors for creating synthetic social media profile images as part of influence campaigns in the United States. Fictitious journalists are being created in this way, for instance, according to media reports cited by the bureau.

What has changed over the past year that two allied security organizations could assess the threat so differently? Not much, really.

Deepfake algorithms have improved rapidly, but not surprisingly so to AI researchers and programmers.

It is more likely that the technology is being taken more seriously in global and national security circles.

The panel, assembled by NATO’s Strategic Communications Centre of Excellence, found little cause for concern: deepfake attacks had not yet occurred, people would not be fooled, and methods of determining what is fake would keep pace with nefarious influencers.

And, indeed, fake-spotting tactics continue to dribble out.

One of the more recent efforts focuses on the eyes of digitally represented people. Light reflections in the two eyes of a photographed real person largely mirror each other. That is not the case in the eyes of manufactured faces.

That could be because every pixel of a deepfake is an amalgamation of untold data-set images. It could also be a result of programming that avoids unnatural symmetry (which often produces misshapen ears and mismatched earrings).

Either way, the resulting images can be analyzed for similarity by an AI system to detect the fake; the researchers achieved a 94 percent deepfake-detection success rate with portrait-style photos. They also acknowledge a limitation of the method: it depends on a light source being reflected in both eyes.
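The article does not spell out how the similarity check works, but the idea it describes, comparing the specular highlights in the two eyes, can be sketched along these lines. The function names, thresholds and the intersection-over-union score are illustrative assumptions, not the researchers' actual pipeline:

```python
import numpy as np

def highlight_mask(eye_patch, thresh=0.9):
    """Binary mask of bright specular highlights in a grayscale
    eye crop with pixel values in [0, 1]. The 0.9 cutoff is an
    assumed value for illustration."""
    return eye_patch >= thresh

def reflection_iou(left_eye, right_eye, thresh=0.9):
    """Intersection-over-union of the two eyes' highlight masks.
    The right eye is mirrored first so that reflections of a shared
    light source line up. Real portraits tend to score high;
    synthesized faces tend to score low."""
    a = highlight_mask(left_eye, thresh)
    b = highlight_mask(np.fliplr(right_eye), thresh)
    inter = np.logical_and(a, b).sum()
    union = np.logical_or(a, b).sum()
    return inter / union if union else 0.0

def looks_synthetic(left_eye, right_eye, iou_cutoff=0.5):
    """Flag a face as possibly fake when the reflections disagree.
    The 0.5 cutoff is hypothetical, not from the study."""
    return reflection_iou(left_eye, right_eye) < iou_cutoff
```

As the article notes, a score like this is only meaningful for portrait-style shots where one light source is reflected in both corneas; side lighting or closed eyes would break the mirroring assumption.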



