YouTube offers its biometric deepfake detection tool to celebrities


After content creators, politicians and journalists, YouTube is now giving celebrities access to its likeness detection tool, allowing them to remove deepfakes and stop unauthorized impersonation on the platform.

YouTube’s biometric likeness technology scans for AI-generated videos that match a verified user’s appearance. The feature functions similarly to Content ID, a tool that helps detect and remove copyrighted material on the platform, according to a company blog post.

Users can verify their identity by submitting a government ID and a self-recorded video for face biometric matching. Those enrolled in the program receive alerts when potential matches surface and can request removal if the content violates YouTube’s privacy policy.

The feature was first offered to creators in the YouTube Partner Program last year and expanded in March to government officials, journalists and political candidates. Extending it to other famous people is the next logical step.

YouTube says it has collaborated with talent agencies and management companies, such as CAA, UTA, WME, and Untitled Management, to provide the tech to entertainers.

While some celebrities have recoiled at the thought of seeing deepfakes of themselves online, others see the technology as a money-making tool. Talent agency CAA, for instance, has built a database with AI developer Veritone that stores its clients’ digital likenesses and voices, giving them control and compensation in cases of AI use.

YouTube is not the only company that is introducing measures to protect famous people from unauthorized deepfakes, as AI-generated videos fuel scams, political misinformation and reputational manipulation.

Last year, OpenAI committed to “strengthen guardrails around replication of voice and likeness when individuals do not opt-in,” following Breaking Bad star Bryan Cranston’s decision to reach out to the actors’ union SAG-AFTRA over unauthorized AI-generated versions of his likeness.

Meanwhile, both OpenAI and YouTube have voiced support for the proposed NO FAKES Act, a U.S. federal law designed to protect individuals’ voices and visual likenesses from unauthorized, AI-generated digital replicas.
