FTC chief warns deepfake takedown law requires urgent infrastructure overhaul

On Thursday, Federal Trade Commission (FTC) Chairman Andrew N. Ferguson testified before the House Appropriations Subcommittee on Financial Services and General Government regarding the agency’s preparations to implement and enforce the newly enacted Take It Down Act.
Officially known as the Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act, the bill was introduced by Texas Senator Ted Cruz and passed both chambers of Congress with overwhelming bipartisan support.
The act represents one of the most significant bipartisan legislative responses to the growing threat of AI-generated non-consensual intimate imagery. It also substantially expands the FTC’s enforcement authority over AI-generated deepfakes.
“Passage of the Take It Down Act is a historic win in the fight to protect victims of revenge porn and deepfake abuse,” Cruz said. “This victory belongs first and foremost to the heroic survivors who shared their stories and the advocates who never gave up. By requiring social media companies to take down this abusive content quickly, we are sparing victims from repeated trauma and holding predators accountable.”
At its core, the law requires websites and online platforms to remove non-consensual intimate imagery, including AI-generated deepfakes, within 48 hours of receiving a complaint from the depicted individual. Platforms that fail to comply face regulatory enforcement, civil penalties, and potential liability, with the FTC empowered to compel rapid removal.
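The statute sets the deadline but says nothing about implementation, which is left to the platforms. As a purely illustrative sketch (all class and field names here are hypothetical, not drawn from the act), a platform’s compliance tooling might track each request against the 48-hour clock along these lines:

    from dataclasses import dataclass
    from datetime import datetime, timedelta, timezone

    TAKEDOWN_DEADLINE = timedelta(hours=48)  # removal window set by the act

    @dataclass
    class TakedownRequest:
        content_url: str
        received_at: datetime                # when the complaint arrived
        removed_at: datetime | None = None   # set once the content comes down

        def deadline(self) -> datetime:
            # The 48-hour clock starts when the complaint is received.
            return self.received_at + TAKEDOWN_DEADLINE

        def is_overdue(self, now: datetime | None = None) -> bool:
            # Overdue means the content is still up past the deadline,
            # which is what would expose the platform to enforcement.
            now = now or datetime.now(timezone.utc)
            return self.removed_at is None and now > self.deadline()

Under this sketch, a complaint received Monday at 09:00 UTC must be resolved by Wednesday at 09:00 UTC to stay inside the window.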
One of the unique features of the act is that it responds directly to the technological evolution of digital manipulation. Whereas prior laws have struggled to define or prosecute synthetic media due to gaps in existing statutory definitions, the act explicitly covers AI-generated or digitally altered intimate imagery, even when no original photo or video existed. In doing so, it closes a critical loophole, acknowledging that the psychological and reputational harm inflicted by such content is not diminished by its inauthenticity.
The act grants primary enforcement authority to the FTC, which is now tasked with investigating violations, ensuring compliance, and overseeing the removal of harmful material across a vast and decentralized Internet ecosystem. However, enforcing the law presents new operational challenges.
Ferguson said the FTC will need additional resources to enforce the act effectively, highlighting the need for specialized, secure, and segregated software systems that keep the explicit materials handled during investigations separate from other agency data. He also noted that the potential inclusion of AI-generated child sexual abuse material in these investigations underscores the importance of strict data isolation protocols.
“I’m going to need some kind of segregated technology system to house this material when we’re conducting investigations,” Ferguson told lawmakers, adding that “we’re not going to want that intermingled with the other bread-and-butter data that the commission brings on to conduct investigations.”
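Ferguson described a requirement, not a design, but the pattern he sketched is familiar from evidence-handling systems: sensitive material goes into a dedicated store that shares no backend with ordinary case data. A minimal, hypothetical illustration (the classifications and class names are invented for this example):

    SENSITIVE = "sensitive"   # explicit imagery, held in isolation
    ORDINARY = "ordinary"     # the "bread-and-butter" investigative data

    class IsolationError(RuntimeError):
        """Raised when data is routed to an unknown classification."""

    class CaseDataRouter:
        # Each classification maps to exactly one separate backend,
        # so sensitive material never lands beside ordinary records.
        def __init__(self) -> None:
            self._stores: dict[str, dict[str, bytes]] = {
                SENSITIVE: {},
                ORDINARY: {},
            }

        def store(self, classification: str, case_id: str, payload: bytes) -> None:
            if classification not in self._stores:
                raise IsolationError(f"unknown classification: {classification}")
            self._stores[classification][case_id] = payload

In a real deployment the two backends would be physically separate environments with distinct access controls, not two dictionaries in one process; the sketch only shows the routing discipline Ferguson was alluding to.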
To address these new responsibilities, the FTC is prioritizing investments in infrastructure capable of managing large data environments and in advanced analytics capabilities. Personnel will also need training to evaluate and process complaints, and the agency will have to expand infrastructure capable of tracking non-compliant digital platforms.
Ferguson’s testimony aligned with concerns raised by subcommittee Chair Rep. Dave Joyce, who acknowledged that the FTC’s flat budget of $425.7 million for Fiscal Year 2025 isn’t enough to cover anticipated rising costs for IT, infrastructure, and expert witnesses. Both officials agreed that enforcing the Take It Down Act will require additional staffing and resources to address the challenges posed by AI-generated non-consensual content.
Ferguson made clear that while the FTC has broad experience in consumer protection, the sheer scope and sensitivity of deepfake takedown enforcement require dedicated investments and an agile, tech-savvy response framework.
Despite the loss of 1,221 employees to firings and retirements this year, Ferguson assured the subcommittee that the FTC can still fulfill its mission effectively, although he conceded that the agency will need to hire specialized staff such as prosecutors and investigators.
The law applies to publicly accessible websites, social media platforms, file-sharing services, and any digital service that hosts or transmits user-generated content. However, it contains limited exemptions for platforms that operate under good-faith moderation policies or that robustly comply with takedown protocols, provided they act promptly on valid requests.
It also sets conditions for verifying takedown claims, requiring requesters to affirm under penalty of perjury that they are the depicted individual or that individual’s legal representative.
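Before the 48-hour clock carries any weight, a platform’s intake process would have to capture that sworn attestation. A hypothetical validation step (field names invented for illustration) might look like:

    from dataclasses import dataclass

    @dataclass
    class RemovalClaim:
        content_url: str
        requester_name: str
        is_depicted_individual: bool   # requester is the person shown
        is_legal_representative: bool  # or acts on that person's behalf
        perjury_attestation: bool      # sworn statement accompanies the claim

    def is_valid_claim(claim: RemovalClaim) -> bool:
        # The act requires the requester to affirm, under penalty of perjury,
        # that they are the depicted individual or a legal representative.
        has_standing = claim.is_depicted_individual or claim.is_legal_representative
        return has_standing and claim.perjury_attestation

Only claims passing a check of this kind would start the removal deadline; anything else could be rejected without triggering the clock.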
Another important dimension of the law involves AI-generated child sexual abuse material (CSAM), which is increasingly being detected in synthetic formats. The act strengthens the FTC’s investigative authority over complaints involving AI-manipulated CSAM, overlapping with existing federal child exploitation statutes while providing additional pathways for expedited removal and investigation. The law does not replace child protection laws but supplements them by focusing on the mechanisms of distribution and technological dissemination.
While the enforcement mechanisms rest with the FTC, the law also anticipates eventual collaboration with state Attorneys General, who are authorized to bring civil actions on behalf of affected residents. The act’s drafters included provisions to avoid First Amendment challenges by clearly targeting non-consensual conduct and not criminalizing artistic expression or parody that does not include involuntary nudity or sexual imagery of real individuals.
Ferguson’s testimony underscored the FTC’s commitment to protecting individuals from the harms of non-consensual deepfake content and highlighted the agency’s proactive steps to equip itself with the necessary tools and resources to enforce the law effectively.