President Trump Endorses First-Ever Federal Legislation Against Deepfake Technology Abuse
On Monday, President Donald Trump signed the bipartisan Take It Down Act into law, a landmark measure that criminalizes the non-consensual online sharing of intimate images, whether real or AI-generated.
The law also requires major tech platforms to remove such content within 48 hours of a valid takedown request.
In a rare and symbolic gesture, First Lady Melania Trump joined the signing ceremony, adding her signature in support.
The legislation marks a defining moment in her recent push to combat online exploitation and protect children in the digital age.
🇺🇸 BREAKING: TRUMP SIGNS TAKE IT DOWN ACT — FIRST FEDERAL LAW TARGETING DEEPFAKE ABUSE
“Today, it’s my honor to officially sign the Take It Down Act into law.
This would be the first-ever federal law to combat the distribution of explicit, imaginary [images] posted without… https://t.co/5w7IwuyH8h pic.twitter.com/u1MIqOJ0Cu
— Mario Nawfal (@MarioNawfal) May 19, 2025
Melania Trump said:
"This legislation is a powerful step forward in our efforts to ensure that every American, especially our young people, can feel better protected from their image or identity being abused through non-consensual intimate imagery or NCII."
She added:
"AI and social media are the digital candy of the next generation—sweet, addictive, and engineered to have an impact on the cognitive development of our children. But unlike sugar, these new technologies can be weaponized, shape beliefs, and sadly, affect emotions and even be deadly."
🚨 #BREAKING: President Trump has just signed Melania Trump's TAKE IT DOWN Act into law
He even had Melania sign it too.
The bill requires big tech platforms to delete revenge p*rn within 48 hours, and requires JAIL TIME for perpetrators.
Many children around the country have… pic.twitter.com/E8CjWohNmO
— Nick Sortor (@nicksortor) May 19, 2025
Rise in Deepfake Porn and Online Exploitation
The Take It Down Act arrives in response to a sharp rise in deepfake pornography and non-consensual image abuse—issues that have impacted public figures like Taylor Swift and Jamie Lee Curtis, as well as minors.
Explicit, AI-generated images of individuals have circulated online without consent, spotlighting the urgent need for regulation.
As one of the first significant federal responses to the dangers of generative AI, the new law makes it a federal crime to publish—or threaten to publish—non-consensual intimate images, including deepfakes, with the intent to harm or harass.
Offenders face serious criminal penalties, including imprisonment and fines.
For adults, the law prohibits posting intimate imagery online without consent when publication causes or is intended to cause harm, particularly when the images were created or obtained under circumstances carrying a reasonable expectation of privacy.
For minors, it applies when publication is intended to harass, abuse, or sexually exploit.
The legislation also compels online platforms that host user-generated content to implement clear takedown procedures and to prevent re-uploads of flagged material.
Backed by near-unanimous support—passing 409–2 in the House and unanimously in the Senate—the bill gained momentum thanks in part to First Lady Melania Trump’s advocacy.
She personally lobbied lawmakers, hosted victims at a Capitol Hill roundtable, and brought survivor Elliston Berry to the president's address to Congress in March.