In a major move to curb the dark side of artificial intelligence, President Donald Trump has signed a powerful new law that makes it a federal crime to publish AI-generated explicit images without consent.

On May 19, President Trump officially signed the TAKE IT DOWN Act—short for Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks—marking a major victory in the fight against digital exploitation.

The legislation, strongly backed by First Lady Melania Trump, targets the growing misuse of deepfake technology, particularly the disturbing trend of nonconsensual explicit content. The law mandates that any such content—whether real or AI-generated—must be taken down within 48 hours once reported. Offenders could face serious penalties ranging from hefty fines to jail time.

> “This law protects Americans—especially women and children—from the cruelty of weaponized deepfakes,” Trump said during a speech at the White House Rose Garden, later sharing his remarks on Truth Social.

Melania Trump, who played a pivotal role in pushing the bill through Congress, called it a “national victory” and warned of the broader dangers AI poses to young minds.

> “AI and social media are like digital candy—sweet, addictive, and dangerously influential,” she said. “But unlike sugar, these tools can be weaponized to manipulate, harm, and even destroy lives.”

Introduced by Senators Ted Cruz and Amy Klobuchar in June 2024, the TAKE IT DOWN Act drew bipartisan support, passing the Senate in February 2025 and the House in April 2025.

---

## The U.S. Joins the Fight Against Deepfake Abuse

The new law comes in the wake of alarming incidents, including the viral spread of explicit AI-generated images of Taylor Swift on X (formerly Twitter) in early 2024. The platform was forced to temporarily ban searches for her name, while lawmakers scrambled to introduce emergency legislation in response.

The U.S. is now one of several countries taking legal action against deepfake pornography. The UK, for instance, outlawed the practice under its Online Safety Act of 2023.

A shocking 2023 study by Security Hero revealed that over 99% of deepfake porn targets are women, underscoring the gendered nature of this technological abuse.
