Trump Signs Take It Down Act: Making Deepfake Pornography a Federal Crime, Protecting Users from AI Abuse

President Donald Trump has signed the bipartisan Take It Down Act into law, marking a major step forward in the federal effort to combat the spread of deepfake pornography and non-consensual intimate images online. The new law imposes strict regulations on technology platforms and provides legal protections for victims – especially women and minors – from being exploited through AI technology.

The law goes into effect immediately and requires online platforms to remove non-consensual intimate images within 48 hours of receiving a request from a victim or their legal representative.

Melania Trump: “A National Victory” for Children and Women
Joining the President at the signing ceremony, First Lady Melania Trump described the bill as a “national victory” and a major step forward in the campaign she has led to protect children from the dangers of digital technology.

“This law is a powerful step forward in our efforts to protect young people and women from the exploitation of non-consensual intimate images – whether real or AI-generated,” Melania said.

She noted that AI and social media have become the “digital candy” of the new generation – engaging, addictive, and capable of having a profound impact on the cognition and mental health of young people. “But unlike sugar, these technologies can be weaponized – and that is what this law is fighting,” she added.

Federal Crimes for Deepfake Pornography
The Take It Down Act criminalizes the distribution, or threatened distribution, of non-consensual intimate images of adults or minors, including deepfakes, for the purpose of causing harm, harassment, or sexual arousal. Violators could face federal fines and prison time.

In the case of adults, if the images were created or obtained in a situation where there was a reasonable expectation of privacy and then published without consent, the publication is punishable under federal law. For minors, the law prohibits any publication intended for abuse, harassment, or personal gratification.

Tech platforms required to respond quickly
The law requires all websites and platforms that host user content to establish a transparent, effective removal process and respond within 48 hours of receiving a valid request. They are also required to prevent the re-publishing of content that has been removed.

The bill received near-unanimous support in the US Congress, with the House voting 409–2 and the Senate passing unanimously – reflecting a rare bipartisan consensus on the threat posed by deepfake technology.

High-profile cases that prompted the legislation
The bill comes amid a series of high-profile deepfake incidents. Celebrities such as Taylor Swift and Jamie Lee Curtis have had their likenesses used without permission in AI-generated videos. Curtis recently publicly criticized Meta CEO Mark Zuckerberg for failing to quickly remove a deepfake ad that featured her likeness.

In addition, incidents at schools – such as one in Sydney, Australia – have raised global alarm about the threat of AI in education. A student there was investigated for using AI to create deepfake pornographic images of his classmates.

A symbolic act
The Take It Down Act is not only a legislative step forward, but also a signal of how the United States is adapting its laws to new technological challenges. With strong support from both the President and the First Lady, along with efforts from victims and advocates, it is seen as the beginning of a new legal era for protecting people in the digital world.