US President Donald Trump has enacted a new law that criminalizes the sharing of intimate images, whether fake or real, created without an individual’s permission.
The legislation specifically targets videos and pictures generated using artificial intelligence—commonly known as deepfakes—which are frequently employed for online harassment or embarrassment.
Under these new regulations, anyone who distributes such content without consent could face a prison sentence of up to three years.
The “Take It Down Act,” passed with overwhelming bipartisan congressional support, not only criminalizes the non-consensual publication of intimate images but also mandates their removal from online platforms.
“With the rise of AI image generation, countless women have been harassed with deepfakes and other explicit images distributed against their will,” Trump stated at a signing ceremony held in the Rose Garden of the White House.
“And today we’re making it totally illegal,” the president declared. “Anyone who intentionally distributes explicit images without the subject’s consent will face up to three years in prison.”
First Lady Melania Trump, who endorsed the bill in early March, attended the signing ceremony in a rare public appearance; she has spent limited time in Washington since her husband took office on January 20.
During her remarks at the signing ceremony, she characterized the bill as a “national victory that will help parents and families protect children from online exploitation.”
“This legislation is a powerful step forward in our efforts to ensure that every American, especially young people, can feel better protected from their image or identity being abused,” she affirmed.
Deepfakes often utilize artificial intelligence and other technological tools to produce realistic-looking fabricated videos.
They can be used to create falsified explicit imagery of real women, which is then published without their consent and widely disseminated.
While some US states, including California and Florida, have existing laws criminalizing the publication of sexually explicit deepfakes, critics have voiced concerns that the “Take It Down Act” grants authorities increased censorship powers.
The Electronic Frontier Foundation, a non-profit organization dedicated to free expression, has argued that the bill provides “the powerful a dangerous new route to manipulate platforms into removing lawful speech that they simply don’t like.”
The bill will require social media platforms and websites to implement procedures for the swift removal of non-consensual intimate imagery upon notification from a victim.
Harassment, bullying, blackmail
Fueled by the widespread availability of AI tools, including photo applications that digitally "undress" women, the rapid online proliferation of non-consensual deepfakes is outpacing global efforts to regulate the technology.
While high-profile politicians and celebrities, including singer Taylor Swift, have fallen victim to deepfake videos, experts caution that women not in the public eye are equally vulnerable.
A wave of AI video scandals has been reported in schools across the United States, with hundreds of teenagers targeted by their own classmates.
Such non-consensual imagery can lead to harassment, bullying, or blackmail, sometimes resulting in devastating mental health consequences, experts warn.
Renee Cummings, an AI and data ethicist and criminologist at the University of Virginia, described the bill as a “significant step” in addressing the exploitation enabled by AI-generated deepfakes and non-consensual imagery.
“Its effectiveness will depend on swift and sure enforcement, severe punishment for perpetrators and real-time adaptability to emerging digital threats,” Cummings told AFP.