Take It Down Act, addressing nonconsensual deepfakes and ‘revenge porn,’ passes. What is it?
Apr 29, 2025, 12:20 PM

First lady Melania Trump listens as President Donald Trump speaks with reporters as she and the President depart on Marine One from the South Lawn of the White House, Friday, April 25, 2025, in Washington. The President and first lady will be traveling to Rome and the Vatican to attend the funeral for Pope Francis. (AP Photo/Alex Brandon)
Congress has overwhelmingly approved bipartisan legislation to enact stricter penalties for the distribution of non-consensual intimate imagery, sometimes called "revenge porn." Known as the Take It Down Act, the bill is now headed to President Donald Trump's desk for his signature.
The measure was introduced by Sen. Ted Cruz, a Republican from Texas, and Sen. Amy Klobuchar, a Democrat from Minnesota, and later gained the support of First Lady Melania Trump. Critics of the bill, which addresses both real and artificial intelligence-generated imagery, say its language is too broad and could lead to censorship and raise First Amendment concerns.
The bill makes it illegal to "knowingly publish" or threaten to publish intimate images without a person's consent, including AI-created "deepfakes." It also requires websites and social media companies to remove such material within 48 hours of notice from a victim. The platforms must also take steps to delete duplicate content. Many states have already banned the dissemination of sexually explicit deepfakes or revenge porn, but the Take It Down Act is a rare example of federal regulators imposing requirements on internet companies.
The Take It Down Act has garnered strong bipartisan support and has been championed by Melania Trump, who lobbied on Capitol Hill in March, saying it was "heartbreaking" to see what teenagers, especially girls, go through after they are victimized by people who spread such content. President Trump is expected to sign it into law.
Cruz said the measure was inspired by Elliston Berry and her mother, who visited his office after Snapchat refused for nearly a year to remove an AI-generated "deepfake" of the then-14-year-old.
Meta, which owns and operates Facebook and Instagram, supports the legislation.
"Having an intimate image – real or AI-generated – shared without consent can be devastating and Meta developed and backs many efforts to help prevent it," Meta spokesman Andy Stone said last month.
The Information Technology and Innovation Foundation, a tech industry-supported think tank, said in a statement Monday that the bill's passage "is an important step forward that will help people pursue justice when they are victims of non-consensual intimate imagery, including deepfake images generated using AI."
"We must provide victims of online abuse with the legal protections they need when intimate images are shared without their consent, especially now that deepfakes are creating horrifying new opportunities for abuse," Klobuchar said in a statement after the bill's passage late Monday. "These images can ruin lives and reputations, but now that our bipartisan legislation is becoming law, victims will be able to have this material removed from social media platforms and law enforcement can hold perpetrators accountable."
Free speech advocates and digital rights groups say the bill is too broad and could lead to the censorship of legitimate images, including legal pornography and LGBTQ content, as well as speech from government critics.
"While the bill is meant to address a serious problem, good intentions alone are not enough to make good policy," said the nonprofit Electronic Frontier Foundation, a digital rights advocacy group. "Lawmakers should be strengthening and enforcing existing legal protections for victims, rather than inventing new takedown regimes that are ripe for abuse."
The takedown provision in the bill "applies to a much broader category of content – potentially any images involving intimate or sexual content" than the narrower definitions of non-consensual intimate imagery found elsewhere in the text, EFF said.
"The takedown provision also lacks critical safeguards against frivolous or bad-faith takedown requests. Services will rely on automated filters, which are infamously blunt tools," EFF said. "They frequently flag legal content, from fair-use commentary to news reporting. The law's tight time frame requires that apps and websites remove speech within 48 hours, rarely enough time to verify whether the speech is actually illegal."
As a result, the group said online companies, especially smaller ones that lack the resources to wade through a lot of content, "will likely choose to avoid the onerous legal risk by simply depublishing the speech rather than even attempting to verify it."
The measure, EFF said, also pressures platforms to "actively monitor speech, including speech that is presently encrypted" to address liability threats.
The Cyber Civil Rights Initiative, a nonprofit that helps victims of online crimes and abuse, said it has "serious reservations" about the bill. It called its takedown provision "unconstitutionally vague, unconstitutionally overbroad, and lacking adequate safeguards against misuse."
For instance, the group said, platforms could be obligated to remove a journalist鈥檚 photographs of a topless protest on a public street, photos of a subway flasher distributed by law enforcement to locate the perpetrator, commercially produced sexually explicit content or sexually explicit material that is consensual but falsely reported as being nonconsensual.