The US Senate’s Passage of the TAKE IT DOWN Act is Progress on an Urgent, Growing Problem

Sunny Gandhi, Adam Billen / Feb 21, 2025

Sunny Gandhi is the Vice President of Political Affairs and Adam Billen is the Vice President of Public Policy at Encode.

The US Senate Chamber in 1873 in a restored image from a glass negative. Brady-Handy Photograph Collection (Library of Congress), Public domain, via Wikimedia Commons

On February 13, the United States Senate unanimously passed the TAKE IT DOWN Act, a bill that makes it unlawful to knowingly publish “nonconsensual intimate visual depictions,” including “digital forgeries” created with AI software, and requires covered technology platforms to remove reported content after receiving a valid request. For the 119th Congress, which just got underway, Representatives Maria Elvira Salazar (R-FL) and Madeleine Dean (D-PA) reintroduced companion legislation in the House. The legislation has been endorsed by over 100 organizations, including platforms like Microsoft, Snap, and Meta; sexual violence organizations like the Rape, Abuse & Incest National Network and the National Center on Sexual Exploitation; and civil society organizations like SAG-AFTRA and the American Principles Project.

The support for the legislation is driven by the growing scale of the problem. Over the past few years, the technical skill required to create AI-generated nudes has dropped sharply. Today, a single clothed photo of a person and an internet connection is all someone needs to create and distribute hundreds or thousands of fake nude images of their victim. The result is a crisis that is rapidly spiraling out of control.

Surveys now indicate that one in ten adults has either been victimized by or knows a victim of AI-generated nudes. The problem is even worse in schools: nearly one in five high schoolers say they know of a peer at their school who has been victimized. A poll conducted by YouGov for Tech Policy Press last summer found that one in ten American voters were “personally aware” of incidents involving the sharing of nonconsensual AI-generated intimate imagery in their communities. The sheer scale of online activity is hard to grasp: the ten leading websites dedicated to “deepfake porn” drew monthly traffic of over 34 million users in 2023, and nearly ten thousand additional sites distribute similar content.

A tangled web of outdated policy, highly diffuse technology, and the difficulty of prosecuting cross-border disputes has raised complex barriers to solving the issue.

The primary challenge to addressing AI-generated intimate imagery is the highly decentralized technology underlying its production. Many of the models employed are based on publicly available open-source frameworks. Once these models are fine-tuned for generating intimate imagery, they can be shared, replicated, and modified with minimal oversight. This open diffusion means that even if regulators target a particular provider, countless other copies already exist and are circulating. Efforts to regulate or remove a single model will have little effect on the broader ecosystem of AI tools.

This enforcement challenge is compounded by the fact that many of the websites and apps facilitating access to these models, with names such as “ClothOff” or “Nudify,” are hosted overseas. Operated by criminal networks and structured as shell companies across various jurisdictions, these platforms are largely insulated from US legal action. Even if domestic regulators succeeded in shutting down specific websites, the global availability of these tools means that perpetrators can easily migrate to new platforms outside US jurisdiction.

The final major obstacle lies in outdated legal frameworks. The two bodies of law most applicable to this domain, existing revenge porn and child sexual abuse laws, were written before AI-generated intimate imagery became a major issue. Victims pursuing legal action against their abusers under revenge porn laws have struggled to argue in court that statutes never written to explicitly include inauthentic images should apply to this new domain. Given this gap, many victims are advised by their attorneys not to pursue legal action. Even in cases that seem clear-cut, such as child sexual abuse material, there are caveats. While the FBI has made clear that AI-generated child sexual abuse material (CSAM) is illegal, the law currently allows for its creation and distribution so long as the material is not “sexually explicit” in nature. This technicality means that AI-generated CSAM can proliferate online so long as it does not include:

sexual intercourse, bestiality, masturbation, sadistic or masochistic abuse, or content that overtly intends to arouse or appeal to sexual desire.

In response to these challenges, lawmakers at both the state and federal levels have proposed measures that combine civil remedies, criminal penalties, and platform accountability.

States have largely focused on establishing civil and criminal penalties in their own legislation. However, enforcing those penalties has proven difficult. Because of the nature of the platforms where this content is created and distributed, and because a victim and perpetrator need not reside in the same state, abuse often crosses state jurisdictions. That means that even if every state passed legislation establishing civil and criminal penalties, many victims would still lack the tools to hold perpetrators accountable.

At Encode, we have been advocating for two key pieces of legislation in Congress: the DEFIANCE Act and the TAKE IT DOWN Act. If passed, the two bills would cover all three policy mechanisms: civil penalties, criminal penalties, and platform accountability.

The DEFIANCE Act, introduced by Senators Dick Durbin (D-IL) and Lindsey Graham (R-SC), would amend an existing civil right of action, originally established under the 2022 reauthorization of the Violence Against Women Act, to explicitly include AI-generated intimate imagery. By building on a legally and politically tested framework, the DEFIANCE Act provides a clear and immediate path for victims to seek redress. Its support spans major groups, including the Sexual Violence Prevention Association, technology policy advocates like the Center for Democracy and Technology, industry associations like the Software & Information Industry Association, and major tech companies like Snapchat. Notably, after overcoming an initial objection by Senator Cynthia Lummis (R-WY), the bill passed the Senate by unanimous consent in July of last year.

The TAKE IT DOWN Act, introduced by Senators Ted Cruz (R-TX) and Amy Klobuchar (D-MN), is a broader proposal that would create criminal penalties for the non-consensual publishing of both authentic and AI-generated intimate images. Critically, it would patch the existing loophole in CSAM law by covering nude images published with the intent to “abuse, humiliate, harass, or degrade” a minor rather than only “sexually explicit” images. The bill also mandates that large online platforms establish a process for victims to report non-consensual intimate imagery, requiring removal within 48 hours. This focus on accountability for US-based platforms allows victims to take back control of their images without attempting to regulate companies operating abroad. The TAKE IT DOWN Act does this by classifying failure to comply as an unfair or deceptive trade practice under the Federal Trade Commission Act rather than by attempting to alter Section 230.

Groups like the Electronic Frontier Foundation (EFF) and the Center for Democracy and Technology (CDT) have expressed concerns that provisions of the legislation could threaten free speech and encrypted platforms. Some of these concerns were addressed after the TAKE IT DOWN Act was initially blocked on the floor by Senator Cory Booker (D-NJ), who later endorsed the bill after a round of amendments in November. Those amendments replaced the term “deepfake” with “digital forgery” to match the definition in the DEFIANCE Act, modified the criminal offense for digital forgeries to require proof of a lack of consent, and lowered the penalties for publishing digital forgeries. The bill then made it into the original Continuing Resolution agreed on between the House and Senate but was cut from the final Continuing Resolution that eventually passed. While the concerns raised by groups like EFF and CDT remain, the legislation has been endorsed by over 100 organizations, including platforms and industry groups like Microsoft, TechNet, and the Software & Information Industry Association; sexual violence organizations like the Rape, Abuse & Incest National Network and the National Center on Sexual Exploitation; and civil society organizations like SAG-AFTRA and the American Principles Project. Additionally, Senators well known for their support of encryption, like Rand Paul (R-KY) and Ron Wyden (D-OR), and companies that use end-to-end encryption, like Snap, Meta, and Google, support the legislation.

A recent high-profile case in Florida, in which two teenagers aged 13 and 14 were charged under a state law that made creating non-consensual intimate images a felony, sparked controversy over the law’s severe penalties and highlighted the need for legislation on this issue to be carefully crafted. The law the teenagers were charged under imposed higher penalties for inauthentic images than for authentic ones. The TAKE IT DOWN Act, in contrast, proposes lower, more proportionate penalties based on the age of the victim and on whether the images are authentic or AI-generated.

Both the DEFIANCE and TAKE IT DOWN Acts have secured rare levels of bipartisan support and endorsements from a unique coalition made up of industry associations, organizations focused on sexual violence, tech policy advocates, and kids’ safety groups.

A central legal challenge has been ensuring that any new restrictions on nonconsensual intimate imagery do not infringe on First Amendment rights. Although some forms of “obscene” explicit imagery might fall outside constitutional protection, many cases of “non-obscene” AI-generated intimate imagery would not. Courts, therefore, are likely to treat bills targeted at explicit imagery as content-based restrictions subject to strict scrutiny. This would require bills like DEFIANCE and TAKE IT DOWN to be the “least restrictive means of advancing a compelling government interest.” While some older revenge porn legislation has been challenged under the First Amendment for being too broad or vague, the TAKE IT DOWN and DEFIANCE Acts are designed to be as narrowly tailored as possible to meet the standard of strict scrutiny. Importantly, both bills require either actual harm or intent to cause harm to trigger their civil and criminal penalties. This prevents innocent third parties who distribute images for innocuous reasons, such as forwarding an image they received to a friend to seek support, from being unintentionally caught up in the legislation.

The complexity of regulating AI-generated intimate imagery lies in its distributed technology, international enforcement challenges, and the outdated legal structures currently in place. The proposed legislation represents a coordinated effort to empower victims, enhance deterrence, and improve platform accountability.

The TAKE IT DOWN Act has already passed the Senate by unanimous consent, and the DEFIANCE Act appears poised to follow shortly. By passing these measures, Congress can put power back in the hands of victims, close dangerous legal loopholes, deter would-be perpetrators, and ensure platforms work to create a safe online future.

