The Take it Down Act: Federal Protections Against Digital Exploitation Bring Compliance Obligations for Online Platforms

On May 19, 2025, President Donald Trump signed into law the bipartisan Take It Down Act, which is aimed at combating the distribution of nonconsensual intimate imagery, including both authentic and AI-generated “deepfakes.” The law was championed by Senators Ted Cruz and Amy Klobuchar, with support from a broad coalition including victim advocates, technology companies, and law enforcement groups. 

The Take It Down Act imposes requirements on “covered platforms,” which are defined as: (1) websites, online services, online applications, or mobile applications that serve the public and primarily provide a forum for user-generated content, including messages, videos, images, games, and audio files; or (2) online services that are in the business of publishing, curating, hosting, or making available nonconsensual intimate visual depictions. Within one year of the law’s enactment, covered platforms must establish a process for individuals to notify the platform of nonconsensual intimate images and to request their removal. Upon receiving a valid request, a platform must remove the image within 48 hours and “make reasonable efforts” to identify and remove any identical copies of the depiction. Nonconsensual intimate images are referred to as “intimate visual depictions,” which are defined by reference to an existing statute as explicit images that reveal genitals, depict sexual acts, or are otherwise sexually explicit.

The Act also makes it illegal to use an interactive computer service, as defined by Section 230 of the Communications Decency Act, to knowingly publish an intimate visual depiction of an identifiable individual, or a digital forgery of one, if the content was published without consent, outside of a public setting, not as a matter of public concern, and with an intent to cause harm or in a manner that otherwise causes harm. To address concerns that the Act’s definitions could be interpreted too broadly, potentially leading to the removal of content that is not truly harmful or to censorship of lawful speech, the Act requires that a digital forgery be indistinguishable from an authentic image to a “reasonable person” and includes exceptions for matters of public concern and lawful investigative activities.
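For platforms designing the required notice-and-takedown process, the sketch below illustrates one way the statutory elements might map onto an intake workflow: a 48-hour removal deadline measured from receipt of a valid request, plus a duplicate-identification step. This is a minimal illustration only; the class and function names, and the use of exact hash matching to locate identical copies, are assumptions for demonstration, not technical requirements drawn from the Act.

```python
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone
import hashlib

# The 48-hour window comes from the Act; everything else here is an
# illustrative assumption about how a platform might implement it.
REMOVAL_WINDOW = timedelta(hours=48)


@dataclass
class TakedownRequest:
    requester_id: str
    image_bytes: bytes
    received_at: datetime = field(
        default_factory=lambda: datetime.now(timezone.utc)
    )

    @property
    def deadline(self) -> datetime:
        # A valid request starts the 48-hour clock for removal.
        return self.received_at + REMOVAL_WINDOW


def content_hash(data: bytes) -> str:
    # Exact-duplicate detection via a cryptographic hash. A production
    # system would likely add perceptual hashing to catch re-encoded copies,
    # which the statute's "reasonable efforts" language may contemplate.
    return hashlib.sha256(data).hexdigest()


def find_identical_copies(
    request: TakedownRequest, library: dict[str, list[str]]
) -> list[str]:
    # `library` (hypothetical) maps content hashes to the URLs where that
    # content is hosted; matches are candidates for removal alongside the
    # originally reported image.
    return library.get(content_hash(request.image_bytes), [])
```

A scheduler that escalates any request approaching its `deadline` without a recorded removal would be a natural companion to this intake step, since the statute ties compliance to the 48-hour window rather than to best efforts alone.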

Enforcement authority under the Act is vested in the Federal Trade Commission (FTC), which is empowered to ensure that digital platforms comply with the Act’s requirements. Notably, the Act provides that a failure to reasonably comply with the notice-and-takedown obligations is treated as a violation of an FTC rule under Section 18 of the FTC Act, which should allow the FTC to seek civil penalties in an enforcement action, presently up to $53,088 for each violation. In addition to for-profit entities, the FTC also has jurisdiction over organizations “that are not organized to carry on business for their own profit or that of their members.” The law also creates new federal criminal offenses: it is now illegal to knowingly publish, or threaten to publish, nonconsensual intimate images or digital forgeries in interstate commerce, with enhanced penalties if the victim is a minor. For offenses involving adults, violators may face up to two years in prison; for offenses involving minors, the penalty increases to up to three years. The Act also mandates forfeiture of materials and property used in the commission of these crimes and requires restitution to victims.

In addition to the new federal standards set by the Take It Down Act, nearly every state in the U.S. already has laws addressing the nonconsensual distribution of intimate images, often referred to as “revenge porn” statutes. These laws prohibit the distribution or production of nonconsensual pornography, with penalties and definitions varying widely from state to state. Some states have also created or amended laws specifically to address deepfake pornography, either by referencing AI-generated images directly or by broadening existing statutes to include digital forgeries. For example, states such as Louisiana, Minnesota, and New York have explicit provisions targeting deepfake imagery, while others rely on more general language to encompass technology-driven offenses. While state laws continue to operate and may provide additional avenues for prosecution or civil remedies, the Take It Down Act creates a uniform federal framework that addresses gaps and inconsistencies in state approaches, particularly with respect to digital platforms and AI-generated content.
