The Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act (Take It Down Act) is a United States (US) federal law enacted on 19 May 2025.[1]
The Take It Down Act amends 47 U.S. Code § 223 (Code) of the Communications Act of 1934 (US) (Communications Act) by establishing new criminal offences and removal procedures relating to non-consensual intimate imagery (Non-Consensual Intimate Images)[2] and digitally generated forgeries (Digital Forgeries).[3] The effect of the Act is to expand the scope of prohibited conduct involving generative artificial intelligence.
This article provides an overview of the amendments to the Code, including the new criminal offences and the procedures for removing such content from interactive computer services or communication platforms that provide access to user-generated content (Covered Platform).
Summary of the key provisions of the Code
The Take It Down Act amends the Code and introduces a new section, 47 U.S. Code § 223a, to the Communications Act of 1934 (US), broadening the scope of prohibited conduct involving the distribution of Non-Consensual Intimate Images and Digital Forgeries. The law targets non-consensual explicit content such as revenge porn, deepfakes, and other harmful AI-generated material. Key provisions include:
- a 48-hour takedown requirement (platforms must remove explicit content within 48 hours of receiving a valid complaint);
- penalties (those who upload or distribute this content may face up to two (2) years imprisonment, or up to three (3) years where the offence involves a minor);
- focus (the law aims to protect individuals, especially women and minors, from harmful or misleading AI-generated material); and
- connection to the Digital Millennium Copyright Act of 1998 (US) (DMCA) (much like the DMCA, which requires platforms to take down copyright-infringing material upon receiving a formal complaint, the Take It Down Act uses a similar notice-and-takedown system for explicit content. The key difference is the focus on harmful, explicit content rather than copyright violations. Both laws give Covered Platforms a safe harbour from liability if they act quickly and appropriately to remove content).
Introduction of criminal offences
Under the amended Code, it is now a Federal offence to knowingly use an interactive computer service to publish[4] or threaten to publish[5] an intimate visual depiction of an identifiable individual where:
“(i) the intimate visual depiction was obtained or created under circumstances in which the person knew, or reasonably should have known, that the identifiable individual had a reasonable expectation of privacy;
(ii) what is depicted was not voluntarily exposed by the identifiable individual in a public or commercial setting;
(iii) what is depicted is not a matter of public concern; and
(iv) publication of the intimate visual depiction –
(I) is intended to cause harm; or
(II) causes harm, including psychological, financial, or reputational harm, to the identifiable individual.”[6]
This also applies to Digital Forgeries.[7] Violations are punishable by a fine under title 18 of the United States Code, imprisonment for up to two (2) years, or both.[8] Offences involving minors carry increased penalties, including a fine, imprisonment for up to three (3) years, or both.[9]
Notice and removal of Non-Consensual Intimate Images
The Take It Down Act creates a notice and removal regime under which Covered Platforms must establish a process whereby an identifiable individual can notify the platform of an intimate visual depiction published without their consent,[10] and request that the depiction be removed. Covered Platforms have until 19 May 2026 to establish this system.[11]
For a removal request to be valid (Valid Removal Request), it must include the following written information:
- “a physical or electronic signature of the identifiable individual (or an authorized person acting on behalf of such individual);
- an identification of, and information reasonably sufficient for the covered platform to locate, the intimate visual depiction of the identifiable individual;
- a brief statement that the identifiable individual has a good faith belief that any intimate visual depiction identified under clause (ii) is not consensual, including any relevant information for the covered platform to determine the intimate visual depiction was published without the consent of the identifiable individual; and
- information sufficient to enable the covered platform to contact the identifiable individual (or an authorized person acting on behalf of such individual).”[12]
Upon receiving a Valid Removal Request, the Covered Platform must, as soon as possible but no later than forty-eight (48) hours after receipt, “remove the intimate visual depiction; and make reasonable efforts to identify and remove any known identical copies of such depiction.”[13] An intimate visual depiction is defined as a visual depiction, including undeveloped film and videotape, and data capable of conversion into a visual image,[14] containing sexually explicit content of an identifiable individual.[15]
A Covered Platform’s failure to reasonably comply with these obligations is deemed an unfair or deceptive act or practice under section 18(a)(1)(B) of the Federal Trade Commission Act (FTC Act).[16] Accordingly, the Federal Trade Commission (FTC) is granted the authority to enforce this section using the full range of its existing powers and procedures.[17] This includes investigating violations, pursuing enforcement actions, and imposing penalties, as if the relevant provisions of the FTC Act were fully incorporated into the Take It Down Act.
Intersection with DMCA
The DMCA is a United States law that addresses copyright in light of advancements in technology, and it reflects a takedown process similar to that of the Take It Down Act, requiring online service providers and platforms to remove infringing or harmful content to avoid liability. Under both Acts, providers must act quickly to preserve their safe harbour; however, the Take It Down Act imposes a stricter 48-hour timeline for removal of harmful content, compared with the DMCA's more general obligation to act "expeditiously".
Extraterritorial application
Does this Act apply outside the USA?
Although the Take It Down Act refers to 'interstate' or 'foreign' communications, United States law generally operates under a presumption against extraterritoriality, meaning that unless explicitly stated, its provisions are not intended to apply outside United States borders. How, or whether, the Take It Down Act will be interpreted or enforced internationally remains uncertain.
Has the Australian eSafety Commissioner considered similar legislation?
The Online Safety Act 2021 (Cth) (Online Safety Act) Part 6 Division 3 provides a similar framework to the Take It Down Act. It empowers the eSafety Commissioner to issue removal notices for non-consensual intimate images, requiring platforms, particularly those operating within or accessible from Australia, to take down the offending content. While the jurisdictional scope differs, both laws reflect a growing international effort to combat image-based abuse and enhance protections for victims online.
Concluding remarks about the Take It Down Act
The Take It Down Act targets harm that can be caused by generative artificial intelligence, such as deepfakes, with an aim to protect US citizens. The measure balances the need for regulation in the digital space (to protect individuals from harmful content) with a push for innovation and competition in AI development, while leveraging existing frameworks like the DMCA to ensure platforms take responsibility for the content they host. It seems likely that, in the coming years, more countries will enact similar legislation to address the evolving nature of AI technology.
Links and further references
Legislation
The Digital Millennium Copyright Act of 1998 (US)
Further information about privacy compliance for businesses
If your business needs advice on compliance with the Take It Down Act, contact us for a confidential and obligation-free discussion:

Malcolm Burrows B.Bus.,MBA.,LL.B.,LL.M.,MQLS.
Legal Practice Director
T: +61 7 3221 0013 (preferred)
M: +61 419 726 535
E: mburrows@dundaslawyers.com.au

Disclaimer
This article contains general commentary only. You should not rely on the commentary as legal advice. Specific legal advice should be obtained to ascertain how the law applies to your particular circumstances.
[1] Congressional Research Service, The TAKE IT DOWN Act: A Federal Law Prohibiting the Nonconsensual Publication of Intimate Images (Legal Sidebar No LSB11314.1, 20 May 2025) https://www.congress.gov/crs_external_products/LSB/PDF/LSB11314/LSB11314.1.pdf.
[2] 47 USC § 223(2)(a) (2018).
[3] 47 USC § 223(3)(a) (2018).
[4] 47 USC § 223(3)(a) (2018).
[5] 47 USC § 223(6) (2018).
[6] 47 USC § 223(2)(a) (2018).
[7] 47 USC § 223(3)(a) (2018).
[8] 47 USC § 223(4)(a) (2018).
[9] 47 USC § 223(4)(b) (2018).
[10] 47 USC § 223a(a)(1)(A) (2018).
[11] 47 USC § 223(4)(b) (2018).
[12] 47 USC § 223a(a)(1)(B) (2018).
[13] 47 USC § 223a(a)(1)(B)(3) (2018).
[15] 15 USC § 6851(5)(A) (2022).
[16] 47 USC § 223a(b)(1) (2018); 15 USC § 57a.
[17] 47 USC § 223a(b)(2)(A) (2018).