
What is the US Take It Down Act?

Reviewed by Malcolm Burrows

The Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act (Take It Down Act) is a United States (US) federal law enacted on 19 May 2025.[1]

The Take It Down Act amends 47 U.S. Code § 223 (Code) of the Communications Act 1934 (US) (Communications Act) by establishing new criminal offences and removal procedures related to non-consensual intimate imagery (Non-Consensual Intimate Images)[2] and digitally generated forgeries (Digital Forgeries).[3]  The effect of the Act is to expand the scope of prohibited conduct involving content produced by generative artificial intelligence.

This article provides an overview of the amendments made to the Code and includes an introduction to the new criminal offences and the procedures for the removal of such content from interactive computer services or communication platforms that provide access to user-generated content (Covered Platforms).

Summary of the key provisions of the Code

The Take It Down Act amends the Code and introduces a new section, 47 U.S. Code § 223a, into the Communications Act, broadening the scope of prohibited conduct involving the distribution of Non-Consensual Intimate Images and Digital Forgeries.  The law targets non-consensual explicit content such as revenge porn, deepfakes, and other harmful AI-generated material.  Key provisions include:

  • a 48-hour takedown requirement (platforms must remove explicit content within 48 hours of receiving a valid complaint);
  • penalties (those who upload or distribute this content may face up to three (3) years imprisonment);
  • focus (the law aims to protect individuals, especially women and minors, from harmful or misleading AI-generated material); and
  • connection to the Digital Millennium Copyright Act of 1998 (US) (DMCA) (much like the DMCA, which requires platforms to take down copyright-infringing material upon receiving a formal complaint, the Take It Down Act uses a similar notice-and-takedown system for explicit content.  The key difference is the focus on harmful, explicit content rather than copyright violations.  Both laws give Covered Platforms a safe harbour from liability if they act quickly and appropriately to remove content).

Introduction of criminal offences

Under the amended Code, it is now a federal offence to knowingly use an interactive computer service to publish[4] or threaten to publish[5] an intimate visual depiction of an identifiable individual where:

(i)   the intimate visual depiction was obtained or created under circumstances in which the person knew, or reasonably should have known, that the identifiable individual had a reasonable expectation of privacy;

(ii)   what is depicted was not voluntarily exposed by the identifiable individual in a public or commercial setting;

(iii)   what is depicted is not a matter of public concern; and

(iv)   publication of the intimate visual depiction –

(I)   is intended to cause harm; or

(II) causes harm, including psychological, financial, or reputational harm, to the identifiable individual.[6]

This also applies to Digital Forgeries.[7]  Violations are punishable by a fine under title 18 of the United States Code, imprisonment for up to two (2) years, or both.[8]  Offences involving minors carry increased penalties, including a fine, imprisonment for up to three (3) years, or both.[9]

Notice and removal of Non-Consensual Intimate Images

The Take It Down Act creates a notice and removal requirement under which Covered Platforms must establish a process whereby an identifiable individual can notify the platform of an intimate visual depiction published without their consent and request its removal.[10]  Covered Platforms have until 19 May 2026 to establish this process.[11]

For a removal request to be valid (Valid Removal Request), it must include the following written information:

  (i)  a physical or electronic signature of the identifiable individual (or an authorized person acting on behalf of such individual);
  (ii)  an identification of, and information reasonably sufficient for the covered platform to locate, the intimate visual depiction of the identifiable individual;
  (iii)  a brief statement that the identifiable individual has a good faith belief that any intimate visual depiction identified under clause (ii) is not consensual, including any relevant information for the covered platform to determine the intimate visual depiction was published without the consent of the identifiable individual; and
  (iv)  information sufficient to enable the covered platform to contact the identifiable individual (or an authorized person acting on behalf of such individual).[12]


Upon receiving a Valid Removal Request, the Covered Platform must, as soon as possible but no later than forty-eight (48) hours after receipt, “remove the intimate visual depiction; and make reasonable efforts to identify and remove any known identical copies of such depiction”.[13]  An intimate visual depiction is defined as a visual depiction, including undeveloped film and videotape, and data capable of conversion into a visual image,[14] containing sexually explicit content of an identifiable individual.[15]
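By way of illustration only, the following sketch (in Python, using hypothetical field and function names) shows how a Covered Platform might record the four required elements of a Valid Removal Request and calculate the 48-hour removal deadline from the time of receipt.  It is a simplified example based on the statutory elements set out above, not a compliance tool.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta, timezone
from typing import Optional

# Statutory removal window: no later than 48 hours after receipt of a valid request.
REMOVAL_WINDOW = timedelta(hours=48)

@dataclass
class RemovalRequest:
    """Hypothetical record mirroring the four elements of a Valid Removal Request."""
    signature: str              # physical or electronic signature of the individual (or authorised person)
    depiction_identifier: str   # information reasonably sufficient to locate the depiction (e.g. a URL)
    good_faith_statement: str   # brief statement that the depiction is non-consensual
    contact_information: str    # information sufficient to contact the individual
    received_at: Optional[datetime] = None  # timestamp recorded by the platform on receipt

def is_valid(request: RemovalRequest) -> bool:
    """Treat a request as valid only if all four required elements are present."""
    return all([
        request.signature.strip(),
        request.depiction_identifier.strip(),
        request.good_faith_statement.strip(),
        request.contact_information.strip(),
    ])

def removal_deadline(request: RemovalRequest) -> datetime:
    """Latest time by which the depiction (and known identical copies) must be removed."""
    if request.received_at is None:
        raise ValueError("receipt time must be recorded before computing the deadline")
    return request.received_at + REMOVAL_WINDOW

# Example: a request received now must be actioned within 48 hours.
request = RemovalRequest(
    signature="J. Citizen",
    depiction_identifier="https://example.com/post/12345",
    good_faith_statement="I did not consent to the publication of this image.",
    contact_information="j.citizen@example.com",
    received_at=datetime.now(timezone.utc),
)
if is_valid(request):
    print("Remove by:", removal_deadline(request).isoformat())
```

Recording the time of receipt matters because the forty-eight (48) hour window runs from receipt of the Valid Removal Request, and the obligation extends to making reasonable efforts to identify and remove known identical copies of the depiction.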

A Covered Platform’s failure to reasonably comply with these obligations is deemed an unfair or deceptive act or practice under section 18(a)(1)(B) of the Federal Trade Commission Act (FTC Act).[16]  Accordingly, the Federal Trade Commission (FTC) is granted the authority to enforce this section using the full range of its existing powers and procedures.[17]  This includes investigating violations, pursuing enforcement actions, and imposing penalties, as if the relevant provisions of the FTC Act were fully incorporated into the Take It Down Act.

Intersection with DMCA

The DMCA is a United States law that addresses copyright in the context of advancing technology and contains a takedown process similar to that of the Take It Down Act, requiring online service providers and platforms to remove infringing content in order to avoid liability.  Under both Acts, platforms must act promptly to avoid liability; however, the Take It Down Act imposes a stricter 48-hour timeline for removal of harmful content, compared with the more general obligation under the DMCA to act “expeditiously”.

Extraterritorial application

Does this Act apply outside the USA?

Although the Take It Down Act refers to ‘interstate’ or ‘foreign’ communications, United States law generally operates under a presumption against extraterritoriality, meaning that unless explicitly stated, its provisions are not intended to apply outside United States borders.  How, or whether, the Take It Down Act will be interpreted or enforced internationally remains uncertain.

Has the Australian eSafety Commissioner considered similar legislation?

Part 6, Division 3 of the Online Safety Act 2021 (Cth) (Online Safety Act) provides a similar framework to the Take It Down Act.  It empowers the eSafety Commissioner to issue removal notices for non-consensual intimate images, requiring platforms, particularly those operating within or accessible from Australia, to take down the offending content.  While the jurisdictional scope differs, both laws reflect a growing international effort to combat image-based abuse and enhance protections for victims online.

Concluding remarks about the Take It Down Act

The Take It Down Act targets harm that can be caused by generative artificial intelligence, such as deepfakes, with the aim of protecting individuals in the US.  The measure balances the need for regulation in the digital space (to protect individuals from harmful content) with a push for innovation and competition in AI development, while drawing on existing frameworks, such as the DMCA's notice-and-takedown model, to ensure platforms take responsibility for the content they host.  It seems likely that in the coming years more countries will enact similar legislation to address the evolving nature of AI technology.

Links and further references

Legislation

Tools to Address Known Exploitation by Immobilizing Technological Deepfakes on Websites and Networks Act 2025 (US)

Digital Millennium Copyright Act of 1998 (US)

Online Safety Act 2021 (Cth)

47 U.S. Code § 223

47 U.S. Code § 223a

Communications Act 1934 (US)

Further information about privacy compliance for businesses

If your business needs advice on compliance with the Take It Down Act, contact us for a confidential and obligation-free discussion.

Doyles Recommended TMT Lawyer 2024

[1]     Congressional Research Service, The TAKE IT DOWN Act: A Federal Law Prohibiting the Nonconsensual Publication of Intimate Images (Legal Sidebar No LSB11314.1, 20 May 2025) https://www.congress.gov/crs_external_products/LSB/PDF/LSB11314/LSB11314.1.pdf.

[2]     47 USC § 223(2)(a) (2018).

[3]     47 USC § 223(3)(a) (2018).

[4]     47 USC § 223(3)(a) (2018).

[5]     47 USC § 223(6) (2018).

[6]   47 USC § 223(2)(a) (2018).

[7]   47 USC § 223(3)(a) (2018).

[8]   47 USC § 223(4)(a) (2018).

[9]   47 USC § 223(4)(b) (2018).

[10] 47 USC § 223a (a)(1)(A) (2018).

[11] 47 USC § 223(4)(b) (2018).

[12] 47 USC § 223a (a)(1)(B) (2018).

[13] 47 USC § 223a (a)(1)(B)(3) (2018).

[14] 18 USC § 2256(5).

[15] 15 USC § 6851(5)(A) (2022).

[16] 47 USC § 223a (b)(1) (2018); 15 U.S. Code § 57a.

[17] 47 USC § 223a (b)(2)(A) (2018).


