What Is the AI-focused COPIED Act? 

The Content Origin Protection and Integrity from Edited and Deepfaked Media Act, known as the COPIED Act, is a proposed piece of United States legislation designed to address growing concerns about the effects of artificial intelligence (AI) on media authenticity and creators’ rights. As AI increasingly shapes content creation, this bipartisan bill proposes a framework for protecting original works from unauthorised use and alteration by AI technologies. Introduced in 2024, the COPIED Act seeks to safeguard creators in fields such as music, journalism, and the visual arts by providing a legal basis for tracking content provenance, setting standards for synthetic content detection, and clarifying copyright protections in the AI age.

Background of the COPIED Act

The COPIED Act emerged as a response to the expanding capabilities of AI to produce realistic, machine-generated content, often indistinguishable from human-created works. With AI technology now able to mimic voices, create synthetic images, and replicate artistic styles, many creators have expressed concerns over the risk of losing control over their works. This risk is especially pertinent in cases of “deepfakes” — media manipulated or fabricated through AI to deceive audiences or misrepresent individuals. These issues have become critical enough to draw the attention of U.S. lawmakers, who recognise that as AI capabilities grow, so do the challenges around originality, authenticity, and fair compensation in creative industries.

The Act is supported by various industry groups, including unions like SAG-AFTRA and organisations like the Recording Industry Association of America, which see it as a vital step toward preserving the integrity of creative works. The bill, sponsored by Senators Marsha Blackburn, Maria Cantwell, and Martin Heinrich, has attracted significant backing in Congress but also faces questions about its potential impact on copyright law, freedom of speech, and AI innovation.

Objectives of the COPIED Act

The COPIED Act has several primary goals:

  1. Enhancing Transparency in AI-Generated Content: The Act mandates the use of “content provenance information,” a digital watermark or similar technology that identifies the origin of a piece of media. This feature allows creators to establish the authenticity of their work, helping to differentiate it from AI-generated content.
  2. Limiting AI Access to Copyrighted Materials: The COPIED Act aims to give creators more control over how their works are used by prohibiting the use of content with provenance information for AI training or generation without consent. This restriction addresses a major concern among creators whose work could otherwise be used to improve AI models without compensation or permission.
  3. Setting Industry Standards for Synthetic Content Detection: The Act calls upon the National Institute of Standards and Technology (NIST) to create guidelines and standards that can help detect and label AI-generated or synthetically altered content. These standards will aid both creators and consumers in identifying and tracing the origins of digital media, enabling better management of synthetic content.
  4. Preventing Misuse and Deepfake Media: The Act also aims to curb the spread of misleading or harmful AI-altered media by establishing legal grounds for creators to protect their likenesses and creative content. This measure is intended to reduce the misuse of digital representations in ways that can deceive or mislead viewers.

Role of the National Institute of Standards and Technology (NIST)

A central part of the COPIED Act is its directive for the National Institute of Standards and Technology (NIST) to develop guidelines and standards for synthetic content detection and content provenance; enforcement of the Act’s requirements would fall to the Federal Trade Commission and state attorneys general. NIST is tasked with researching and defining standards to support transparency and authenticity in digital media. Under the COPIED Act, NIST would oversee the creation of tools and standards that allow creators to add provenance information to their works. These standards would give content creators the means to secure their original work and give consumers the ability to verify the authenticity of the digital content they encounter.

Additionally, NIST’s work on synthetic content detection would have implications across various sectors, from journalism and entertainment to government communications. By setting up a framework for digital watermarking and the tracking of media origin, the Act aims to create a standardised approach for distinguishing original content from synthetic or AI-generated material.
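
The Act does not prescribe a particular technical mechanism, but existing industry efforts such as C2PA content credentials give a sense of what provenance information can look like in practice: a cryptographic hash of the media bound to signed origin metadata. The Python sketch below illustrates that general idea in a deliberately simplified form; the manifest fields, the consent flag, and the use of a shared HMAC key are illustrative assumptions, not anything specified by the bill or by NIST.

```python
import hashlib
import hmac
import json
from datetime import datetime, timezone


def create_provenance_manifest(media_bytes: bytes, creator: str, signing_key: bytes) -> dict:
    """Bind origin metadata to a media file via a content hash and a keyed signature.

    Illustrative only: real content-credential schemes (e.g. C2PA) use
    public-key signatures and certificate chains rather than a shared HMAC key.
    """
    manifest = {
        "content_hash": hashlib.sha256(media_bytes).hexdigest(),
        "creator": creator,
        "created_at": datetime.now(timezone.utc).isoformat(),
        "ai_generated": False,        # provenance label for a human-made work
        "training_consent": False,    # hypothetical flag: no consent for AI training
    }
    payload = json.dumps(manifest, sort_keys=True).encode()
    manifest["signature"] = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    return manifest


# Example: a creator attaches provenance information to an image file's bytes.
if __name__ == "__main__":
    media = b"...raw image bytes..."
    print(json.dumps(create_provenance_manifest(media, "Jane Artist", b"secret-key"), indent=2))
```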

Key Provisions of the COPIED Act

The COPIED Act outlines several specific provisions designed to enhance transparency, protect creators, and curb the misuse of AI in content creation. These include:

Content Provenance Requirements

Creators can add a digital watermark to their work that cannot be removed or altered except for specific security reasons. This watermarking, referred to as “content provenance information,” serves as proof of originality, helping to distinguish between original and AI-altered media.
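
To illustrate why such a marking can serve as proof of originality, the following sketch verifies a manifest of the simplified kind shown in the earlier example: if the media bytes are altered, the recorded hash no longer matches, and if the manifest itself is edited, the signature check fails. The format and HMAC signing are carried over from that sketch and remain assumptions, not the Act’s mandated design.

```python
import hashlib
import hmac
import json


def verify_provenance(media_bytes: bytes, manifest: dict, signing_key: bytes) -> bool:
    """Return True only if the manifest is intact and still matches the media.

    Illustrative only: assumes the simplified HMAC-signed manifest format from
    the earlier sketch, not an actual NIST-standardised scheme.
    """
    claimed = dict(manifest)
    signature = claimed.pop("signature", "")
    payload = json.dumps(claimed, sort_keys=True).encode()
    expected = hmac.new(signing_key, payload, hashlib.sha256).hexdigest()
    if not hmac.compare_digest(signature, expected):
        return False  # the manifest was edited or signed with a different key
    # Re-hash the media itself: any alteration to the content breaks this match.
    return claimed.get("content_hash") == hashlib.sha256(media_bytes).hexdigest()
```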

Restrictions on AI Training Data 

The Act prohibits any work marked with provenance information from being used as training data for AI models, or to generate synthetic outputs, without the creator’s permission. This measure aims to protect copyrighted materials from being exploited by AI developers without fair compensation or acknowledgment.
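
In practice, a training pipeline that wanted to honour this restriction might check each candidate file for provenance information and an explicit consent flag before adding it to a dataset. The sketch below is a hypothetical filter built on the simplified manifest format assumed earlier; the side-car file layout and the training_consent flag are invented for illustration and are not defined by the Act.

```python
import json
from pathlib import Path
from typing import Iterator


def consented_training_files(data_dir: Path) -> Iterator[Path]:
    """Yield media files that either carry no provenance manifest or whose
    manifest explicitly grants consent for AI training.

    Hypothetical policy filter: the '<name>.provenance.json' side-car layout
    and the 'training_consent' flag are assumptions, not defined by the Act.
    """
    for media_path in sorted(data_dir.glob("*.png")):
        manifest_path = media_path.with_name(media_path.name + ".provenance.json")
        if not manifest_path.exists():
            # Unmarked content falls outside the Act's provenance-marking scheme.
            yield media_path
            continue
        manifest = json.loads(manifest_path.read_text())
        if manifest.get("training_consent") is True:
            yield media_path  # the creator has opted in
        # Otherwise skip: marked content without consent is excluded.
```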

Liability for AI Developers

Under the COPIED Act, AI developers can be held liable for using copyrighted materials marked with provenance information without the creator’s consent. This provision aims to prevent AI systems from generating content that violates creators’ copyrights or misuses their likenesses.

Transparency Requirements for AI-Generated Content

The Act mandates that AI developers enable users to label AI-generated content with provenance information, making it possible to identify synthetic content. This helps consumers and other users recognise AI-generated material, potentially reducing the spread of misleading or deceptive digital representations.
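
On the developer side, a generation tool aiming to meet such a labelling requirement could attach a provenance record marking its output as synthetic at the moment the content is produced. The sketch below is a minimal illustration using the same assumed manifest fields; the generator name and prompt field are hypothetical, not part of the bill.

```python
import hashlib
import json
from datetime import datetime, timezone


def label_generated_output(output_bytes: bytes, model_name: str, prompt: str) -> dict:
    """Attach a provenance record marking a generated file as synthetic.

    Hypothetical illustration: the field names follow the simplified manifest
    used in the earlier sketches, not a format specified by the COPIED Act.
    """
    return {
        "content_hash": hashlib.sha256(output_bytes).hexdigest(),
        "ai_generated": True,              # explicit synthetic-content label
        "generator": model_name,
        "prompt_summary": prompt[:80],     # a short trace of how the content was made
        "created_at": datetime.now(timezone.utc).isoformat(),
    }


# Example: an image-generation tool labels the bytes it has just produced.
record = label_generated_output(b"...generated image bytes...", "example-model-v1", "a watercolour of a lighthouse")
print(json.dumps(record, indent=2))
```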

Implications for Copyright Law and Fair Use

One of the most debated aspects of the COPIED Act is its impact on existing copyright law and the fair use doctrine. Traditionally, copyright law has permitted certain uses of copyrighted material for purposes such as education, commentary, and research, under fair use principles. However, the COPIED Act’s restrictions on AI training data and synthetic content generation could limit this flexibility.

By making it unlawful to use content with provenance information for AI training, the Act challenges the conventional application of fair use. Courts are currently evaluating whether using copyrighted content to train AI models is permissible under fair use, and the COPIED Act’s prohibition could pre-empt those arguments by setting a legal standard against using marked content in AI model training without permission. Critics also contend that such a standard could raise First Amendment concerns around freedom of expression.

Concerns and Criticisms of the COPIED Act

While the COPIED Act has garnered support, it has also faced criticism from various stakeholders concerned with its implications on free speech, innovation, and fair use. Some of the primary concerns include:

  1. Impact on Free Speech and Creative Expression: Critics argue that the Act’s restrictions on modifying provenance information could discourage forms of expression that rely on transformative use, such as parody and satire. By prohibiting any alterations to content provenance, the Act may restrict artists and creators from repurposing existing works for new, socially relevant expressions.
  2. Potential Barriers to AI Innovation: The AI industry depends on large, diverse datasets to train and improve machine learning models, and the COPIED Act’s limits on training data could reduce the availability of such datasets. Because effective model development requires large volumes of data, this restriction could slow innovation and limit AI’s growth in creative and commercial applications.
  3. Legal Uncertainty in AI Development: The COPIED Act creates a degree of legal ambiguity for AI developers who rely on copyrighted materials. Increased liability could discourage companies from developing new AI technologies, particularly as they face potential legal action if they inadvertently use content with provenance information.
  4. Potential Prematurity in Legislation: Some critics argue that the COPIED Act may be premature, as legal battles surrounding AI and copyright are still unfolding. With the U.S. Copyright Office recently highlighting AI-related copyright concerns and ongoing lawsuits involving AI companies, opponents believe it may be wiser to wait for further legal clarity before establishing new legislation. A more measured approach could help ensure that the Act aligns with evolving legal interpretations and industry practices.

Benefits of the COPIED Act

Despite the criticisms, the COPIED Act has clear benefits for content creators, consumers, and the creative industry:

  1. Protecting Creative Rights: By giving creators control over their work and likeness, the Act addresses longstanding concerns around copyright and digital ownership in an age where AI-generated content is becoming increasingly realistic and widely available.
  2. Combating Misinformation and Misuse: The Act provides tools for tracing content back to its source, reducing the potential for deepfakes and other forms of AI-generated disinformation that can be used to mislead the public.
  3. Fostering Consumer Awareness: By making content provenance accessible, the COPIED Act empowers consumers to verify the authenticity of digital media. This transparency helps maintain trust in online content and protects individuals and entities from the harms of digitally altered media.
  4. Promoting Ethical AI Development: By enforcing standards around synthetic content and provenance, the Act encourages responsible use of AI in content creation, promoting a balance between technological advancement and respect for intellectual property rights.

Conclusion

The COPIED Act of 2024 represents an important legislative effort to address the challenges posed by AI in creative fields. As lawmakers continue to debate its provisions, the Act’s impact on copyright law, freedom of speech, and AI innovation will be central to shaping the future of digital content. Whether the Act ultimately passes or prompts further legislative refinement, it has already underscored the urgent need for clarity in the laws governing AI, originality, and digital ownership. The journey of the COPIED Act through Congress will undoubtedly play a pivotal role in defining the rights and responsibilities of creators and AI developers alike in an increasingly AI-driven world.