Aiming to combat the rise of harmful deepfakes and Artificial Intelligence (AI)-generated content, United States (U.S.) lawmakers introduced a bill last July 11 that would require labels for synthetic content on online platforms to protect artists’ creative rights.
The bill, dubbed the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act) and authored by U.S. Senators Maria Cantwell, Marsha Blackburn, and Martin Heinrich, aims to put journalists, actors, and artists ‘back in control’ in the fight against AI-driven theft.
“The COPIED Act will also put creators, including local journalists, artists, and musicians, back in control of their content with a provenance or origin and watermark process that I think is very much needed,” Senator Cantwell said in the bill’s press release.
The bill requires the National Institute of Standards and Technology (NIST), in consultation with the U.S. Patent and Trademark Office (USPTO) and the U.S. Copyright Office, to develop guidelines and standards for watermarking, content provenance information, evaluation, testing, and cybersecurity protections.
The Act will require developers of AI tools that generate synthetic content to allow users to attach provenance information to their content and to prohibit the removal, alteration, tampering with, or disabling of that information.
“The COPIED Act takes an important step to better defend common targets like artists and performers against deepfakes and other inauthentic content,” Senator Blackburn said.
The bill gives journalists, actors, and artists such as songwriters the ability to protect their work and set the terms of use for their content, including compensation.
Additionally, the bill clarifies the authority of the Federal Trade Commission (FTC) and state attorneys general to enforce its provisions.