Senators propose COPIED Act for better AI watermarking

A bipartisan group of senators has introduced a new bill that would protect artists and journalists from having their work taken by AI models without consent and would make it easier to authenticate and identify AI-generated content.

Under the Content Origin Protection and Integrity from Edited and Deepfaked Media Act (COPIED Act), the National Institute of Standards and Technology (NIST) would be tasked with developing standards and guidance for proving the origin of content and detecting synthetic content, for example through watermarking. The bill would require AI tools used for creative or journalistic content to let users attach provenance information to that content and would prohibit that information from being removed; it also directs the agency to develop security measures to prevent tampering. Content carrying such provenance information could not be used to train AI models, according to the bill.

Content owners, including newspapers, artists, and broadcasters, could sue companies they believe have altered authentication markers or used their works without consent. The Federal Trade Commission and state attorneys general could also enforce the bill.

The bill is the latest in a string of Senate efforts to understand and regulate AI. Senate Majority Leader Charles Schumer (D-NY) oversaw an effort to develop an AI roadmap for the chamber, though he made clear that new legislation would be drafted in individual committees. The COPIED Act has the advantage of a powerful committee leader as a backer in Senate Commerce Committee Chair Maria Cantwell (D-WA), and it is co-led by Commerce Committee member Marsha Blackburn (R-TN) and Senate AI Working Group member Martin Heinrich (D-NM).

Several publishing and artists' groups, including SAG-AFTRA, the Recording Industry Association of America, the News/Media Alliance, and the Artist Rights Alliance, released statements praising the bill's introduction.

"The ability of AI to produce stunningly accurate digital representations of performers poses a real and present threat to the economic and reputational well-being and self-determination of our members," SAG-AFTRA national executive director and chief negotiator Duncan Crabtree-Ireland said in a statement. "To safeguard everyone's fundamental right to manage the use of their face, voice, and persona, we require a completely transparent and accountable supply chain for generative artificial intelligence and the content it generates."