California governor Gavin Newsom signs laws to crack down on election deepfakes created by AI
The Hindu
California Governor Gavin Newsom signed three bills Tuesday to crack down on the use of artificial intelligence to create false images or videos in political ads ahead of the 2024 election.
A new law, set to take effect immediately, makes it illegal to create and publish election-related deepfakes within 120 days before Election Day and 60 days after. It also allows courts to stop distribution of the materials and impose civil penalties.
“Safeguarding the integrity of elections is essential to democracy, and it’s critical that we ensure AI is not deployed to undermine the public’s trust through disinformation, especially in today’s fraught political climate,” Newsom said in a statement. “These measures will help to combat the harmful use of deepfakes in political ads and other content, one of several areas in which the state is being proactive to foster transparent and trustworthy AI.”
Large social media platforms will also be required to remove the deceptive material under a first-in-the-nation law set to take effect next year. Newsom also signed a bill requiring political campaigns to publicly disclose if they are running ads with materials altered by AI.
The governor signed the bills to loud applause during a conversation with Salesforce CEO Marc Benioff at an event hosted by the major software company during its annual conference in San Francisco.
The new laws reaffirm California’s position as a leader in regulating AI in the U.S., especially in combating election deepfakes. In 2019, the state became the first in the U.S. to ban manipulated videos and pictures related to elections. Technology and AI measures proposed by California lawmakers have served as blueprints for legislators across the country, industry experts said.
With AI supercharging the threat of election disinformation worldwide, lawmakers across the country have raced to address the issue over concerns the manipulated materials could erode the public’s trust in what they see and hear.