Recently, California introduced three new laws to tackle the issue of deepfake videos. However, the new legislation faces a legal challenge from a content creator who makes parody videos.
Who introduced the new laws?
California Governor Gavin Newsom signed the bills into law last week. The new provisions build on a separate law Newsom signed five years ago, which makes it illegal to maliciously share deceptive audio or video media that tries to discredit a candidate in the run-up to an election. The new laws are intended to prevent deepfake content that could affect the upcoming presidential election.
What are the new laws?
The first new law expands the run-up period defined in the previous legislation from 60 days to 120 days before an election. Another of the new laws, the Defending Democracy from Deepfake Deception Act, means online platforms must block users from posting election-related content that is “materially deceptive”. In addition, the third new law requires that AI-generated, or “substantially altered,” content in electoral adverts includes a disclosure.
Who has challenged the new laws?
The lawsuit to challenge California’s new laws has been filed by a content creator who makes parody videos. Their deepfake videos feature Vice President and Democratic presidential nominee Kamala Harris. One of these videos was previously shared by Elon Musk on X (formerly Twitter) back in July. As a result, Governor Newsom posted, “Manipulating a voice in an ‘ad’ like this one should be illegal. I’ll be signing a bill in a matter of weeks to make sure it is.”
What is the challenge?
The content creator has challenged two of the new laws in a lawsuit that was filed this week. They're contesting the provision that allows people to sue for damages over election deepfakes and the requirement for online platforms to remove deceptive content. The content creator also claims that the laws censor free speech and allow anybody to take legal action over content they dislike.
How has the governor responded?
In response to the lawsuit, Governor Newsom's spokesperson, Izzy Gordon, said, "It's unclear why this conservative activist is suing California. This new disclosure law for election misinformation isn't any more onerous than laws already passed in other states, including Alabama." The governor's office also stated that the new laws don't ban parody videos. However, they do require that deepfake content be clearly marked to show that it has been altered by AI.
What next?
The attorney for the content creator claimed that California's new laws "force social media companies to censor and harass people." In addition, some free speech campaigners have claimed that the laws are unconstitutional and infringe on First Amendment rights. However, Assemblymember Gail Pellerin played down these concerns. She said, "What we're saying is, hey, just mark that video as digitally altered for parody purposes. And so it's very clear that it's for satire or for parody." In addition, this week Governor Newsom signed another new law that requires election campaigns to disclose AI-generated material from 2025 onwards.
What we think
The rapid advances in generative AI have unleashed a flood of deepfake videos and audio. While this content is often harmless or humorous, there is an increasing amount of malicious material. In addition, in the run-up to the 2024 elections, there has been a rise in deepfakes featuring the candidates. While some see this as parody or satire, these videos can influence voters who don't realize what they are watching is fake. Censorship of artworks and content creation is never an issue to be treated lightly. However, manipulation of the vote in an election, including by foreign powers, is a growing concern. Many other states also have laws similar to California's. The lawyer for the content creator has already said he is going to challenge legislation passed by Minnesota. It seems that the courts will play an important part in the future of generative AI.