
California's governor signed new deepfake laws for politics and porn, but experts say they threaten free speech

Oct 10, 2019, 22:21 IST

California Gov. Gavin Newsom speaks at the California for All Emergency Management Preparedness Summit, Monday, June 3, 2019, in Sacramento, Calif. Newsom said Monday that the Defense Department has agreed to provide information from a Cold War-era military satellite to help spot wildfires, while the defense secretary also gave the California National Guard blanket approval through the year's end to use unmanned drones to map fires, count destroyed houses and spot survivors. (AP Photo/Rich Pedroncelli) Associated Press

  • California has two new laws regulating deepfakes - videos or images manipulated with artificial intelligence to make it appear as if someone has said or done something that they haven't.
  • The first law makes it illegal to post deepfakes of political candidates in the 60 days ahead of an election. It was introduced after a Nancy Pelosi deepfake went viral.
  • The second law allows state residents to sue anyone who uses a deepfake to place them in pornographic material without consent. A recent study found that more than 90% of deepfakes are pornographic and target women.
  • But civil liberties and misinformation experts have criticized both laws, saying that they are misguided, vague, subjective, and threaten free speech.

Last week, California Gov. Gavin Newsom signed two deepfake bills into state law.

The first is political, making it illegal to post manipulated videos and pictures that give a "false impression of a political candidate's actions or words" in the 60 days before an election.

The bill was introduced by Democratic Assemblyman Marc Berman after a deepfake of Nancy Pelosi went viral, in which a video of her speech was altered to make it sound like she was slurring her words.

"In the context of elections, the ability to attribute speech or conduct to a candidate that is false - that never happened - makes deepfake technology a powerful and dangerous new tool in the arsenal of those who want to wage misinformation campaigns to confuse voters," Berman said in a statement.


The law will take effect next year and includes exemptions for news outlets, satire and parody, and manipulated videos that carry clear disclaimers.

But most deepfakes aren't political. Deeptrace, a cybersecurity company, released a study of almost 15,000 deepfakes and found that more than 90% were pornographic. All of these pornographic deepfakes targeted women, making them a horrifying and prevalent new form of online harassment and revenge porn.

Accordingly, the second California deepfake law allows residents to sue anyone who uses deepfake technology to place them in pornographic material without consent. Both of these measures seem positive - but they may not have their intended effect, according to experts.

Experts say California's deepfake legislation is misguided, and threatens free speech

Claire Wardle is the executive director of First Draft, a nonprofit focused on addressing the online tactics that fuel misinformation and disinformation. Wardle has watched worries about deepfakes grow, but she isn't sure our attention is in the right place.

"I have real concerns about new legislation that focuses on the technology or techniques used to create the manipulated content," Wardle told Business Insider. "It's the impact - especially the harm that it has - that we should be focused on."


There are already laws that regulate the impact of pornographic deepfakes, including specific measures for revenge porn and digital harassment. Wardle argues that we should be using those existing laws to remedy the harm caused by deepfakes.

David Greene, the Electronic Frontier Foundation's civil liberties director, is similarly skeptical of deepfake legislation.

Greene added extortion, false light, and defamation to the list of laws that could already police deepfakes, depending on the creator's intent. Further, Greene says California's political deepfake law does not strike an appropriate balance between preventing harm and protecting free speech.

"The law is overbroad, vague, and subjective," Greene told Business Insider. "It hinges on whether the deepfake leads to a fundamentally different impression of the candidate, which is not specific enough, and could suppress speech."

Both the EFF and the ACLU wrote letters to Gov. Newsom, warning that the political deepfake law would not solve the problem and might only lead to more confusion.


Wardle and Greene also expressed concern over how the exceptions for satire and parody would be determined.

The governor's office did not immediately respond to a request for comment regarding opposition to the law. Assemblyman Berman's office also did not respond to a similar request for comment.

"As people have become increasingly concerned about the impact of disinformation, we've learned the challenges of legislating around content," Wardle told Business Insider. "It can have really worrying consequences on free speech."

Instead, Wardle and Greene agree that we need to place more emphasis on understanding the intent behind creating deepfakes.

"There has been a consensus that we should focus on the sources," Wardle said. "Who is creating the content? What are they aiming to achieve? Is it a coordinated campaign to manipulate? That's how we should think about these questions."
