Fake explicit Taylor Swift photos have politicians sounding off. But will AI laws actually change?

Jan 27, 2024, 04:53 IST
Insider
Sexually explicit and offensive AI-created images of Taylor Swift went viral on social media this week. Steve Granitz/FilmMagic/Getty Images, Harun Ozalp/Anadolu Agency via Getty Images
  • Sexually explicit AI-created images of Taylor Swift went viral on X this week.
  • It's sparked renewed calls for legislation to combat the threats posed by AI and deepfakes.

Taylor Swift is the latest celebrity to fall victim to artificial intelligence misuse, with her likeness being used in a series of sexually explicit posts that went viral on X and Telegram this week.

It's sparking renewed calls for legislation to combat the threats that AI and deepfakes pose.

Pornographic images of the pop star began circulating on the social media platform on Wednesday. Many of them portrayed Swift nude and engaging in sexual acts in a football stadium, seemingly in reference to her recent appearances at NFL games.

According to The Verge, one post, uploaded by a user who had paid for X's blue check, attracted more than 45 million views and 24,000 reposts before it was taken down by X's moderators 17 hours later.

The mass proliferation of the images has prompted discussions about the increasingly alarming spread of AI-generated content and misinformation online, with many politicians arguing that it's high time federal law is introduced to tackle the issue.


According to NBC News, since the beginning of the year, legislation has been introduced in at least 14 states to combat the issues that AI and deepfakes can create in relation to elections, which seems particularly timely given that New Hampshire voters received bogus calls claiming to be from US President Joe Biden this week.

Separately, Democratic Rep. Joseph Morelle has proposed a bill that would specifically criminalize the nonconsensual sharing of sexually explicit digitally altered material across the US.

In May 2023, Morelle introduced the Preventing Deepfakes of Intimate Images Act, which seeks to make it illegal to share deepfake pornography without consent. If passed, it would also allow victims to sue the creators and distributors of such material while maintaining anonymity.

The bill was referred to the House Judiciary Committee, but no further action has been taken in the last eight months.

Now, Morelle is just one of the voices calling for urgent action.


Posting on X on Thursday, he wrote: "The spread of AI-generated explicit images of Taylor Swift is appalling — and sadly, it's happening to women everywhere, every day."

Rep. Tom Kean Jr. — who became the first Republican co-sponsor of Morelle's bill in November after an incident affecting a high schooler in his hometown — stated that it is "clear that AI technology is advancing faster than the necessary guardrails."

He added: "Whether the victim is @taylorswift13 or any young person across our country — we need to establish safeguards to combat this alarming trend."

Others are speaking up, too, including Democratic Rep. Yvette D. Clarke, who has attempted to push through a separate Deepfakes Accountability Act that would impose regulations around the creation of AI-generated content. (That bill also hasn't made it past the first hurdle.)

"What's happened to Taylor Swift is nothing new," she posted on the app, noting that women have been victimized by AI-generated nudes for years and "advancements in AI, creating deepfakes is easier & cheaper."


Swift herself has not reacted publicly to the images, and a representative for the singer did not respond to a request for comment on the situation from Business Insider. (The Daily Mail has reported that her team is considering legal action against the site that created the AI-generated images.)

Even without commenting, Swift's position as one of the most famous women in the world has perhaps done enough to shine a light on this particularly nefarious — and increasingly common — form of sexual harassment.

According to the State of Deepfakes report published in 2023, over 95,000 deepfake videos were posted online that year, representing a 550% increase over 2019.

The report also found that deepfake pornography makes up 98% of all deepfake videos online, and women are disproportionately targeted.

Due to the lack of existing legislation, many women who have been in Swift's position have been left with little recourse after reporting the creators and distributors of their own deepfakes.


Now with Swift as its latest victim, AI pornography may finally face consequences.
