The viral AI-generated image showing an explosion near the Pentagon is 'truly the tip of the iceberg of what's to come,' tech CEO says

Jun 9, 2023, 17:41 IST
Business Insider
The Pentagon is seen from Air Force One as it flies over Washington, March 2, 2022. A wayward and unresponsive business jet that flew over the nation's capital Sunday afternoon, June 4, 2023, caused the military to scramble a fighter plane before the jet crashed in Virginia, officials said. The fighter jet caused a loud sonic boom that was heard across the capital region. AP Photo/Patrick Semansky, File
  • People need to prepare for a surge in AI-generated content being shared online, a tech CEO told CNN.
  • The viral image of an explosion near the Pentagon is just "the tip of the iceberg," Jeffrey McGregor said.

The viral AI-generated image showing an explosion near the Pentagon is "truly the tip of the iceberg of what's to come," a CEO who works in image authenticity detection has warned.

The image, which was largely thought to have been created using AI, quickly spread on social media last month and even caused the stock market to briefly dip.

"We're going to see a lot more AI generated content start to surface on social media, and we're just not prepared for it," Jeffrey McGregor, the CEO of Truepic, told CNN.

AI image-generation sites such as DALL-E, Midjourney, and Stable Diffusion have boomed in popularity over recent months. Users can prompt the sites to create artwork in the style of a particular artist, raising concerns about ownership and copyright, or to generate images of events that never happened, which has led to some deepfake images going viral, including ones showing former President Donald Trump being arrested.

Earlier this year, a photographer sparked debate about whether AI-generated images can be classed as art after an image he created using DALL-E 2 won a major international photography competition.


The photographer told Insider that judges hadn't managed to spot that the image was a fake. "It has all the flaws of AI, and it could have been spotted but it wasn't," he said.

It's not just AI-generated images that are being used to deceive people. Trolls have used voice-cloning technology to mimic the voices of celebrities including Joe Rogan, Ben Shapiro, and Emma Watson, and scammers have used the technology to trick people into handing over money they believe is going to a relative, or even to stage fake kidnappings.

Sites such as GPTZero have been developed to help detect whether text was written by AI chatbots like ChatGPT. Some professors have been putting their students' essays through AI-detection services.

"When anything can be faked, everything can be fake," McGregor told CNN. "Knowing that generative AI has reached this tipping point in quality and accessibility, we no longer know what reality is when we're online."

Ben Colman, the CEO of Reality Defender, which says it can detect AI-generated images, video, and audio, told CNN that one of the reasons that fake images were spreading online was that "anybody can do this."


"You don't need a PhD in computer science. You don't need to spin up servers on Amazon. You don't need to know how to write ransomware," he told the outlet. "Anybody can do this just by Googling 'fake face generator.'"
