- Discord has shut down an area in its chat service that was being used to share pornographic videos that had been edited using artificial intelligence technology to include images of female celebrities without their consent.
- Dozens of people were active on the server, which Discord said violated its rules on revenge porn.
- People who have doctored videos using the AI technology are also sharing them on Reddit.
Online chat provider Discord has shut down an area within its app that was being used to share pornographic videos that had been doctored using artificial intelligence technology to include the images of female celebrities without their consent.
The company closed down the area shortly after Business Insider reached out to Discord about it. Discord took it offline because the area violated its rules against non-consensual pornography, otherwise known as revenge porn, a company representative said in a statement.
"Non-consensual pornography warrants an instant shut down on the servers whenever we identify it, as well as permanent ban on the users," the representative said. "We have investigated these servers and shut them down immediately."
The user-created area was called deepfakes, and it included several channels in which users could communicate. One was reserved for general discussions, another for sharing pictures that could be used to train the AI technology, and yet another for sharing the doctored videos. About 150 Discord users were logged into the deepfakes server earlier on Friday.
The users who created the area did attempt to set some ground rules for it. The rules stated that the videos couldn't be doctored to include images of "amateurs"; instead, only images of celebrities and public figures were allowed. And the rules barred images or videos of children.
"It goes without saying that you should respect each other," the rules added. "Be polite, avoid excessive swearing, and speak English at all times."
Over the last few months, growing numbers of people have been using an AI technology called FakeApp to virtually insert images of celebrities and other people into videos in which they didn't originally appear, as first reported by Motherboard. People who are interested in the technology or who have already edited videos using it have been congregating on Discord and Reddit to exchange notes on how to use FakeApp and share their efforts.
Although the technology can be used for a variety of purposes, to date it's largely been used to place the images of famous women into adult videos without their consent.
As of publication, a Reddit community dedicated to sharing the clips remained online. A Reddit representative did not respond to a request for comment.