Instagram is developing a ‘Nudity Protection’ feature to block unsolicited nude photos in DMs
Oct 7, 2022, 11:32 IST
- Instagram is working on a new tool that detects nude images sent via direct message.
- The feature will automatically cover nude photos but give recipients the option to open them.
- It is hoped the new feature will reduce the number of 'cyber-flashing' cases.
This feature was revealed by Alessandro Paluzzi, a reliable leaker and reverse engineer on Twitter, in collaboration with The Verge. In a Twitter post, he said, “Instagram is working on nudity protection for chats,” and posted a screenshot of what users may see when opening this feature.
The screenshot posted by Paluzzi suggests that Instagram will process all images for this feature on the device, meaning that nothing will be sent to the company's servers. You can still choose to see a covered photo if you believe it comes from a trusted person. When the feature rolls out, it will be an optional setting for users who want to weed out messages with nude photos, Meta told The Verge.
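Instagram has not published any implementation details, but the on-device flow described above can be sketched roughly as follows. Everything here is an assumption for illustration: the stand-in `classify_nudity` function substitutes a fixed score for a real local ML model, and the threshold is invented.

```python
def classify_nudity(image_bytes: bytes) -> float:
    """Stand-in for an on-device model; returns a confidence in [0, 1].

    A real app would run a local neural network here, so the image
    never leaves the phone -- consistent with the reported design.
    """
    return 0.91  # hypothetical score, hard-coded for demonstration


def screen_incoming_photo(image_bytes: bytes, threshold: float = 0.8) -> dict:
    """Decide, entirely on the device, whether to cover a received photo."""
    score = classify_nudity(image_bytes)
    return {
        "covered": score >= threshold,  # blur the photo behind a warning
        "user_can_reveal": True,        # recipient may still tap to open it
    }


result = screen_incoming_photo(b"\x89PNG...")
print(result)  # {'covered': True, 'user_can_reveal': True}
```

The key design point is that both the classification and the cover decision happen locally; the server only ever delivers the (still-encrypted or unexamined) message.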
Last year, Instagram launched DM controls that enable keyword-based filters for abusive words, emojis, and phrases. The company also introduced a Sensitive Content filter that keeps nudity and graphic violence out of users' feeds.
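A keyword-based DM filter of the kind described above amounts to checking each incoming message against a user-configured block list. Instagram's actual matching rules are not public, so the logic and the sample terms below are assumptions for illustration only.

```python
# Hypothetical user-configured block list of words, phrases, and emojis.
ABUSIVE_TERMS = {"exampleslur", "🤮"}


def should_hide_message(text: str, blocked_terms: set[str] = ABUSIVE_TERMS) -> bool:
    """Hide a DM if it contains any blocked word, phrase, or emoji.

    Matching is case-insensitive substring matching -- a simplification;
    a production filter would also handle word boundaries and obfuscation.
    """
    lowered = text.lower()
    return any(term in lowered for term in blocked_terms)


print(should_hide_message("You EXAMPLESLUR!"))  # True
print(should_hide_message("hello friend"))      # False
```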
How will the ‘Nudity Protection’ tool work?
Meta said that the technology will not allow the company to view the actual messages or share them with third parties.
Social media platforms have long struggled with the problem of unsolicited nude photos. Some apps, such as Bumble, already use AI-powered tools to tackle it, while Twitter is still struggling to catch child sexual abuse material and non-consensual nudity at scale.
Last year, the Pew Research Center published a report finding that 33 percent of women under 35 had been sexually harassed online.