- Instagram is working on a new tool that detects nude images sent via direct message.
- The feature will automatically cover the nude photo but gives the user an option to open it.
- It is hoped the new feature will reduce the number of 'cyber-flashing' cases.
This feature was first revealed by app researcher Alessandro Paluzzi.
The screenshot posted by Paluzzi suggests that Instagram will process all images for this feature on the device, meaning nothing will be sent to the company's servers. Users can also choose to view a covered photo if they believe it comes from a trusted person. When the feature rolls out, it will be an optional setting for users who want to weed out messages with nude photos, Meta told The Verge.
Last year, Instagram launched DM controls that enable keyword-based filters for abusive words, emojis, and phrases. The company also introduced a Sensitive Content filter that keeps nudity and graphic violence out of the user's experience.
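To make the idea concrete, here is a minimal sketch of how a keyword-based DM filter like this might work on the client. The blocklist contents, normalization step, and function names are illustrative assumptions, not Meta's actual implementation:

```python
import re

# Hypothetical user-configured blocklist of words, phrases, and emojis.
BLOCKLIST = {"spam phrase", "abusive-word", "🤬"}

def normalize(text: str) -> str:
    """Lowercase and collapse whitespace so simple variants still match."""
    return re.sub(r"\s+", " ", text.lower()).strip()

def should_hide(message: str) -> bool:
    """Return True if the message contains any blocklisted term."""
    normalized = normalize(message)
    return any(term in normalized for term in BLOCKLIST)

# A matching message would be routed away from the main inbox,
# e.g. into a hidden requests folder.
if should_hide("This is SPAM   phrase content"):
    print("Move message to hidden requests")
```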
The new ‘Nudity Prevention’ tool will work by detecting images sent over chat that may contain nudity. It will automatically cover the image, and the recipient can decide whether to view it when they open the message.
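Meta has not published implementation details, but the flow described above maps to a simple client-side pipeline: run an on-device classifier over the incoming image, cover it if the score crosses a threshold, and only reveal it on an explicit tap. The sketch below assumes a hypothetical classifier function and threshold value; none of these names come from Instagram:

```python
from dataclasses import dataclass

@dataclass
class InboundImage:
    image_bytes: bytes
    revealed: bool = False

NUDITY_THRESHOLD = 0.8  # illustrative value, not Meta's

def nudity_score(image_bytes: bytes) -> float:
    """Hypothetical on-device classifier. In a real app this would be a
    small vision model bundled with the client, so image bytes never
    leave the phone (matching the on-device processing described above)."""
    return 0.0  # placeholder score

def render_in_chat(msg: InboundImage) -> str:
    """Decide how the client should display an incoming image."""
    if msg.revealed:
        return "show_full_image"
    if nudity_score(msg.image_bytes) >= NUDITY_THRESHOLD:
        # Cover the image by default; the recipient can tap to reveal it.
        return "show_blurred_with_tap_to_reveal"
    return "show_full_image"

def on_tap_reveal(msg: InboundImage) -> None:
    """Called when the recipient explicitly chooses to view a covered image."""
    msg.revealed = True
```

Keeping both the scoring and the reveal decision on the client is what would let Meta claim it never sees the message contents.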
Meta said that the technology will not allow it to view the actual messages or share them with third parties.
Social media platforms have long grappled with the problem of unsolicited nude photos. Some apps, like Bumble, use AI-powered tools to tackle it, while Twitter is struggling to catch child sexual abuse material and non-consensual nudity at scale.