Facebook Is Scanning Your Photos For Explicit Images Of Children, Too

Aug 8, 2014, 06:05 IST

Just like Microsoft and Google, Facebook is actively looking for explicit photos of children.


Facebook scans your pictures for this content when you upload a photo to Facebook or Instagram, using PhotoDNA, the same tool developed by Microsoft that both Microsoft and Google use, a spokesperson confirmed to Business Insider.

The spokesperson told us, "There is no place for child exploitative content on Facebook. We use PhotoDNA to check that each image which is uploaded to our site is not a known child abuse image. If a match is found then we prevent the image being uploaded to Facebook, disable the account, and flag it to NCMEC for investigation."

While we haven't heard of any arrests coming from Facebook tipping off the authorities, we know that this is entirely possible. As we've previously reported, if an online service provider discovers this material on their network, they are legally obligated to report it to the National Center for Missing and Exploited Children (NCMEC), which reviews the material and reports it to law enforcement as necessary.

Providers are under no legal obligation to look for the material, a legal expert tells us, but they have taken it upon themselves to do so.


In fact, possessing these images is so strictly illegal that online service providers do not even keep copies of NCMEC's database themselves, a source close to Facebook tells us. The technology invented by Microsoft lets them detect the pictures by looking for what are known as hashes, special digital fingerprints that indicate a picture has been labeled as illegal.

It is those hashes that the scanners are looking for, so there is almost no possibility of a mistake. If you share an innocent picture of your cute, naked baby splashing around the bathtub, it would not be in that database, would not have such a hash, and you would not be reported to the authorities.
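
PhotoDNA itself is proprietary and its perceptual-hashing algorithm is not public, so the sketch below only illustrates the general idea described above: compute a fingerprint of each upload and check it against a database of known-bad fingerprints. It uses an ordinary SHA-256 hash as a stand-in for PhotoDNA's more robust perceptual hash, and the names (KNOWN_ILLEGAL_HASHES, check_upload) are hypothetical.

```python
import hashlib

# Hypothetical stand-in for the NCMEC-derived database of known-bad
# fingerprints. Real systems use PhotoDNA's perceptual hashes, which
# survive resizing and recompression; SHA-256 is used here only to
# illustrate the lookup step.
KNOWN_ILLEGAL_HASHES = {
    "3a7bd3e2360a3d29eea436fcfb7e44c735d117c42d1c1835420b6b9942dd4f1b",  # placeholder value
}


def fingerprint(image_bytes: bytes) -> str:
    """Compute a digital fingerprint of an uploaded image."""
    return hashlib.sha256(image_bytes).hexdigest()


def check_upload(image_bytes: bytes) -> bool:
    """Return True if the upload matches a known-bad fingerprint.

    A match is what would block the upload, disable the account, and
    trigger a report to NCMEC, per Facebook's statement above. An
    innocent photo simply is not in the database, so it never matches.
    """
    return fingerprint(image_bytes) in KNOWN_ILLEGAL_HASHES


if __name__ == "__main__":
    with open("upload.jpg", "rb") as f:  # any local image file
        data = f.read()
    print("match" if check_upload(data) else "no match")
```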

Facebook's terms of service don't spell out that it's scanning photos in this way, but they do clearly warn users: "You will not post content that: is hate speech, threatening, or pornographic; incites violence; or contains nudity or graphic or gratuitous violence."

Privacy advocates remain concerned that one day service providers could abuse this ability to scan our emails, cloud services, and images. However, most of us would agree that if ever there was a good use for such tech, this would be it.
