Facebook's nudity-spotting AI mistook a photo of some onions for 'sexually suggestive' content
- A Canadian garden center had its Facebook ad for onion seeds taken down by the platform on Monday.
- Facebook said the ad was removed for breaking its rules on "products with overtly sexual positioning."
- Facebook's head of communications in Canada said Wednesday that the post had been restored, and that it had initially been removed by the platform's automated moderation systems.
- "We use automated technology to keep nudity off our apps. But sometimes it doesn't know a walla walla onion from a, well, you know."
Facebook's AI struggles to tell the difference between sexual pictures of the human body and globular vegetables.
A garden center in Newfoundland, Canada, received a notice from Facebook on Monday about an ad it had uploaded for Walla Walla onion seeds featuring a photo of the onions.
Facebook's notice said the ad broke its rules on "products with overtly sexual positioning," clarifying: "listings may not position products or services in a sexually suggestive manner."
Facebook on Wednesday told Canada's CBC News the ad had been reinstated after review. The mistake had been made by its AI moderation tech, which automatically takes down content it thinks contains nudity, it said.
"We use automated technology to keep nudity off our apps. But sometimes it doesn't know a walla walla onion from a, well, you know. We restored the ad and are sorry for the business' trouble," Meg Sinclair, Facebook Canada's head of communications told CBC.
She did not clarify what she meant by a "you know."
This is not the first time Facebook's automated systems have overzealously removed content that was later reinstated by human moderators. In 2018, its systems took down a post containing excerpts from the Declaration of Independence after flagging it as hate speech.