
Ex-Facebook engineer reportedly suggested ending People You May Know recommendations of children to adults, but he says execs refused

Dec 23, 2023, 01:44 IST
Business Insider
An ex-Facebook worker told the WSJ that the "People You May Know" feature was the main way adults found minors to target on Facebook. Harun Ozalp/Anadolu Agency via Getty Images
  • Meta reportedly rejected an idea to stop its "People You May Know" feature from recommending kids to adults.
  • An ex-Facebook worker alleged to the WSJ that the feature was a way adults found minors to target.

Meta execs reportedly refused to disable Facebook's "People You May Know" recommendation feature after one employee suggested it could be fueling child exploitation.

In 2018, David Erb, who was an engineering director at Facebook at the time, ran a team dedicated to identifying unsafe user behavior on the platform. When the team looked into inappropriate interactions between adults and minors on Facebook, it discovered that Facebook's "People You May Know" algorithm was the most popular way adults on the platform found children to target, according to a new report.

"It was a hundred times worse than any of us expected," Erb told The Wall Street Journal in a recent interview. "There were millions of pedophiles targeting tens of millions of children." Some cases included adults asking teenage girls for photos of their private parts in exchange for cash, and threats to leak nude photos, the WSJ reported.

Meanwhile, Meta execs were reportedly discussing plans to encrypt Facebook messages to keep user data private. Worried that encryption would make it harder to detect predatory behavior, Erb said he turned to his colleagues at Meta-owned WhatsApp for advice on how Facebook could combat child exploitation on the platform.

The takeaway: Facebook should limit its recommendation features.


Following the conversation, Erb said his team proposed that Facebook's "People You May Know" feature should stop recommending minors to adults. Meta execs rejected the proposal, Erb told the Journal.

Ultimately, Meta decided to move forward with encrypting Facebook messages. Erb said he was then removed from his role and resigned soon after, leaving Facebook in December 2018, according to his LinkedIn profile.

Andy Stone, a Meta spokesperson, told BI that Erb's claims are wrong.

"Former employees are entitled to their opinions, but not their own facts," Stone said in an emailed statement to BI. "The truth is, while we'd long invested in child safety efforts, in 2018 we began work to restrict recommendations for potentially suspicious adults, continued ongoing efforts removing large groups of violating accounts and even supported a successful push to update the National Center for Missing and Exploited Children reporting statute to cover many grooming situations, which previously had not been included."

Erb didn't immediately respond to Business Insider's request for comment.


The alleged internal disagreements over how to address predatory behavior on Facebook came to light after Meta announced in December that it was making end-to-end encryption the default setting on its Messenger app.

It's part of Meta's broader push to strengthen privacy.

"The extra layer of security provided by end-to-end encryption means that the content of your messages and calls with friends and family are protected from the moment they leave your device to the moment they reach the receiver's device," Meta wrote in press release. "This means that nobody, including Meta, can see what's sent or said, unless you choose to report a message to us."

Meta appears to have taken some measures to address child safety online. The tech giant created an internal tool called Masca, short for "Malicious Child Safety Actor," which detects accounts that have engaged in suspicious activity with minors and can now potentially disable them, the Journal reported. A Meta spokesperson told the Journal that the company has removed 160,000 accounts related to child exploitation since 2020.

Meanwhile, unsealed documents obtained by The New York Times claimed that Meta knowingly has millions of underage users on Instagram; 33 US states, according to the documents, claim that Meta "routinely continued to collect" children's personal information. Under US law, collecting personal data from children under 13 without parental consent is illegal.


In response to these claims, Meta previously told Business Insider that "the complaint mischaracterizes our work using selective quotes and cherry-picked documents."
