The Facebook - now Meta - exec leading the metaverse mission reportedly said VR can be 'toxic' and that it will be 'practically impossible' to moderate how people behave
- Meta's Andrew Bosworth said VR can be "toxic," especially for women and minorities, per a March memo.
- He also wrote that content moderation is "practically impossible" at a meaningful scale.
Andrew Bosworth, a Facebook veteran and the future CTO of its new parent company, Meta, questioned the technology at the heart of the company's ambitious metaverse project, according to a March internal memo obtained by the Financial Times.
Known as "Boz," the executive said virtual reality can be a "toxic environment," especially for women and minorities, the FT reported. He also said bullying and overall bad behavior can be worsened in an interactive environment like VR, which will play a key role in the metaverse of the future.
Bosworth said that toxicity would be an "existential threat" to the company's mission if it "turned off mainstream customers from the medium entirely," according to the memo, as reported by the FT. As a possible solution, he suggested a reporting function that would let users send footage of alleged harm to human reviewers.
Bosworth also told employees in the memo that he wants the company's metaverse to have "almost Disney levels of safety," per the paper. But he said moderating what people say and how they act "at any meaningful scale is practically impossible."
When asked for comment, a company spokesperson pointed Insider to a blog post published the same day that said, "we want everyone to feel like they're in control of their VR experience and to feel safe on our platform. Full stop." Bosworth later shared the post on Twitter.
The memo that the Financial Times viewed appears to be separate from the trove of leaked internal documents that the legal counsel of Frances Haugen, the former employee turned whistleblower, continues to share with the press.
Bosworth's reported comments shed light on how the executive spearheading the company's push into the metaverse perceives the very technology that will help it do so.
The Facebook social platform has long grappled with content moderation woes and with keeping hate speech and misinformation from proliferating on the site. Experts previously told Insider that the company's rebrand to Meta, made to reflect its push into the metaverse, means those problems could spread to the virtual world and be amplified if left unsolved.
Bosworth said in a 2020 memo that hate speech spreads on Facebook because there's an appetite for it, according to the leaked documents.
"As a society, we don't have a hate speech supply problem, we have a hate speech demand problem," Bosworth wrote.
The company named Bosworth, currently the head of its Reality Labs division, as Meta's CTO, a move that takes effect next year.
Bosworth has been with the company since 2006, after meeting Mark Zuckerberg at Harvard University two years earlier, and built the social platform's first AI-driven News Feed.
But he's also stirred up controversy with what company insiders have called his provocative internal comments.
In 2016, he reportedly wrote a memo called "The Ugly," which appeared to argue that the company's growth was worth any harm Facebook posed.
"Maybe it costs a life by exposing someone to bullies," he wrote in the memo, which was first reported by Buzzfeed in 2018. "Maybe someone dies in a terrorist attack coordinated on our tools. And still we connect people. The ugly truth is that we believe in connecting people so deeply that anything that allows us to connect more people more often is *de facto* good."
New York Times journalists Sheera Frenkel and Cecilia Kang named their book, "The Ugly Truth," after his remarks.