
3 members of Twitter's Trust and Safety Council quit: 'Red lines have been crossed'

Sarah Jackson   

  • Three members of Twitter's Trust and Safety Council resigned this week.
  • Under Musk's ownership, Twitter has seen a drastic rise in hate speech, two watchdog groups found.

Twitter's exodus continues, with three members of its Trust and Safety Council becoming the latest to leave.

Twitter's Trust and Safety Council, formed in 2016, consists of several dozen people and independent organizations that Twitter says help "advocate for safety and advise us as we develop our products, programs, and rules." The departing members are Anne Collier, founder and executive director of The Net Safety Collaborative; Eirliani Abdul Rahman, co-founder of Youth, Adult Survivors & Kin In Need (YAKIN); and Lesley Podesta, an advisor to the Young and Resilient Research Center at Western Sydney University.

"We are announcing our resignation from Twitter's Trust and Safety Council because it is clear from research evidence that, contrary to claims by Elon Musk, the safety and wellbeing of Twitter's users are on the decline," they wrote in a press release shared by Collier on Thursday.

They're referring to research from two watchdogs, the Center for Countering Digital Hate and the Anti-Defamation League, that recently reported a sharp increase in hate speech — including slurs against Black people and gay men, as well as antisemitic posts — on Twitter since Musk bought the platform.

"The question has been on our minds: Should Musk be allowed to define digital safety as he has freedom of expression? Our answer is a categorical 'no,'" the departing council members' release continued. "A Twitter ruled by diktat is not a place for us."

Musk has called himself a "free speech absolutist," which experts and advocates say could weaken Twitter's ability to effectively address hate speech, misinformation, and harassment.

Council members have been "mystified by the lack of communication" since Musk took charge, according to Collier. She added that several changes Musk has made since taking over have raised alarms about safety on Twitter, including his slashing of the company's outsourced content moderator positions. Twitter's new head of trust and safety, Ella Irwin, told Reuters earlier this week that the company has gotten rid of some manual reviews for content moderation, relying heavily on automation instead.

"You really need human review on a lot of abuse reports because they can be very nuanced and highly contextual to offline life, and the platforms don't really have that context," Collier said. "So it's really hard for machine learning algorithms to detect all of it or make decisions on all of it."

Of course, there's also Musk's new Twitter Blue subscription model, which allows people to buy verification on the platform for $8 a month. Previously, Twitter verified users' identities before giving them a blue check mark.

"Verification on Twitter was supposed to be about credibility and accountability, and that's usually not something you can buy. So if you just let people buy verification or credibility, you have no credibility," Collier said. "So if someone is seeing a little blue check mark or really any sort of badge of approval, the user doesn't know what that means and can't count on it."

Collier told Insider she would consider returning to the council if a Musk-owned Twitter improved its commitment to safety on the platform. But for now, as Abdul Rahman and Podesta echoed in the release, they believe that vow has been broken.

"I have watched with, dare I say, trepidation, the negotiations over Elon Musk's purchase of Twitter," Abdul Rahman said in the release. "I had written down some commitments to myself at the time. Should Musk step over those thresholds, I told myself I would resign. Those red lines have been crossed."

In a statement to Insider, Podesta said, "The safety and protection of all users was always paramount. Having policy in place was always critical - it meant that everyone knew how decisions on moderation would be assessed. That careful process seems to have broken down now. I'm deeply saddened to see the rise in racist, violent and hate speech in recent months."
