Twitter now tells users to review their tweets to check for 'harmful or offensive' content before they post

May 6, 2021, 18:56 IST
Business Insider
Twitter CEO Jack Dorsey. PRAKASH SINGH/AFP via Getty Images
  • Twitter will now prompt users to review whether their tweets are "harmful or offensive" before posting.
  • Twitter says the feature can distinguish between offensive content, sarcasm, and "friendly banter."
  • In a test, 34% of people who were prompted revised their reply or decided not to post it at all, Twitter said.

Twitter on Wednesday rolled out a new feature that prompts users to check whether their tweets are "potentially harmful or offensive" before posting them.

The company said the prompts will pop up on English-language Twitter accounts on Apple and Android devices from Wednesday.

"People come to Twitter to talk about what's happening, and sometimes conversations about things we care about can get intense and people say things in the moment they might regret later," Twitter said in a blog post.

The feature uses artificial intelligence (AI) to detect potentially harmful language in a reply a user has composed, the company said. Before the user presses send, an alert pops up asking them to review what they've written. The user can then edit, delete, or send the reply.

The social media platform began testing the feature in May 2020 to see whether it would make users pause and reconsider what they were posting online.

Twitter said in the blog post that 34% of people who received a prompt went on to revise their initial reply or decided not to post it at all. Users in the test wrote 11% fewer offensive replies after being prompted for the first time, Twitter said.

Users were also less likely to receive offensive and harmful replies back, Twitter added.

In early testing, Twitter's systems struggled to tell the difference between offensive content and jokes between friends. But Twitter said that the feature can now distinguish between "sarcasm and friendly banter" and takes into account "the nature of the relationship between the author and replier."

The news comes after English soccer clubs, players, and other athletes staged a four-day boycott of social media to protest online racist abuse aimed at Black players.

The new content prompts aren't the only feature Twitter has been trialing in an attempt to promote a more amicable environment on the platform. Last year, the company began warning users that they should read an article before posting it to "help promote informed discussion." This feature is still being tested.
