Twitter is changing some of its features to crack down on political misinformation leading up to the 2020 presidential election
- Twitter is temporarily changing a number of its features to prevent the spread of political misinformation on the platform as the 2020 presidential election draws near.
- Twitter will prompt users to post a comment on a tweet before retweeting it and will prevent tweets liked by people you don't follow from being recommended to you in your timeline, among other changes.
- The changes are designed to make people think twice before they amplify a tweet that could contain misleading information ahead of and during the November election.
Twitter will start implementing changes in an effort to limit the spread of political misinformation on the platform leading up to the November 3 presidential election.
Twitter said in a blog post it will start encouraging users to post a comment on a tweet before they can retweet it, a move designed to prompt more consideration and thought regarding the tweet's topic and to limit how easily a tweet can reach more people. The company said it will start testing this feature for some Twitter users on Friday.
Twitter will also stop surfacing tweets in your timeline that were liked by people you don't follow. "This will likely slow down how quickly Tweets from accounts and topics you don't follow can reach you, which we believe is a worthwhile sacrifice to encourage more thoughtful and explicit amplification," the company said in the blog post.
In May, Twitter said it would add warning labels to tweets that contain potentially harmful misleading information. Now, the company said when someone tries to retweet a post with such a label, Twitter will point them to credible information regarding the topic before they can share it.
The company will also add further warnings and restrictions to tweets that already carry misleading-information labels when they are posted by US political figures, such as candidates and accounts run by their campaigns, or by US-based Twitter accounts with more than 100,000 followers.
"Twitter has a critical role to play in protecting the integrity of the election conversation, and we encourage candidates, campaigns, news outlets and voters to use Twitter respectfully and to recognize our collective responsibility to the electorate to guarantee a safe, fair and legitimate democratic process this November," the company said in the blog post.
The temporary changes will roll out in the coming weeks, and all will be in effect by October 20, lasting "through at least the end of election week" in early November.
The news comes as Twitter and other social media sites gear up to police misinformation ahead of the upcoming election. Twitter has already rolled out a series of moves to crack down on false information on its site. The platform announced it was banning political ads in late 2019 and has added election labels to the accounts of political candidates, among other changes.
Other tech companies like Facebook are also making changes to prevent the spread of misleading information. Facebook, for example, said it would stop accepting political ads starting the day after the election. It has also banned accounts, groups, and pages associated with the far-right conspiracy theory QAnon.
One of Twitter's most active users, President Donald Trump, clashed with Twitter after the company began cracking down on his tweets in May, adding "public interest notices" and fact-checking labels to his posts. The crackdown prompted the president and other Republicans to claim, without evidence, that Twitter and other tech companies harbor an anti-conservative bias.