Facebook announces new measures to stop the spread of harmful content like misinformation and hate speech
Mar 18, 2021, 16:21 IST
Facebook has announced new measures to prevent its interest-based forums called Groups from spreading harmful content, like hate speech and misinformation.
The measures come after the social networking platform faced criticism for its groups being linked to protests that led up to the Capitol riot in the US earlier this year.
"We know we have a greater responsibility when we are amplifying or recommending content," Tom Alison, Vice President of Engineering at Facebook, wrote in a blog post on Wednesday.
These new changes will roll out globally over the coming months, Facebook said.
The social networking giant said that when a group starts to violate its rules, it will now show that group lower in recommendations, making it less likely that people will discover it.
This is similar to its approach in News Feed, where the platform shows lower-quality posts further down so fewer people see them.
"We believe that groups and members that violate our rules should have reduced privileges and reach, with restrictions getting more severe as they accrue more violations, until we remove them completely," Alison said.
"And when necessary in cases of severe harm, we will outright remove groups and people without these steps in between."
Facebook said that it will start to let people know when they are about to join a group that has "Community Standards" violations, so they can make a more informed decision before joining.
"We'll limit invite notifications for these groups, so people are less likely to join," Alison said.
"We think these measures as a whole, along with demoting groups in recommendations, will make it harder to discover and engage with groups that break our rules," Alison said.
Facebook said it will also start requiring admins and moderators to temporarily approve all posts when a group has a substantial number of members who have violated its policies or were part of other groups that were removed for breaking its rules.
This means content will not be shown to the wider group until an admin or moderator reviews and approves it.
If an admin or moderator repeatedly approves content that breaks its rules, Facebook will take the entire group down.
"They also won't be able to invite others to any groups, and won't be able to create new groups."
SEE ALSO:
Ahead of Delhi High Court's decision, Future Group appeals to the Singapore arbitrator to exclude Future Retail from the fight with Amazon
Indian spacetech startup Pixxel gets $7.3 million in funding as it announces a new product even before the launch of its first satellite
Airtel adds 3 times more subscribers than Jio while Vodafone Idea sees net additions for the first time in 15 months