Facebook says that groups dedicated to health topics are no longer eligible to appear in recommendations. The update is part of the company's latest effort to combat misinformation.
“To prioritize connecting people with accurate health information, we are starting to no longer show health groups in recommendations,” Facebook said in a statement. Facebook notes that users will still be able to search for these groups and invite others to join them, but the groups will no longer surface as suggestions.
Facebook groups, especially those dealing with health issues, have long been problematic for the company. Groups dedicated to anti-vaccine conspiracy theories, for example, have also been linked to QAnon and COVID-19 misinformation — often surfaced through the company's algorithmic recommendations. Mark Zuckerberg recently said that the platform will not remove anti-vaccine posts the way it removes COVID-19 misinformation.
As for QAnon, Facebook says it is taking an additional step to slow the spread of conspiracy theories by “reducing their content in News Feed.” The company has previously removed hundreds of groups associated with the movement, but has not completely eliminated its presence.
Facebook will also begin archiving groups that no longer have an active administrator. If the sole administrator of a group steps down, the company will suggest the admin role to certain members of the group; if no one accepts, the group will be archived. “In the coming weeks, we will start archiving groups that have been without an administrator for some time,” Facebook writes. Going forward, the company will offer admin roles to members of a group without one before archiving it.
Facebook notes that it penalizes groups that repeatedly share claims debunked by fact-checkers, and that it has removed more than a million groups in the past year for repeated or serious violations of its rules. But critics have long argued that Facebook is not doing enough on this front, pointing to groups that have been linked to misinformation, harassment, and threats of violence.
Meanwhile, the social network has long struggled to combat viral misinformation about the pandemic. Facebook recently took another step to make false information about the coronavirus harder to spread: it has started displaying a pop-up whenever a user shares a link to COVID-19 content.
The notification includes a link to Facebook's coronavirus information center, along with details about the age of the article and when it was first shared. Facebook rolled out a similar update last month, which shows comparable warnings when users share older news articles.
Facebook hopes this will slow the spread of outdated or less credible information. “The notification will help people understand how recent the content is, as well as its source, before they share it,” Facebook wrote in an update.