Facebook will send notifications to people who have liked, shared or commented on a post that has been taken down for including COVID-19 misinformation.
A new report from Fast Company reveals that the social media giant is changing how it’s notifying users who have engaged with misinformation.
If a user has interacted with a post that was later taken down for violating Facebook’s policies, they will be sent a notification telling them that the content was removed. Clicking the notification brings up an explanation of why the post was removed.
The landing page for the notification will also include links to credible resources, an attempt on Facebook’s part to set the record straight. Users will also be presented with actions they can take, such as unfollowing the group that posted the misinformation.
This latest move is an expansion of Facebook’s current efforts to crack down on COVID-19 misinformation. Prior to this feature, the social media giant displayed a banner on users’ news feeds if they interacted with misinformation that was later removed.
A Facebook product manager told Fast Company that this new approach aims to be more direct in informing users, as the banner feature had left some confused.
Although this new feature should help curb the spread of misinformation, it may be arriving rather late in the pandemic. It’s also worth noting that the notifications don’t actually debunk the false claims themselves.
Source: Fast Company