Much like other platforms, Facebook is taking steps to help the general population stay better informed about the Coronavirus outbreak. The platform announced it will remove content that poses a potential health risk to users, stating it will:
“Start to remove content with false claims or conspiracy theories that have been flagged by leading global health organizations and local health authorities that could cause harm to people who believe them.”
Considering Facebook’s usual stance on user privacy, political ads, and misinformation (spoiler alert: far from ideal), a global health crisis seems to be where the platform finally draws the line and steps in to take measures to protect its users’ safety.
More specifically, Facebook’s purge is centred around misleading content that could cause serious harm, including false diagnoses and treatments:
“We’re focusing on claims that are designed to discourage treatment or taking appropriate precautions. This includes claims related to false cures or prevention methods — like drinking bleach cures the coronavirus — or claims that create confusion about health resources that are available.”
The measures are not limited to Facebook but will also be implemented on Instagram, as the company will:
“Block or restrict hashtags used to spread misinformation on Instagram, and are conducting proactive sweeps to find and remove as much of this content as we can.”
This is great news. Facebook seems to be recognizing this type of content as harmful, a break with past policy that downplayed the real-world danger of fake news in other cases, most notably those related to anti-vaxxers.
We can only hope this serves as a precedent for Facebook to make some much-needed revisions to its community policies.
Read more here.