And here we go again! Locking the door after the horse is out, has leapt over the paddock fence, and is heading off to Montana (or wherever it is that horses roam):
Facebook says it will take tougher action during the pandemic against claims that vaccines, including the COVID-19 vaccine, are not effective or safe.
Why it matters: It’s a partial reversal from Facebook’s previous position on vaccine misinformation. In September, Facebook CEO Mark Zuckerberg said the company wouldn’t target anti-vaccination posts the same way it has aggressively cracked down on COVID misinformation.
How exciting! Now that the damage is done, what does Facebook plan to do next?
Details: Facebook is doing four things to crack down on misinformation about COVID-19 and vaccinations in general, following consultation with the World Health Organization:
- Updating misinformation policies to bar the posting of debunked claims about the vaccines, like the idea that vaccines are not effective or cause autism. Groups, pages and accounts on Facebook and Instagram that repeatedly share these debunked claims may be removed altogether, the company said.
- Adding directions on how and where to get vaccinated to its COVID-19 information center, tapping information provided by users’ local health officials.
- Giving $120 million in ad credits to help health agencies, NGOs and UN agencies reach billions of people with information about the COVID-19 vaccine and preventive health.
- Making it harder to find unchecked vaccine misinformation on its platforms by returning validated vaccine info when users search for terms related to debunked claims.
It’s a good start, but it’s a year late. Pinterest(!) stopped misinformation on its platform almost overnight at the beginning of the pandemic by essentially blacklisting search terms.