Facebook is finally banning vaccine misinformation

Almost a year into the Covid-19 pandemic, Facebook is taking its strictest stance yet against vaccine misinformation by banning it entirely. The ban covers misinformation about all vaccines, not just Covid-19 vaccines: posts claiming that vaccines cause autism, for instance, or that measles can’t kill people, are no longer allowed on Facebook. At the same time, the platform will encourage Americans to get inoculated, directing people to information about when it’s their turn for a Covid-19 vaccine and how to find an available dose.

These moves, part of a broader push by the company, are significant because Facebook, with nearly 3 billion users, is one of the most influential social media networks in the world. And as inoculations roll out globally, many public health experts are concerned that misinformation — including misinformation on Facebook — could exacerbate some people’s refusal or hesitancy to get vaccinated.

In a blog post published on Monday, Facebook explained that these changes are part of what it’s calling the “largest worldwide campaign” to promote authoritative information about Covid-19 vaccinations. The effort is being developed in consultation with health authorities like the World Health Organization, and will include elevating reputable information from organizations like the United Nations and various health ministries. (Facebook has published the full list of banned vaccine claims, which was developed with the help of health authorities.) The overall approach seems similar to Facebook’s US voter registration initiative, which the company claims helped sign up several million people to participate in the November election.

“A year ago, Covid-19 was declared a public health emergency and since then, we’ve helped health authorities reach billions of people with accurate information and supported health and economic relief efforts,” wrote Kang-Xing Jin, Facebook’s head of health, on Monday. “But there’s still a long road ahead, and in 2021 we’re focused on supporting health leaders and public officials in their work to vaccinate billions of people against Covid-19.”

A big caveat to the new policy: just because Facebook’s guidelines about vaccine misinformation are changing doesn’t mean vaccine misinformation won’t end up on the site anyway. Changing rules and enforcing them are two different things. Despite Facebook’s earlier rules banning misinformation specifically about Covid-19 vaccines, images suggesting that coronavirus inoculations came with extreme side effects were still able to go viral on the platform, and some racked up tens of thousands of “Likes” before Facebook took them down.

A Facebook spokesperson told Recode the company will enforce its expanded rules as it becomes aware of content that violates them, whether that content was already posted or is posted in the future. The spokesperson did not say whether Facebook is increasing its investment in content moderation to match the rules’ broader scope, but told Recode that expanding enforcement will take time, since the company needs to train its content moderators and systems.

Still, Monday’s changes are significant because Facebook CEO Mark Zuckerberg, who has repeatedly defended principles of free expression, now says the company will be paying particular attention to pages, groups, and accounts on both Facebook and Instagram (which Facebook owns) that regularly share vaccine misinformation, and may remove them entirely. It’s also adjusting search algorithms to reduce the prominence of anti-vax content.

As with other enforcement actions Facebook has taken — on everything from the right-wing, anti-Semitic QAnon conspiracy theory to incitements of violence posted by Donald Trump — critics say the company’s move comes too late. “This is a classic case of Facebook acting too little, too late,” Fadi Quran, a campaign director at the nonprofit Avaaz who leads its disinformation team, told Recode. “For over a year Facebook has sat at the epicenter of the misinformation crisis that has been making this pandemic worse, so the damage has already been done.” He said that at this point, much more needs to be done to address users who have already seen vaccine misinformation.

Facebook’s announcement comes as major technology platforms wrestle with their role in the Covid-19 crisis. Back in the fall, experts warned that social media platforms were walking a delicate line when it comes to the global vaccine effort: While social networks should promote accurate information about Covid-19 inoculations, they said, platforms must also leave room for people to express honest questions about these relatively new vaccines.

“We have a new virus coupled with a new vaccine coupled with a new way of life — it’s too much newness to people,” Ysabel Gerrard, a digital sociologist at the University of Sheffield, told Recode at the time. “I think the pushback against a Covid-19 vaccine is going to be on a scale we’ve never seen before.”

How well Facebook will enforce its new rules, or how many people the platform will help get vaccinated, remains unclear. The changes it announced on Monday come after experts have repeatedly warned about Facebook’s role in promoting anti-vaccine conspiracy theories. For years, researchers have flagged Facebook as a platform where wrong and misleading information about vaccines — including the idea that vaccines can be linked to autism — has proliferated.

Open Sourced is made possible by Omidyar Network. All Open Sourced content is editorially independent and produced by our journalists.
