
YouTube bans vaccine misinformation

The platform is also banning prominent anti-vaccine activists

Illustration by Alex Castro / The Verge

In a new attempt to stem the flow of anti-vaccine misinformation, YouTube said Wednesday that it won’t allow videos that claim vaccines approved by health authorities are dangerous or don’t work. The platform is also banning prominent anti-vaccine accounts, including Joseph Mercola’s channel and the Robert F. Kennedy Jr.-linked Children’s Health Defense Fund.

YouTube pulled ads from anti-vaccination content in 2019, and said in October 2020 that it would remove videos that pushed misinformation around COVID-19 vaccines. The new policy expands to block misinformation around other vaccines, including the flu shot, the HPV vaccine, and the measles, mumps, and rubella (MMR) vaccine. Videos that inaccurately claim that the MMR vaccine causes autism or that the flu shot causes infertility, for example, will not be allowed under the new policy.

There are some exceptions: YouTube will still allow videos in which people share their personal experiences with vaccination, though it will remove that content if the channels it appears on “demonstrate a pattern of promoting vaccine misinformation.” The guidelines also say the platform will allow videos containing information that violates the policy if the video includes additional context, such as statements from medical experts.

Along with the new policy, YouTube is also terminating the channels of major anti-vaxxers, a YouTube spokesperson confirmed to The Verge. Those include Joseph Mercola, the Children’s Health Defense Fund, Erin Elizabeth, and Sherri Tenpenny. Channels for two other major figures, Rashid Buttar and Ty and Charlene Bollinger, were terminated a few months ago, the spokesperson said.

Those anti-vaccine figures are all part of the “Disinformation Dozen,” a group identified by the Center for Countering Digital Hate as responsible for the bulk of misleading claims about vaccines on social media.

YouTube expanded its vaccine policies after noting that misinformation around all vaccines could contribute to mistrust around the COVID-19 vaccine, Matt Halprin, YouTube’s vice president of global trust and safety, told The Washington Post. Over the past few months, the backlash to COVID-19 vaccinations has been expanding to target other vaccines: the Tennessee Department of Health temporarily suspended outreach around childhood vaccinations this summer, and a Florida state senator said he wants to “review” school vaccination requirements.

Facebook similarly expanded its vaccine misinformation policy in February to block claims that the shots are dangerous.

Correction September 30th, 8:38AM ET: A previous version of this article misidentified the Robert F. Kennedy Jr.-linked fund as the Children’s Defense Fund. It is the Children’s Health Defense Fund. We regret the error.