Exclusive: YouTube removed 30,000 videos with COVID misinformation
Published Date: 3/11/2021
Source: axios.com

Over the last six months, YouTube has taken down more than 30,000 videos that made misleading or false claims about COVID-19 vaccines, YouTube spokesperson Elena Hernandez said, marking the first time the company has released figures for such content.

Why it matters: Multiple polls show that roughly 30% of Americans remain hesitant about or suspicious of the vaccines, and many of those doubts have been stoked by online falsehoods and conspiracy theories.

What's happening: Videos spreading misinformation about COVID-19 vaccines continue to appear online even as more Americans get vaccinated.

  • Platforms, including Facebook and Twitter, have rolled out policies to reduce the spread and reach of such content, but it's an ongoing challenge.

Background: YouTube added vaccine misinformation to its COVID-19 medical misinformation policy in October 2020.

  • Since February 2020, YouTube has taken down more than 800,000 videos containing coronavirus misinformation. The videos are first flagged by either the company's AI systems or human reviewers, then receive another level of review.
  • Under the policy, videos are removed if they contradict expert consensus on the vaccines from health authorities or the World Health Organization.
  • Accounts that violate YouTube's rules are subject to a "strike" system, which can result in accounts being permanently banned.

Our thought bubble: Platforms are eager to share data about the volume of misinformation they catch, and that transparency is valuable. But the most valuable data would tell us the extent of misinformation that isn't caught.