YouTube said Wednesday it was changing its policy to prevent the spread of misinformation about all vaccines — not just COVID-19 vaccines.
In a blog post on Wednesday, the company said it will begin removing any video that falsely “claims that approved vaccines cause chronic health effects, claims that vaccines do not reduce disease transmission or contraction, or contains misinformation about the substances in vaccines.”
YouTube said the new guidelines will cover videos that promote conspiracy theories about vaccines, such as claims that shots contain tracking devices or that vaccines cause cancer, autism, or infertility.
YouTube explained that the new misinformation policies apply to videos about specific vaccines as well as statements about vaccines generally.
The company noted that there are exceptions to these guidelines: it will continue to allow videos discussing vaccines' successes and failures, as well as vaccine trials.
“We have steadily seen false claims about the coronavirus vaccines turn into misinformation about vaccines in general, and we have now reached a point where it is more important than ever to extend the work we have begun with COVID-19 to other vaccines,” the company said in the blog post.
Earlier this year, YouTube said it had deleted 30,000 videos containing misinformation about COVID-19 vaccines over a six-month period. Despite those efforts, The Washington Post reported over the summer that the platform was still struggling to contain the spread of misinformation about the virus.