YouTube on Wednesday announced a sweeping crackdown on vaccine misinformation that kicked popular anti-vaccine influencers off its site and removed bogus claims that have been made about a range of vaccinations – and, briefly, an area school board meeting. A Jersey Shore video was among those initially taken down.
Jersey Shore Area School District Superintendent Brian Ulmer noted that the district learned last week that its September 13 board meeting had been deleted from YouTube, but it has since been restored on the platform.
“The Jersey Shore Area School District did not remove the video,” he stated. “On the administrative side of our account, there is a warning that says ‘your content has been removed due to a violation of our community guidelines.’ The specific policy violation is indicated as ‘medical misinformation.’ When reviewing the medical misinformation section of the community guidelines, it says that ‘YouTube does not allow content that broadcasts false medical information that contradicts World Health Organization medical information or authorities on COVID-19, including methods to prevent, treat or diagnose COVID-19 and means of transmission of COVID-19.’”
The Montoursville Area School District, the other district in Lycoming County to post meetings on YouTube, did not return calls asking whether its videos were affected.
The video-sharing platform said it would no longer allow users to speculate baselessly that approved vaccines, like those given to prevent influenza or measles, are dangerous or cause disease.
YouTube’s latest attempt to stem a wave of vaccine misinformation comes as countries around the world struggle to convince a vaccine-hesitant public to accept the free vaccinations that scientists say will end the COVID-19 pandemic, which began 20 months ago. The tech platform, which is owned by Google, previously moved to ban misinformation about the COVID-19 vaccine last year, at the height of the pandemic.
“We have regularly seen false claims about coronavirus vaccines turn into misinformation about vaccines in general, and we are now at a point where it is more important than ever to expand the work we started with COVID-19 to other vaccines,” YouTube said in a blog post.
Until Wednesday, anti-vaccine influencers, some with thousands of subscribers, had used YouTube to stoke fears about vaccines that health experts say have been administered safely for decades. The YouTube channel of an organization run by environmental activist Robert F. Kennedy Jr. was one of several popular anti-vaccine accounts that disappeared Wednesday morning.
In an emailed statement to The Associated Press, Kennedy criticized the ban: “There is no example in history where censorship and secrecy have advanced democracy or public health.”
YouTube declined to provide details on the number of accounts deleted in the crackdown.
As part of its new policy, YouTube says it will remove misinformation about any vaccine that is approved by health authorities, such as the World Health Organization, and currently being administered. False claims that these vaccines are dangerous or cause health problems such as cancer, infertility or autism – theories scientists discredited decades ago but that have endured online – will also be removed.
“The concept that vaccines harm – instead of helping – is at the root of a lot of misinformation,” said Jeanine Guidry, professor of media and public health at Virginia Commonwealth University School of Medicine.
She added that, if properly enforced, the new rules could prevent bad information from influencing a new parent who uses the internet to research whether or not to vaccinate their child, for example.
But, as is often the case when tech platforms announce tougher rules, gaps remain for anti-vaccine misinformation to spread on YouTube.
Claims about vaccines still being tested will still be allowed. Personal stories about vaccine reactions will also be allowed, provided they do not come from an account with a history of promoting vaccine misinformation.
Although tech companies have announced a slew of new rules regarding COVID-19 and vaccine misinformation during the pandemic, the lies have continued to find a large following on the platforms.
In March, Twitter began labeling content that made misleading COVID-19 vaccine claims and said it would ban accounts that repeatedly share such messages. Facebook, which also owns Instagram, had previously banned posts claiming that COVID-19 vaccines cause infertility or contain tracking microchips, and in February announced it would similarly remove claims that vaccines are toxic or can cause health problems such as autism.
Yet popular anti-vaccine influencers remain active on Facebook, Instagram and Twitter, where they use the platforms to sell books or videos. On Facebook and Instagram alone, a handful of anti-vaccine influencers still have a combined 6.4 million followers, according to the social media watchdog the Center for Countering Digital Hate. And misinformation about the COVID-19 vaccine has been so widespread on Facebook that President Joe Biden in July accused influencers on the platform of “killing people” with lies about the vaccine.
Other platforms have taken a harder line. Pinterest, for example, banned all vaccine misinformation even before the pandemic began. Now, users searching for vaccine content on the site are directed to authoritative websites operated by the CDC and WHO.