Social media have become one of the preeminent ways of disseminating information about vaccines. However, much of the vaccine information propagated across social media in the United States has been inaccurate or misleading.
At a time when vaccine-preventable diseases are on the rise, vaccine misinformation has become a cause of concern to public health officials.
A 2018 study showed that a lot of anti-vaccine information is generated by malicious automated programs – known as bots – and online trolls.
In a striking parallel with the 2016 presidential campaign and the 2018 midterm elections, some vaccine misinformation on American social media has been traced back to Russia.
At Saint Louis University’s Center for Health Law Studies, I monitor legal and policy responses to vaccine misinformation.
“Vaccine hesitancy” is what public health officials call the “delay in acceptance or refusal of vaccines” despite their availability.
The World Health Organization classified vaccine hesitancy as one of the top 10 threats to global health in 2019, alongside air pollution, heart disease, cancer and pandemic outbreaks caused by viruses like Ebola.
In a 2019 experiment, several journalists searched the term “vaccine” on Facebook. What came back was predominantly anti-vaccine content, even though the vast majority of parents – 91 per cent in one survey – are pro-vaccine.
One study by the Royal Society for Public Health in the UK found that 41 per cent of parents using social media reported having encountered “negative messages” related to vaccination. The figure rose to 50 per cent among parents of children younger than 5.
Memes and other eye-catching visuals can also help propagate the idea that vaccines are unnecessary or harmful, without any reference to scientific or medical data.
Anyone with access to a computer can easily spread inaccurate information about vaccines through social media.
But bots operating on social media can accomplish this at a massive scale, as they have been doing in the United States since at least 2014.
Bots account for a large percentage of online activity overall. Calculations suggest that between 40 per cent and 52 per cent of all internet traffic is automated, and a study analysing online bot activity in 2018 estimated that 20.4 per cent of bots were malicious. Researchers estimate that between 9 per cent and 15 per cent of active Twitter accounts, for instance, are run by bots rather than people.
The 2018 study mentioned above examined more than 1.7 million vaccine-related tweets posted between July 2014 and September 2017. Accounts associated with bots and trolls tweeted about vaccines at a higher rate than average users. While there are no published studies of other social media platforms, researchers have warned of similar activity on Facebook and YouTube.
In the case of Twitter, there seem to be at least two separate goals behind spreading misleading news about vaccines.
But content originating in Russia conveys both pro- and anti-vaccine messages. This is part of a broader strategy aimed at sowing discord in the US by stirring up conflict around divisive topics.
Some Russian tweets identified in the study used the Twitter #vaccinateUS hashtag. Of all the #vaccinateUS tweets that had Russian sources, 43 per cent were pro-vaccine, 38 per cent were anti-vaccine and 19 per cent were neutral.
A pro-vaccine one asked: “Do you still treat your kids with leaves? No? And why don’t you #vaccinate them? It’s medicine!” An example of an anti-vaccine one read: “#vaccines are a parents choice. Choice of a color of a little coffin.”
The US is not alone in facing increasing levels of vaccine misinformation on social media. Canada has also reported a rise in the number of online bots spreading vaccine misinformation. Moreover, as content from social media is consumed across borders, these issues are now turning into a global problem.
A 2015 study analysing vaccine-related pins on Pinterest found that the majority were anti-vaccine. In early 2019, the company decided to block all vaccine content from the platform.
Initially, the ban was absolute, regardless of the accuracy or source of the information. In late August, Pinterest announced that it would start allowing content from public health organisations, including the US Centers for Disease Control and Prevention, the American Academy of Pediatrics and the World Health Organization.
In March 2019, Facebook announced that it would take steps to diminish anti-vaccine content. The company no longer allows anti-vaccine advertising and says it is considering removing fundraising tools from anti-vaccination Facebook pages.
It no longer “recommends” anti-vaccine content and reduced the rankings of groups and pages conveying vaccine misinformation. They’re less visible, but not banned – these groups and pages are still present on Facebook.
Also in 2019, YouTube prohibited advertising on channels and videos that run anti-vaccination content. Until then, most YouTube searches for “vaccine” served up misinformation at the top of the results. Afterwards, John Oliver’s HBO episode on vaccines and similar content jumped to the top.
As I wrote this article, dozens of new tweets were added to the #vaccine hashtag on Twitter. Several, including one from an account with over 11,000 followers, conveyed an anti-vaccine message under the guise of scientific information. That account, which appears to be closely related to a previously suspended one, tweeted multiple times per hour. Less than an hour earlier, it had posted a visually blunter message asserting a false link between vaccines and autism.
But for several hours, the vast majority of the tweets on the vaccine hashtag were spreading content that is not supported by current scientific consensus.
While the latest tweets were predominantly anti-vaccination, when I sorted results by “top tweets,” a tweet from the US Department of Health and Human Services, pointing readers toward its own vaccine information page, appeared first.
With outbreaks of vaccine-preventable diseases on the rise, public health institutions like the Centers for Disease Control and Prevention have been increasing their social media presence.
Social media platforms can continue to help reduce misinformation that could further increase vaccine hesitancy in the United States and elsewhere.
As suggested by Pinterest’s approach, these tech companies can increase the amount and visibility of vaccine content from reliable sources. While it’s virtually impossible to eliminate all inaccurate posts, I believe social media can and should be redesigned to facilitate the promotion of accurate vaccine information.
(This is an opinion piece and the views expressed above are the author’s own. The Quint neither endorses nor is responsible for the same. This article was originally published on The Conversation.)