Two studies found that popular social media posts are filled with inaccuracies about science, which can damage public health amid the coronavirus pandemic, as reported by CNN.
One study found that more than one in four of the most popular YouTube videos about the novel coronavirus contained misinformation, while the other found that vaccine skeptics are gaining more engagement on Facebook.
Misinformation on YouTube
Noting that more than 70% of adults search for health information online, a team of researchers in Canada examined YouTube videos mentioning the coronavirus that were trending on a single day earlier this year.
For the Facebook study, researchers looked into users and divided them into three groups based on their opinions about vaccines: pro-vaccine, anti-vaccine, and undecided.
According to this study, although fewer people reject vaccines, there were nearly three times as many anti-vaccination groups on Facebook as pro-vaccination ones. This allowed the anti-vaccination groups to become more entwined with undecided communities, which eventually swayed some opinions.
Neil Johnson, who co-authored the study, said that while anti-vaccine groups are relatively small, they appear big online because they offer numerous arguments. Some push the idea that vaccines cause health problems, others emphasize free choice, and still others spread conspiracy theories.
Tech Times listed signs for spotting a conspiracy theory on social media, based on The Conspiracy Theory Handbook.
Signs include contradictory ideas, overriding suspicion, evil intent, immunity to evidence, and reinterpreting randomness, among others. These can guide anyone in spotting conspiracy theories, particularly those linked to the coronavirus pandemic.
Johnson also said he is "seeing people in these groups saying that they won't get a Covid-19 vaccine" and that they are seeking other options to keep themselves safe. He told CNN he hopes the study will help health authorities find new communication strategies to reach wider audiences.