The year 2020 brought the COVID-19 pandemic as well as a pandemic of misinformation.
From the first reported case in Wuhan, China, scientists have worked around the clock to gather information about this new coronavirus. In a year, we have learned a lot about the structure of the new coronavirus, how it spreads, and ways to reduce transmission.
But with new information comes misinformation. There have been many potentially dangerous theories related to COVID-19, ranging from the claim that the new coronavirus is human-made to the idea that injecting bleach or other disinfectants could protect against infection.
With the coincidental rollout of 5G technology, rumors have also linked the new technology to the new coronavirus.
Factors behind the spread of misinformation
The COVID-19 pandemic resulted in widespread lockdowns across the world in 2020. With billions stuck at home, people increasingly turned to social media, which has played a pivotal role in the spread of misinformation.
According to an October 2020 study in Scientific Reports, some social media sites, such as Gab, have a far higher proportion of articles from questionable sources circulating than other platforms such as Reddit. Engagement with the content on social media platforms also varied, with Reddit users reducing the impact of unreliable information and Gab users amplifying its influence.
Not all misinformation is shared maliciously. A July 2020 modeling study in Telematics and Informatics found people shared COVID-19 articles — even if they were false — because they were trying to stay informed, help others stay informed, connect with others, or pass the time.
One particular social media platform, Twitter, has become a double-edged sword regarding coronavirus news. A 2020 commentary in the Canadian Journal of Emergency Medicine suggests that Twitter helps rapidly disseminate new information. Still, constant bad news can result in burnout, or push users to seek out more optimistic information that may be false.
But who is more likely to share articles from dubious sources? A 2016 study in PNAS found that like-minded individuals tend to share more articles with each other, but this can lead to polarized groups when article sharing involves conspiracy theories or science news.
Sharing articles containing inaccurate information was most common among conservatives and people over the age of 65 years, suggests a 2019 study in Science Advances. That research examined fake news surrounding the 2016 United States presidential election.
To investigate how misinformation spreads worldwide, an international team of researchers explored what types of misinformation were more likely to be shared with others, and the patterns in how that misinformation spread. Their findings appear in the Journal of Medical Internet Research.
Common misinformation terms
Using the World Health Organization (WHO) website, researchers compiled a list of words falsely associated with causing, treating, or preventing COVID-19. The scientists also included “hydroxychloroquine,” even though it was not part of the WHO’s new coronavirus mythbusters page at the start of the study.
The authors focused on four misinformation topics that claimed:
- drinking alcohol, specifically wine, increases immunity to COVID-19
- sun exposure prevents the spread of COVID-19, or the virus is less likely to spread in hot, sunny areas
- home remedies may prevent or cure COVID-19
- COVID-19 spreads via 5G cellular networks.
From December 2019 to October 2020, the team used Google Trends to look at the frequency of these search terms in eight countries spanning five continents: Nigeria, Kenya, South Africa, the U.S., the United Kingdom, India, Australia, and Canada.
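For readers curious how this kind of search-interest data can be gathered, the minimal Python sketch below uses the unofficial pytrends library to query Google Trends for a handful of terms across the eight countries named above. The term list is a hypothetical subset chosen for illustration, and this is not the study authors' actual pipeline.

```python
# Illustrative sketch only: pulls relative Google Trends search interest with
# the unofficial pytrends library. Terms and countries mirror the study's
# description, but this is not the researchers' actual code.
from pytrends.request import TrendReq

# Hypothetical subset of misinformation-related search terms
terms = ["hydroxychloroquine", "5G coronavirus", "wine immunity", "sun kills coronavirus"]

# ISO codes for the eight countries mentioned in the study
countries = ["NG", "KE", "ZA", "US", "GB", "IN", "AU", "CA"]

pytrends = TrendReq(hl="en-US", tz=0)

for geo in countries:
    # Google Trends accepts up to five terms per request
    pytrends.build_payload(terms, timeframe="2019-12-01 2020-10-31", geo=geo)
    interest = pytrends.interest_over_time()  # weekly relative search volume, scaled 0-100
    print(geo)
    print(interest.drop(columns="isPartial").head())
```

The values returned are relative (each term is scaled to its own peak of 100 within the chosen country and timeframe), which is why studies of this kind compare patterns over time rather than absolute search counts.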