Merkley, Eric, and Peter Loewen. 2021. “Science and Health Misinformation in the Digital Age: How Does It Spread? Who Is Vulnerable? How Do We Fight It?” Public Policy Forum.
Misinformation on science, technology and public health poses serious challenges to society, a situation magnified by rapid advancement in communications technology and development of online social networks. As enabling as these developments have been for the sharing and dissemination of credible information, the same is true of misinformation — and there is no silver bullet to address it.
Misinformation comes from a variety of sources, exploiting the tendency of many people to accept information without evaluating its veracity and to prefer information aligned with their political beliefs. If it were benign, the prevalence of misinformation — and, similarly, fake news — could be dismissed, but exposure to misinformation is a cause of misperceptions among the general public that shape how people act politically. Nowhere is that truer than in the context of public health. Misinformation has been particularly problematic in science, technology and health policy. It preys on people’s predisposition to have strong, intuitive reactions to scientific advances while lacking the knowledge base to accurately distinguish facts from falsehoods. Fueled by misinformation, many people endorse science-related beliefs that run counter to established scientific consensus, and they are less likely to heed the advice of scientists and medical experts as a result.
While the proliferation of misinformation and fake news appears low, there is little data tracking their exposure and consumption. This report looks to answer three questions related to science communication and misinformation — How is misinformation spread? Who is most likely to fall prey to misinformation? How do we combat misinformation and its effects? — in part by highlighting case studies on climate change, vaccines and COVID-19.
Broadly speaking, there are three approaches to this problem: controlling its spread; correcting its effects through debunking (fact-checking) or persuasion; and pre-emptive interventions that allow the public to resist misinformation they encounter. With this in mind, five recommendations are presented:
- Track misinformation and debunk when needed;
- Promote accuracy-focused messaging;
- Invest in targeted persuasion focusing on downstream behaviours;
- Build relationships with trusted community leaders;
- Start early to create digital literacy and interest in science.
When taken in concert, these recommendations have the potential to mitigate the consequences of misinformation in science and public health.
Owen, Taylor, Peter Loewen, Derek Ruths, Aengus Bridgman, Haji Mohammed Saleem, Eric Merkley, and Oleg Zhilin. 2021. “Understanding Vaccine Hesitancy in Canada: Attitudes, Beliefs, and the Information Ecosystem.” Media Ecosystem Observatory.
- 65% of Canadians intend to take a vaccine, with some slight erosion since a high in July. Approximately 15% of Canadians are unwilling and 20% are unsure.
- Our best opportunity to reach those who are unsure is to address important concerns around the safety and effectiveness of the vaccine.
- Among those who do not plan to take a vaccine, many also believe that COVID-19 is not a serious threat. It will be very difficult to convince these individuals to take a vaccine. The efficacy, safety, country-of-origin, type of vaccine, and other characteristics of a hypothetical vaccine simply do not matter to this population, whereas for other Canadians these characteristics are critical for their decision to vaccinate or not.
- Canadians are increasingly talking about vaccines on social media. The overall positive sentiment that health officials have promoted regarding vaccines hasn’t taken hold in these conversations, however.
- There is minimal coverage of vaccine conspiracies in Canadian mainstream media. Instead, mainstream media coverage has focused on stories about development, provision, and access, with wide-scale vaccination highlighted as the solution to the pandemic.
- Despite this positive coverage, vaccine-related stories from independent outlets have appeared on social media that more heavily feature conspiratorial thinking and cynicism about vaccines. This type of content tends to elicit stronger and more emotional responses from Canadians, which may cause this content to spread more widely and rapidly on social media platforms.
- The vaccine conversation on social media largely originates from U.S.-based discussions. Canadians on social media are heavily influenced by U.S.-based information and are far more likely to propagate non-Canadian content. This flood of U.S.-based information represents a unique Canadian vulnerability, where Canadian elites, medical professionals, scientists, and journalists may be comparatively less able to reach and inform Canadians.
Bridgman, Aengus, Robert Gorwa, Peter Loewen, Stephanie MacLellan, Eric Merkley, Taylor Owen, Derek Ruths, and Oleg Zhilin. 2020. “Lessons in Resilience: Canada’s Digital Media Ecosystem and the 2019 Election.” Digital Democracy Project.
Researchers, policy-makers and the public at large are paying more attention to the threats that disinformation and other forms of online media manipulation pose to democratic institutions and political life. Starting with the Brexit referendum and United States election in 2016, and building through the European Parliament elections in 2019, concerns about co-ordinated disinformation campaigns organized by state or other actors, automated social media accounts (“bots”), malicious and deceptive advertising, and polarization enhanced by algorithmic “filter bubbles” have reached an all-time high.
The Digital Democracy Project (DDP) was set up to help build the international evidence base on the impact of these trends with a robust Canadian case study whose methods could be applied in other locations. The project consists of three phases. The first was a two-day workshop for journalists about disinformation threats, held in May 2019 in Toronto. The second, and the focus of this report, was researching and monitoring the digital media ecosystem in real time ahead of the Canadian federal election on Oct. 21, 2019. The third and final phase, beginning in early 2020, will involve further research and consultations with experts and public representatives to develop policy recommendations.
We launched this phase of the project in August 2019 and continued collecting data until the end of November. This work builds on the growing field of study of election integrity and, in particular, on the study of the spread and influence of media exposure (both online and offline) and of disinformation and toxic content on the behaviour of voters. Using a novel approach that combined online data analysis with a battery of representative national surveys, we sought to contextualize and better understand developing patterns of online activity with measures of media consumption, trust and partisanship.
Overall, our findings suggest the Canadian political information ecosystem is likely more resilient than that of other countries, in particular the U.S., due to a populace with relatively high trust in the traditional news media, relatively homogeneous media preferences with only a marginal role for hyperpartisan news, high levels of political interest and knowledge, and — despite online fragmentation — fairly low levels of ideological polarization overall. While we do find affective polarization, which involves how individuals feel about other parties and their supporters, we find less polarization on issues, which has been a key point of vulnerability in other international elections.
Despite some worries about automated activity being used to game trending hashtags on Twitter or the presence of a few disreputable online outlets, our research suggests their impact was limited. While there remain significant blind spots in the online ecosystem caused by limited data access for researchers, based on the communication we could see, we did not find evidence of any impact attributable to a co-ordinated disinformation campaign.
Looking forward, however, we find evidence to suggest potential future vulnerabilities, most of which are related to growing partisanship and polarization, as well as the segmentation of the populace into online information environments that reinforce existing world views.