Research in Progress: Combating the spread of fake news and misinformation on social media

May 15, 2018


By Leticia Bode, Georgetown University; Melissa Tully, University of Iowa; and Emily Vraga, George Mason University

During the 2016 United States presidential election campaign, the term “fake news” rose to prominence in popular culture and conversation. The realization that people and groups were intentionally creating, disseminating, and popularizing “news” stories that were deliberately fake, not based on facts or evidence and often drawn entirely from the creator’s imagination, struck a chord with Americans because it suggested that our democracy was under attack.

As journalists, researchers, and educators began to grapple with the growing concern that news consumption on social media was contributing to the spread of misinformation and “fake news,” calls for more and better media literacy education began to surface. The idea that we might combat lies and misinformation by becoming more media literate resonated with a public looking for solutions to a problem deeply embedded in social media news consumption.

Previous research suggests that critical thinking plays a role in receptivity to, and rejection of, misinformation. Making people less susceptible to false information would be a proactive, rather than reactive, response to fake news and misinformation. In addition, while there is a burgeoning literature on correcting misinformation in general, most research has not examined correction in the context of social media. Our project will lend insight into the best ways to correct and combat misinformation on social media. Combining news media literacy and corrective information efforts on social media brings together two areas of research and practice that are ripe for exploration.

Previous research has highlighted the difficulties both in creating effective news media literacy interventions and in correcting misperceptions once they are established. Research has shown that media literacy background and knowledge show promise in generating skepticism toward misinformation and reducing acceptance of political conspiracy beliefs.

Likewise, warning people about misinformation techniques related to climate change reduced their susceptibility to a falsely balanced news article that undermined scientific consensus. Despite these promising findings, labeling only some stories as “misinformation” can create the sense that all other stories are true, and can generate skepticism toward accurate or “real” news.

At the same time, correcting misperceptions is notoriously difficult. People tend to resist attempts to correct misperceptions that relate to their identity or values, and misperceptions can linger even after corrections are accepted. Observational correction, which occurs when people see misinformation shared by another social media user being corrected by an algorithm, an expert, or another user, appears to hold promise in mitigating false beliefs, suggesting the need for more work in this area. In short, media literacy messages have rarely been tested, and not yet optimized, for the social media landscape, nor have correction efforts and news media literacy efforts been effectively combined.

With the support of our Page & Johnson Legacy Scholar Grant, we will explore how news media literacy interventions and corrective information can combat the spread of fake news and misinformation on social media. In the next year, Page Center funds will support our work to develop and experimentally test news media literacy and corrective messages designed to be consumed on Twitter, where misinformation is known to circulate.

Not only will we test messaging, we will also test tone, specifically by creating tweets that vary the civility of the correction. This project will contribute to developing empirically validated interventions that can inform media literacy and misinformation research, education, and practice.

This project was supported by a Page/Johnson Legacy Scholar Grant from the Arthur W. Page Center.