Cognitive cascades: How to model (and potentially counter) the spread of fake news

PLoS One. 2022 Jan 7;17(1):e0261811. doi: 10.1371/journal.pone.0261811. eCollection 2022.

Abstract

Understanding the spread of false or dangerous beliefs, often called misinformation or disinformation, through a population has never seemed so urgent. Network science researchers have often taken a page from epidemiologists and modeled the spread of false beliefs as similar to how a disease spreads through a social network. Absent from those disease-inspired models, however, is an internal model of an individual's current beliefs, even though cognitive science has increasingly documented that the interaction between mental models and incoming messages is crucial to whether those messages are adopted or rejected. Some computational social science modelers analyze agent-based models in which individuals do have simulated cognition, but these often lack a key strength of network science, namely empirically driven network structures. We introduce a cognitive cascade model: a public opinion diffusion (POD) model that combines a network science belief cascade approach with an internal cognitive model of the individual agents, as in opinion diffusion models, and adds media institutions as agents that begin opinion cascades. We show that the model, even with a very simplistic belief function capturing two cognitive effects cited in disinformation research (dissonance and exposure), adds expressive power over existing cascade models. We analyze the cognitive cascade model with our simple cognitive function across various graph topologies and institutional messaging patterns. We argue from our results that population-level aggregate outcomes of the model qualitatively match what has been reported in COVID-related public opinion polls, and that the model dynamics offer insight into how to address the spread of problematic beliefs. The overall model sets up a framework with which social science misinformation researchers and computational opinion diffusion modelers can join forces to understand, and hopefully learn how best to counter, the spread of disinformation and "alternative facts."
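
To make the abstract's description concrete, the following is a minimal sketch, not the authors' implementation, of how a cognitive cascade of this kind could be simulated: agents on a social network hold beliefs on a 7-point scale, a media-institution agent seeds a message to its subscribers, each exposed agent adopts the message with a probability that falls off with the distance between the message and its current belief (a stand-in for dissonance), and adopters re-share to their neighbors while shifting their belief toward the message (a stand-in for repeated exposure). All parameter names, values, and the logistic adoption curve are illustrative assumptions.

# Minimal illustrative sketch (assumptions throughout), not the paper's model.
import math
import random
from collections import deque

random.seed(42)

N_AGENTS = 200          # population size (assumed)
EDGE_PROB = 0.03        # Erdos-Renyi edge probability (assumed)
BELIEF_LEVELS = 7       # beliefs take integer values 0..6 (assumed scale)
GAMMA, DELTA = 2.0, 1.5 # steepness / tolerance of the adoption curve (assumed)

def adoption_probability(belief, message):
    """Logistic drop-off in |belief - message|: messages far from the agent's
    current belief are rarely adopted (dissonance-like effect)."""
    return 1.0 / (1.0 + math.exp(GAMMA * (abs(belief - message) - DELTA)))

# Build a simple random (Erdos-Renyi) social network as an adjacency list.
neighbors = {i: set() for i in range(N_AGENTS)}
for i in range(N_AGENTS):
    for j in range(i + 1, N_AGENTS):
        if random.random() < EDGE_PROB:
            neighbors[i].add(j)
            neighbors[j].add(i)

# Initialize agent beliefs uniformly over the scale.
beliefs = [random.randrange(BELIEF_LEVELS) for _ in range(N_AGENTS)]

def run_cascade(message, subscribers):
    """One cascade: a media institution sends `message` to its subscribers;
    each adopter moves its belief one step toward the message (repeated
    exposure nudges beliefs) and forwards the message to its neighbors."""
    exposed = set(subscribers)
    queue = deque(subscribers)
    adopters = 0
    while queue:
        agent = queue.popleft()
        if random.random() < adoption_probability(beliefs[agent], message):
            adopters += 1
            # Exposure effect: shift belief one step toward the message.
            if beliefs[agent] < message:
                beliefs[agent] += 1
            elif beliefs[agent] > message:
                beliefs[agent] -= 1
            # Share with neighbors not yet exposed, continuing the cascade.
            for nb in neighbors[agent] - exposed:
                exposed.add(nb)
                queue.append(nb)
    return adopters

# A media institution repeatedly broadcasts an extreme message (value 6) to a
# random 10% of the population; track how the mean belief drifts.
subscribers = random.sample(range(N_AGENTS), N_AGENTS // 10)
for step in range(20):
    adopted = run_cascade(message=6, subscribers=subscribers)
    mean_belief = sum(beliefs) / N_AGENTS
    print(f"step {step:2d}: {adopted:3d} adopters, mean belief {mean_belief:.2f}")

Swapping the random graph for an empirically derived network, or varying the institution's messaging pattern over time, corresponds to the topology and messaging analyses the abstract describes.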

MeSH terms

  • COVID-19*
  • Disinformation*
  • Humans
  • Models, Theoretical*
  • Public Opinion*
  • SARS-CoV-2*
  • Social Media*

Grants and funding

We thank the Tufts Data Intensive Studies Center (DISC) and National Science Foundation grant NSF-NRT 2021874 for supporting this research. LC and NR also thank NSF CCF-1934553 for additional support.