Abstract
Recent evidence contradicts the common narrative that partisanship and politically motivated reasoning explain why people fall for 'fake news'. Poor truth discernment is linked to a lack of careful reasoning and relevant knowledge, as well as to the use of familiarity and source heuristics. There is also a large disconnect between what people believe and what they will share on social media, and this is largely driven by inattention rather than by purposeful sharing of misinformation. Effective interventions can nudge social media users to think about accuracy, and can leverage crowdsourced veracity ratings to improve social media ranking algorithms.

We synthesize a burgeoning literature investigating why people believe and share false or highly misleading news online. Contrary to a common narrative whereby politics drives susceptibility to fake news, people are 'better' at discerning truth from falsehood (despite greater overall belief) when evaluating politically concordant news. Instead, poor truth discernment is associated with lack of careful reasoning and relevant knowledge, and the use of heuristics such as familiarity. Furthermore, there is a substantial disconnect between what people believe and what they share on social media. This dissociation is largely driven by inattention, more so than by purposeful sharing of misinformation. Thus, interventions can successfully nudge social media users to focus more on accuracy. Crowdsourced veracity ratings can also be leveraged to improve social media ranking algorithms.

Fabricated news is nothing new. For example, in 1835 The Sun newspaper in New York published six articles about purported life on the moon which came to be known as the 'Great Moon Hoax'. During the 2016 US Presidential Election and UK Brexit Referendum, however, a different form of fake news (see Glossary) rose to prominence (Box 1): false or highly misleading political 'news' stories, primarily originating on social media [1]. Concern about fake news was redoubled in 2020 in the face of widespread misinformation and disinformation [2] on social media about the coronavirus disease 2019 (COVID-19) pandemic [3] and the 2020 US Presidential Election [4]. Misleading hyperpartisan news, as well as yellow journalism [5], are related forms of problematic news content that are likely sources of political polarization [6]. What is it about human psychology – and its interaction with social media [7,8] – that explains the failure to distinguish between accurate and inaccurate content online? Apart from being of theoretical interest, this question has practical consequences: developing effective interventions against misinformation depends on understanding the underlying psychology.

Box 1. Prevalence of Fake News

Various analyses of social media and web browsing data have been used in an attempt to determine the prevalence of fake news, often with a focus on the 2016 US Presidential Election. For example, using web browsing data, archives of fact-checking websites, and a survey, Allcott and Gentzkow [19] estimated that a particular set of news stories known to be false were shared on Facebook at least 38 million times in the 3 months leading up to the 2016 election (30 million of which were for news favoring Donald Trump). This estimate represents a lower bound because it only reflects that specific set of known false news.

Other analyses have focused on fake news publishers (i.e., websites) rather than on individual articles. Based on data from Twitter [117], Facebook [77,118], and web browsing [89], these studies concluded that content from known fake news sites represents a small proportion of most people's media diets, and that the average social media user was exposed to little fake news during the 2016 election.

These analyses have important limitations, however, because the only available data concern what people share and what they visit when they click through to news sites off-platform. Of course, the vast majority of the time that people are exposed to news on social media, they simply read the post without sharing it or clicking on the link to visit the source website. Furthermore, so-called 'fake news' represents only one category of misinformation, and misleading content from sources such as hyperpartisan news websites likely makes up a much larger proportion of people's media diets [6,119]. Thus, the actual on-platform exposure of the average user to misinformation remains an open question [120]. We feel it is premature to conclude that exposure rates are minimal, and thus that false and misleading news online is not a problem (see also [7,8]). This is especially true when looking beyond the 2016 election because new misinformation threats – such as false claims about COVID-19 [3,44] and fraud in the 2020 US Presidential Election [4] – have gained widespread traction through amplification by (mostly Republican) political elites.

Accordingly, exposure to fake news (and misinformation more broadly) is not equally distributed across all users. In particular, political conservatives and older adults were far more likely to visit fake news websites or share fake news articles during the 2016 Presidential Election [19,89,117,118]. Studies have also found associations between political conservatism and belief in misinformation in the USA [20,44], Chile [121], and Germany [122], but not in Hungary [24], and users who engage in less reasoning have been found to share content from lower-quality news sites on Twitter [71]. Thus, even if it were true that the average social media user was not exposed to much misinformation, exposure rates are substantially higher in subpopulations that may be particularly vulnerable to believing inaccurate content. Finally, misinformation that originates on social media sometimes reaches much larger audiences when it is picked up by traditional media outlets – either via direct repetition or via debunking (which may result in inadvertent amplification).
We focus here primarily on online content that is presented in the form of news articles. However, false and misleading claims come in many forms, and several related literatures are clearly relevant but fall outside the scope of our review (although we will draw some connections throughout). These include work on conspiracy belief [9], superstition [10], rumors [11], bullshit receptivity [12], and misperceptions [13], among others. Furthermore, our focus is on individual examples of misinformation, and not on organized disinformation campaigns (e.g., by the Russian Internet Research Agency, or campaigns relating to global warming or fraud in the 2020 US Presidential Election).

When considering the factors that may influence what people believe, it is essential to distinguish between two fundamentally different ways to conceptualize belief in true and false news. One common approach is to focus on truth 'discernment', or the extent to which misinformation is believed 'relative' to accurate content. Discernment, typically calculated as belief in true news minus belief in false news (akin to 'sensitivity' or d' in signal detection theory [14]), captures the 'overall' accuracy of one's beliefs – and thus gives insight into failures to distinguish between true and false content ('falling for fake news'). Another approach is to focus on overall belief, or the extent to which news – regardless of its accuracy – is believed (calculated as the average or sum of belief in true news and belief in false news, akin to calculating 'bias' in signal detection theory [14]). Critically, factors that alter overall belief need not impact people's ability to tell truth from falsehood [15]: increasing or decreasing belief in true and false headlines to an equivalent extent has no effect on the overall accuracy of one's beliefs (i.e., does not affect truth discernment). A minimal computational sketch of these two measures follows the next paragraph.

A popular narrative is that the failure to discern between true and false news is rooted in political motivations. For example, it has been argued that people are motivated consumers of (mis)information [16] – that they engage in 'identity-protective cognition' when faced with politically valenced content, and that this leads them to be overly believing of content that is consistent with their partisan identity and overly skeptical of content that is inconsistent with their partisan identity [17].
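To make the distinction between the two measures concrete, the sketch below computes both from made-up perceived-accuracy ratings. It is purely illustrative: the function names, the 0-1 rating scale, and the example values are our own assumptions, not materials or code from the studies reviewed.

```python
# Hypothetical sketch (not from the cited studies) of the two belief measures described above.
from statistics import mean

def truth_discernment(belief_true, belief_false):
    # Belief in true news minus belief in false news (akin to sensitivity, d').
    return mean(belief_true) - mean(belief_false)

def overall_belief(belief_true, belief_false):
    # Belief averaged over all items regardless of veracity (akin to response bias).
    return mean(list(belief_true) + list(belief_false))

# Made-up perceived-accuracy ratings (0 = judged false, 1 = judged true)
true_news  = [0.8, 0.7, 0.9, 0.6]
false_news = [0.4, 0.5, 0.3, 0.4]

print(round(truth_discernment(true_news, false_news), 3))  # 0.35: some ability to tell truth from falsehood
print(round(overall_belief(true_news, false_news), 3))      # 0.575: general tendency to believe

# Raising belief in every headline by the same amount changes overall belief
# but leaves truth discernment untouched.
boosted_true  = [b + 0.1 for b in true_news]
boosted_false = [b + 0.1 for b in false_news]
print(round(truth_discernment(boosted_true, boosted_false), 3))  # still 0.35
print(round(overall_belief(boosted_true, boosted_false), 3))     # now 0.675
```

The last two lines illustrate the point made above: shifting belief in true and false headlines to an equivalent extent alters overall belief while leaving truth discernment unchanged.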
A related theory argues that people place loyalty to their political identities above the truth – and thus fail to discern truth from falsehood in favor of simply believing ideologically concordant information [18]. These accounts contend that a strong causal influence of political motivation on belief is the dominant factor explaining why people fall for fake news.

It is clearly true that partisanship is associated with overall belief: people are more likely to believe news content that is concordant (versus discordant) with their political partisanship [19-25] (Figure 1B). It is important to note, however, that the effect of political concordance is typically much smaller than that of the actual veracity of the news [20,21,26]. In other words, true but politically discordant news is typically believed much more than false but politically concordant news – politics does not trump truth. Furthermore, greater overall belief in politically consistent news does not necessarily indicate politically motivated reasoning. Such differences could even arise from unbiased rational (e.g., Bayesian) inference built on prior factual beliefs that differ across party lines (e.g., owing to exposure to different information environments) [27-33] (see Box 2 for details).

Box 2. Challenges in Identifying Politically Motivated Reasoning

The observation that people are more likely to believe information that is consistent with their political ideology/partisanship (and are less likely to believe information that is inconsistent with it) is often taken as evidence for politically motivated reasoning [22,123,124]. Critically, however, this pattern does not actually provide clear evidence of politically motivated reasoning.
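To illustrate this point, the toy calculation below shows how two readers who apply Bayes' rule in exactly the same (unbiased) way, but who start from different prior beliefs about a claim, end up with different levels of belief in the same politically valenced headline. The priors and likelihood values are hypothetical assumptions of ours, not estimates from the cited work.

```python
# Toy illustration (our own, not a model from the cited studies): unbiased Bayesian
# updating on different priors can mimic politically motivated reasoning.

def posterior(prior, p_report_if_true=0.8, p_report_if_false=0.3):
    # P(claim is true | a headline reports it), assuming both readers agree on how
    # likely such a headline is to appear when the claim is true vs. false.
    numerator = p_report_if_true * prior
    return numerator / (numerator + p_report_if_false * (1 - prior))

# Different information environments lead to different priors about the same claim.
prior_reader_a, prior_reader_b = 0.2, 0.6

print(round(posterior(prior_reader_a), 2))  # 0.40
print(round(posterior(prior_reader_b), 2))  # 0.80
# Both readers update rationally, yet their belief in the headline differs sharply.
```

In this sketch, observed differences in belief across party lines arise without any identity-protective motivation, which is why belief in politically concordant news cannot by itself establish politically motivated reasoning.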