
Thread: Gullibility or Rational Skepticism?

  1. Post #1 by Clear Light (UK Avalon Member)

     Gullibility or Rational Skepticism?

    Oh, I've just come across this 2019 book, "The Social Psychology of Gullibility". Please see a few excerpts below, including the final chapter "Belief in Conspiracy Theories - Gullibility or Rational Skepticism?", which I'd say some here may find interesting, eh?

    [Attached image: The Social Psychology of Gullibility book cover]

    Quote: Introduction

    Gullibility as a scientific concept does not currently feature prominently in social psychology research, and one would search in vain the subject indexes of many social psychology textbooks for entries under “gullibility.” So why devote an entire book to this topic, and why do it now? The answer is twofold. First, in the past few years, and especially since Brexit, the election of Trump, and the emergence of crypto-fascist dictators in a number of countries including some inside the European Union such as Hungary (Albright, 2018), the question of human gullibility has become one of the dominant topics of interest in public discourse (see also Cooper & Avery, Chapter 16 this volume; Myers, Chapter 5 this volume). People opposed to these developments often suspect that those who voted for them must be gullible.

    Second, even though gullibility is rarely studied directly in social and cognitive psychology, these disciplines do have a great deal to contribute to our understanding of how human judgments and decisions can be distorted and undermined. In consequence, a book dealing with the social psychology of gullibility is highly topical, and as this volume demonstrates, there is a wealth of directly relevant empirical research we can draw upon to understand this phenomenon (Gilbert, 1991; Gilovich, 1991). The objective of this volume is thus to provide an integrative survey of the current state of social psychological research on human gullibility, and so offer an informative contribution towards understanding the role of gullibility in contemporary public affairs.
    Quote: What Is Gullibility?

    Gullible as a term was first recorded in 1793, derived from the earlier word “cullibility” (1728), and possibly connected to “gull,” a cant term for “dupe, sucker,” which in turn is of uncertain origin. Its etymological roots can perhaps be traced to the bird (the sea gull) or to the verb “gull” (to swallow). Some of the synonyms of gullibility, such as credulity, artlessness, ignorance, inexperience, and simplicity, also confirm its pejorative character. Consensually negative social evaluation, as we shall see later, is thus an essential component of gullibility.

    The standard definition of gullibility, as a failure of social intelligence in which a person is easily tricked or manipulated into an ill-advised course of action, confirms this view. Gullibility is closely related to credulity, which is the “tendency to believe unlikely propositions that are unsupported by evidence” (Wikipedia). Gullibility is thus a factor in social influence processes, as a person’s willingness to believe false or misleading information facilitates the influence.
    Quote: Chapter 17 - Belief in Conspiracy Theories: Gullibility or Rational Skepticism?

    Conspiracy theories are widespread in our society. Surprisingly large numbers of citizens believe allegations that the Moon landings were filmed in a TV studio, that humans created the HIV virus in the lab, and that the 9/11 terrorist strikes were an inside job of the U.S. government (e.g., Douglas, Sutton, & Cichocka, Chapter 4 this volume; Sunstein & Vermeule, 2009; van Prooijen, 2018). Conspiracy theories are commonly defined as assumptions about a group of actors that colludes in secret agreement to reach goals widely seen as evil (Bale, 2007). While conspiracy theories sometimes turn out to be true (e.g., the Iran-Contra scandal), quite often conspiracy theories are implausible in light of logic or scientific evidence, and therefore deviate from mainstream narratives. People who strongly believe conspiracy theories hence are highly skeptical of regular news sources and official readings of events, and often proclaim to be rational human beings who “just ask questions.” A qualitative analysis of interviews with citizens active in the Dutch conspiracy milieu reveals that believers actively reject the qualification “conspiracy theorist,” and prefer to see themselves as “critical freethinkers” that positively distinguish themselves from “the sheeple,” who are gullible and easily manipulated by powerholders (Harambam & Aupers, 2017). This self-perception as a critical and rational thinker is underscored by the following quote, which is drawn from a conspiracist website (www.sheepkillers.com) explicitly focused on protecting, and opening the eyes of, “the sheep” who supposedly are led astray by the powerful and immoral leaders that rule our nations: “If you think 9/11 was the result of cave-dwelling terrorists attacking our country, bringing down airplanes with box-cutters and collapsing entire buildings into their footprints, you really are a sheep.”

    How rational is the tendency to believe conspiracy theories? Looking at the specific contents of a range of conspiracy theories, one needs to acknowledge how well crafted, complex, and creative many conspiracy theories are. For instance, conspiracy theories about the 9/11 terrorist strikes often assume that not the impact of the planes but controlled demolition made the Twin Towers collapse. These theories are based on scientifically grounded arguments about the steel construction of the Twin Towers, the temperature at which steel melts (about 2,750 °F), and the maximum temperature reached by burning kerosene (about 1,500 °F). Even extremely far-fetched conspiracy theories are remarkably well designed. For instance, the flat-earth movement endorses the theory that our planet Earth is in fact flat, and that the public has been deceived for over 400 years by scientists and world governments into believing that the Earth is round (or, to be more precise, somewhat oval). Their arguments include detailed accounts of how NASA routinely manipulates or fabricates satellite pictures, testimonies of airplane pilots who report not seeing the Earth’s curvature at high altitude, and technical descriptions of how airplane windows are designed to provide a perceptual illusion of a curving Earth.

    While in the present contribution I will not seriously examine the contents of these conspiracy theories (I am comfortable asserting here that the impact of the planes and the fires that subsequently erupted caused the collapse of the Twin Towers on 9/11, and that the Earth is round albeit not perfectly so), I will seriously consider two opposing hypotheses about the social psychology of conspiracy theories. The first hypothesis is that, as suggested above, belief in conspiracy theories is grounded in a mindset characterized by rational skepticism. According to this view, people who believe conspiracy theories are indeed “critical freethinkers” who do not take official readings of events for granted, but instead carefully and independently collect and examine evidence to form their own objective judgments. Their conclusions may sometimes be wrong (just like scientists sometimes make honest mistakes when interpreting research data), but the epistemic process through which believers construct or accept conspiracy theories is deliberative, analytic, and utilizes the approach of a “lay scientist.” I refer to this idea as the rational conspiracist hypothesis.

    The second and alternative hypothesis, however, is that belief in conspiracy theories is grounded in a mindset characterized by gullibility. According to this view, people construct or accept conspiracy theories through System 1 processes including heuristics, emotions, and intuitive thinking (see also Myers, Chapter 8 this volume; Unkelbach & Koch, Chapter 3 this volume). A deep-rooted distrust in power holders or other groups leads believers to reflexively reject official accounts of impactful events, and to uncritically accept implausible conspiracy theories. Through motivated reasoning and the confirmation bias, believers subsequently justify their suspicious sentiments by selectively embracing evidence that supports their theory and rejecting evidence inconsistent with it, providing the illusion of a well-elaborated and irrefutable argument. I refer to this idea as the gullible conspiracist hypothesis. In the following, I review the psychological literature on conspiracy theories to test these two competing hypotheses. I will specifically examine the empirical relationships of belief in conspiracy theories with (1) a range of implausible beliefs that do not involve conspiracies, (2) cognitive biases, (3) stereotyping, and (4) cognitive style.

    Belief in Conspiracy Theories

    Although conspiracy theories vary widely in content, the tendency to believe them is grounded in similar underlying psychological processes. This insight is consistent with the finding that the single best predictor of belief in one conspiracy theory is belief in a different conspiracy theory (Abalakina-Paap, Stephan, Craig, & Gregory, 1999; Goertzel, 1994; Swami et al., 2011; Wood, Douglas, & Sutton, 2012). These findings are often interpreted as evidence that people differ in the extent to which they have a conspiratorial mindset that predisposes them to attribute impactful societal events to the deliberate actions of hostile conspiracies (e.g., van Prooijen & van Dijk, 2014). Relatedly, people differ structurally in their “conspiracy mentality,” that is, an individual difference variable designed to assess people’s tendency to perceive a world full of conspiracies (Imhoff & Bruder, 2014). Furthermore, belief in conspiracy theories is highly susceptible to contextual factors. For instance, conspiracy theories gain momentum particularly following impactful societal crisis events (van Prooijen & Douglas, 2017). These insights have contributed to the study of belief in conspiracy theories as a growing research field in the social sciences (for overviews, see Douglas et al., Chapter 4 this volume; van Prooijen, 2018; van Prooijen & van Vugt, 2018).

    To assess the two competing hypotheses put forward in this chapter, here I propose more specific predictions that can be tested through a review of the empirical research literature. If the rational conspiracist hypothesis is true, it stands to reason that people who believe conspiracy theories are rational, or at least not irrational, in many other perceptual or cognitive domains. In particular, based on the rational conspiracist hypothesis one would expect that belief in conspiracy theories is either unrelated or negatively related with (a) implausible beliefs that do not involve conspiracies, such as beliefs in the paranormal, superstition, and pseudoscience; (b) cognitive biases that are well known to produce irrational judgments and decision-making; and (c) stereotyping, which involves mental simplifications and overgeneralizations of social categories. As to cognitive style (d), conspiracy beliefs should be positively related with a tendency to recognize the complexity of difficult problems. Furthermore, analytic thinking, and not intuitive thinking, should stimulate belief in conspiracy theories.

    If the gullible conspiracist hypothesis is true, however, one would expect that to the extent people believe conspiracy theories more strongly, they are more likely to (a) also hold implausible beliefs that do not involve conspiracies, (b) display cognitive biases, and (c) engage in stereotyping. In their cognitive style (d), conspiracy beliefs should predict a tendency to perceive difficult problems in an oversimplified fashion; moreover, analytic thinking should predict skepticism of conspiracy theories instead of belief in them. I will now assess the empirical evidence for these two competing hypotheses.

    Conspiracy Theories and Implausible Beliefs

    How is belief in conspiracy theories related with a range of implausible beliefs that are common, that do not involve conspiracies, and that are not supported by any evidence? Various studies examined the relationships between conspiracy beliefs and supernatural beliefs, such as superstition and belief in paranormal phenomena. These studies typically find a reliable positive correlation: The more strongly people believe conspiracy theories, the more likely it is that they also hold a range of supernatural beliefs. For instance, Darwin, Neave, and Holmes (2011) found positive correlations of conspiracy beliefs with beliefs in psi, witchcraft, spiritualism, extraordinary life forms, and precognition. Other studies confirm these positive relationships. For instance, Lobato, Mendoza, Sims, and Chin (2014) found positive correlations between conspiracy beliefs and beliefs in the paranormal and pseudoscience. The positive relationships between conspiracy beliefs and various supernatural beliefs have been frequently replicated, and are now well established in this research domain (e.g., Barron, Morgan, Towell, Altemeyer, & Swami, 2014; Newheiser, Farias, & Tausch, 2011; Swami et al., 2011; van Prooijen, Douglas, & De Inocencio, 2018).

    An interesting illustration of how conspiracy beliefs are related with other implausible beliefs can be found in a seminal paper that introduced “pseudo-profound bull****” and “bull**** receptivity” as viable academic terms (Pennycook, Cheyne, Barr, Koehler, & Fugelsang, 2015). Pseudo-profound bull**** refers to statements that appear to have a deeper meaning but actually are empty. Bull**** receptivity refers to people’s tendency to perceive such statements as profound, that is, as containing some deeper truth. To measure this construct, Pennycook and colleagues (2015) designed a scale consisting of statements that are grammatically correct, yet contain randomly chosen buzzwords (example items include “Hidden meaning transforms unparalleled abstract beauty” and “Good health imparts reality to subtle creativity”). Results revealed that participants’ ratings of such statements as profound significantly predicted a range of variables indicative of gullibility, including reduced analytic thinking, reduced verbal intelligence, increased paranormal belief, and increased faith in intuition (see also Forgas, Chapter 10 this volume). Of importance for the present purposes, bull**** receptivity also predicted an increased tendency to believe conspiracy theories (Pennycook et al., 2015, study 4).

    The empirical relationships between conspiracy beliefs and such implausible beliefs are not necessarily harmless: Conspiracy theories can lead to irrational and harmful behavior. For instance, the link between conspiracy theories and belief in pseudoscience has real consequences for people’s health. One study reveals that belief in conspiracy theories predicts a preference for alternative medicine over regular, evidence-based medical approaches (Lamberty & Imhoff, 2018). Furthermore, in South Africa AIDS conspiracy theories are common, which for instance stipulate that AIDS was created by pharmaceutical companies in the lab to sell antiretroviral drugs, and that not the HIV virus but these drugs are dangerous to people’s health. A study conducted in Cape Town revealed that belief in such AIDS conspiracy theories is a major predictor of reduced condom use among both men and women (Grebe & Nattrass, 2012). In sum, belief in conspiracy theories reliably and consistently predicts a range of implausible beliefs and irrational behaviors, which supports the gullible conspiracist hypothesis and contradicts the rational conspiracist hypothesis.

    Conspiracy Theories and Cognitive Biases

    The second test of the competing hypotheses put forward here pertains to cognitive biases. It is reasonable to assume that people with a truly rational, critical mindset are less likely to fall prey to cognitive biases that degrade decision-making than people with an irrational, uncritical mindset (Myers, Chapter 8 this volume). One cognitive bias of interest is the conjunction fallacy. This is an error in probabilistic reasoning characterized by overestimating the likelihood that two events co-occur (Tversky & Kahneman, 1983). A well-known example of the conjunction fallacy is that after a stereotypical description of a woman being a feminist, many people rate the probability that she is a feminist and a bank teller as higher than the probability that she is a bank teller. In fact, the statistical probability of a combination of two constituents co-occurring (feminist and bank teller) can never be higher than the probability of one of the individual constituents occurring (bank teller).
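
    A brief worked illustration of the probability rule involved (using hypothetical numbers, not figures reported in the chapter): a conjunction can never be more probable than either of its parts, because

    \[
    P(A \cap B) \;=\; P(B \mid A)\,P(A) \;\le\; \min\bigl(P(A),\, P(B)\bigr).
    \]

    If, say, $P(\text{bank teller}) = 0.05$ and $P(\text{feminist} \mid \text{bank teller}) = 0.30$, then $P(\text{bank teller} \cap \text{feminist}) = 0.05 \times 0.30 = 0.015$, which is necessarily smaller than $0.05$, however stereotypically “feminist” the description may sound.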

    One study investigated the relationships between conspiracy beliefs, paranormal beliefs, and conjunction fallacies in a range of judgment domains (Brotherton & French, 2014). Specifically, some of the conjunction statements were neutral; some of the conjunction statements were in the context of paranormal phenomena (e.g., about a person dreaming her sister’s house is on fire, and the sister’s house actually being on fire); and some of the conjunction statements involved possible conspiracies (e.g., about CEOs of petrol companies discussing the implications of a new device that increases fuel efficiency in cars, and the inventor of the device being found dead in his home). Results revealed that belief in conspiracy theories predicted an increased proportion of conjunction fallacies across judgment domains (i.e., neutral, paranormal, and conspiratorial). In fact, although paranormal beliefs also predicted increased conjunction fallacies, the effects were stronger for conspiracy beliefs across all three types of conjunction contexts.

    A related yet distinct cognitive bias that has been examined in the context of conspiracy beliefs is illusory pattern perception. Specifically, the human mind automatically and functionally looks for patterns, that is, meaningful and causal relationships between stimuli. Detecting the actual causal relationships between stimuli is important for any organism to adapt to its environment, for instance to distinguish friends from foes, edible foods from poisons, safe from dangerous situations, and so on. These functional qualities of pattern perception notwithstanding, one consequence of this cognitive mechanism is that people often detect non-existing patterns by perceiving causal and meaningful relationships between stimuli that are in fact unrelated. Such illusory pattern perception for instance predicts habitual gambling (Wilke, Scheibehenne, Gaissmaier, McCanney, & Barrett, 2014).

    Of importance to the present purposes, illusory pattern perception positively predicts belief in conspiracy theories. In a series of studies, participants’ tendency to perceive patterns in randomly generated strings of coin toss outcomes was associated with increased belief in conspiracy theories; similar findings were obtained for perceiving patterns in the chaotic modern art paintings by Jackson Pollock (van Prooijen et al., 2018). Moreover, a recent study examined participants’ perception of a range of existing, yet most likely spurious correlations that occur in everyday life (e.g., an increase in chocolate consumption is correlated with an increase in Nobel Prize winners in a country). The researchers found that the more strongly participants believed that these correlations in fact represented a direct causal relationship, the more strongly they believed conspiracy theories (van der Wal, Sutton, Lange, & Braga, 2018).
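
    As a minimal sketch of why randomness so easily invites pattern perception (illustrative Python, not drawn from the studies cited; the sequence length and streak length are arbitrary choices), the following estimates how often a purely random series of fair coin tosses contains a streak long enough to look meaningful:

        import random

        # Estimate how often a random sequence of fair coin tosses contains a
        # streak of identical outcomes long enough to "look like a pattern".

        def has_streak(flips, length=5):
            # True if the sequence contains a run of `length` identical outcomes.
            run = 1
            for prev, curr in zip(flips, flips[1:]):
                run = run + 1 if curr == prev else 1
                if run >= length:
                    return True
            return False

        def streak_frequency(n_sequences=10_000, n_flips=20, streak=5):
            # Share of random n_flips-long sequences containing a run of `streak`.
            hits = sum(
                has_streak([random.choice("HT") for _ in range(n_flips)], streak)
                for _ in range(n_sequences)
            )
            return hits / n_sequences

        if __name__ == "__main__":
            # Close to half of all 20-toss sequences contain a run of five or more,
            # even though every sequence is generated with no pattern at all.
            print(f"{streak_frequency():.0%} of random 20-toss sequences contain a 5+ streak")

    Streaks like these are exactly the kind of structure an observer can mistake for a meaningful or causal pattern.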

    Pattern perception is generally considered to be one of two key cognitive components of conspiracy beliefs (Shermer, 2011; van Prooijen & van Vugt, 2018). The second cognitive component is agency detection: The human mind automatically makes a judgment of the intentionality behind the actions of others. Were certain outcomes caused by an intentional agent? Like pattern perception, agency detection is, in principle, a functional cognitive mechanism for effectively navigating one’s social world. For instance, agency detection smooths social interaction through increased mutual understanding of others’ actions, and helps to make valid judgments of accountability when an actor has caused harm (van Prooijen, 2018). But people also make mistakes in agency detection by perceiving agency where none exists. One study assessed to what extent participants detected agency in the inanimate geometric figures from the classic Heider and Simmel (1944) footage, and found that such hyperactive agency detection predicted increased conspiracy beliefs (Douglas, Sutton, Callan, Dawtry, & Harvey, 2016). Likewise, the related construct of anthropomorphism – that is, the tendency to ascribe human qualities to nonhuman stimuli – is positively correlated with belief in conspiracy theories (Brotherton & French, 2015; Imhoff & Bruder, 2014). In sum, belief in conspiracy theories is reliably associated with a range of cognitive biases, specifically the conjunction fallacy, illusory pattern perception, and errors in agency detection.

    Conspiracy Theories and Stereotyping

    By definition, stereotyping is an oversimplification of groups of people, and it therefore seems reasonable to assume that a rationally skeptical mindset is associated with decreased stereotyping, and that gullibility is associated with increased stereotyping. How does stereotyping relate to conspiracy beliefs? One line of evidence comes from research on individual differences commonly known to reflect increased stereotyping, notably authoritarianism and social dominance orientation. Research found qualified support for the idea that these individual difference variables positively predict conspiracy beliefs. Specifically, various studies found a positive relationship of these individual difference variables with belief in specific conspiracy theories (e.g., the belief that President Kennedy was killed by a conspiracy), but no relationship with a generalized tendency to perceive a world full of conspiracies (e.g., Abalakina-Paap et al., 1999; Swami, 2012). At first blush, the evidence for authoritarianism and social dominance orientation as predictors of conspiracy beliefs seems inconsistent. How can this apparent discrepancy between specific versus generic conspiracy beliefs be reconciled?

    An important piece of this puzzle is offered in a study by Imhoff and Bruder (2014), who investigated conspiracy mentality (i.e., a generic tendency to perceive conspiracies in the world) in relation to authoritarianism, social dominance orientation, and stereotyping of a range of specific societal groups. These researchers replicated the finding that conspiracy mentality is unrelated to these two individual difference variables, but also offered an explanation for this: Authoritarianism and social dominance orientation mainly predict stereotyping of low-power or low-status societal groups, such as Muslims, asylum seekers, and gypsies. Conspiracy mentality, in contrast, mainly predicts stereotyping of high-power groups, including politicians, managers, big companies, and so on. A series of studies supported these ideas by testing how stereotypes of a range of high-power versus low-power groups are related with conspiracy mentality, authoritarianism, and social dominance orientation. Furthermore, conspiracy mentality positively predicted anti-American and anti-capitalist sentiments in a sample of German participants. It thus appears that, besides specific conspiracy beliefs, a more general conspiracy mentality also positively predicts stereotyping, but of high-power rather than low-power groups.

    One exceptional category in Imhoff and Bruder’s (2014) study was stereotyping of Jewish people (i.e., anti-Semitism), as this variable was positively correlated with all constructs of interest, that is, conspiracy mentality, authoritarianism, and social dominance orientation (see also Swami, 2012). Indeed, Jewish conspiracy theories are widespread in the world (e.g., allegations that there is a Jewish plot to attain world domination) and are common among extremist groups of varying ideological signatures (Bartlett & Miller, 2010). Such belief in Jewish conspiracy theories is a major predictor of anti-Semitism (Golec de Zavala & Cichocka, 2012; Kofta & Sedek, 2005). In fact, historians have noted that Jewish conspiracy theories played a major role in fueling anti-Semitic sentiments in Nazi Germany during the 1930s and 1940s (e.g., beliefs that a Jewish conspiracy caused the German defeat in the First World War; moreover, Hitler believed that both capitalism and communism were the result of Jewish conspiracies for world domination; for details, see Pipes, 1997).

    More generally, it has been noted that believing conspiracy theories requires perceivers to ascribe hostile and evil qualities to an out-group – the conspiracy – which is facilitated by negative stereotypes of the out-group in question (van Prooijen & van Vugt, 2018). It thus seems that conspiracy theories go hand in hand with stereotyping of the alleged group of conspirators. General conspiracy mentality predicts stereotyping of the powerful groups frequently implicated in conspiracy theories; likewise, conspiracy theories about minority groups predict stereotyping of the minority group in question. Both specific and more general conspiracy beliefs hence positively predict stereotyping, particularly of groups that are suspected to be part of the conspiracy.

    Conspiracy Theories and Cognitive Style

    Presumably the most direct test of the rational versus gullible conspiracist hypotheses pertains to how conspiracy believers versus disbelievers differ in their cognitive style. I will examine how conspiracy beliefs are related with mental simplicity, and more generally how conspiracy beliefs are related with System 1 thinking (i.e., intuitive and emotional) versus System 2 thinking (i.e., deliberative and analytic). As to mental simplicity, one series of studies found evidence that political extremism – at both ends of the spectrum – predicts belief in conspiracy theories (van Prooijen, Krouwel, & Pollet, 2015). Of interest for the present purposes, two Dutch nationally representative samples revealed that these findings were mediated by increased beliefs among extremists that there are simple solutions to the complex problems that society faces. Consistently, various studies found that higher education predicts a decreased likelihood of believing conspiracy theories (Douglas et al., 2016), and that this relationship is partly mediated by a tendency among the lower educated to perceive simple solutions for complex problems (van Prooijen, 2017). Furthermore, conspiracy beliefs are related with an illusion of explanatory depth for political issues, that is, people’s tendency to overestimate the depth of their understanding of complex political events (Vitriol & Marsh, 2018). Conspiracy theories hence are rooted in a belief that complex societal and political problems actually have simple causes and simple solutions.

    A study by Swami, Voracek, Stieger, Tran, and Furnham (2014) experimentally investigated the relationship between analytic thinking and conspiracy beliefs. These authors first measured base-rate levels of conspiracy thinking, and invited participants back into the lab at a later time. Then, in several studies the authors applied manipulations that varied whether or not participants were prompted to think analytically. Results revealed that analytic thinking reduced belief in conspiracy theories. Furthermore, intuitive thinking predicted increased belief in conspiracy theories. Correlational findings are consistent with these results. For instance, van Prooijen (2017) found that the previously mentioned relationship between lower education and increased belief in simple solutions (which in turn predicted increased conspiracy beliefs) was mediated by reduced analytic thinking. Furthermore, Ståhl and van Prooijen (2018) found that a capacity to think analytically is in and of itself insufficient to reduce conspiracy beliefs; one also needs to be motivated to be rational and to rely on evidence to come to informed judgments. These studies all consistently suggest that System 2 thinking stimulates skepticism of conspiracy theories instead of belief in them.

    If System 2 thinking reduces belief in conspiracy theories, does emotional System 1 thinking increase belief in them? Evidence indeed suggests that negative emotions in particular increase belief in conspiracy theories. Experimental manipulations inducing a lack of control (van Prooijen & Acker, 2015; Whitson & Galinsky, 2008) and subjective uncertainty (van Prooijen & Jostmann, 2013; Whitson, Galinsky, & Kay, 2015) have been found to increase conspiracy beliefs. Correlational evidence supports these experimental findings by revealing that conspiracy beliefs are related with feelings of powerlessness (Abalakina-Paap et al., 1999), trait anxiety (Grzesiak-Feldman, 2013), and feelings of relative deprivation (van Prooijen, Staman, & Krouwel, 2018). Such findings on the role of negative emotions are consistent with historical observations that conspiracy theories gain momentum among the public particularly in the wake of anxiety-provoking societal crisis events, such as terrorist attacks, wars, earthquakes, fires, and floods (Brotherton, 2015; Pipes, 1997; van Prooijen & Douglas, 2017). In sum, the evidence indicates that System 1 thinking – emotional, intuitive, and heuristic – promotes belief in conspiracy theories. System 2 thinking – analytic, deliberative, and rational – promotes skepticism of conspiracy theories.

    Discussion and Conclusion

    The evidence overwhelmingly supports the gullible conspiracist hypothesis and contradicts the rational conspiracist hypothesis. The more strongly people believe conspiracy theories, the more likely it is that they also endorse implausible non-conspiratorial beliefs including paranormal phenomena, superstition, pseudo-science, and pseudo-profound bull****. Furthermore, conspiracy beliefs predict an increased susceptibility to a range of common cognitive biases, including the conjunction fallacy, illusory pattern perception, and hyperactive agency detection. Belief in conspiracy theories also predicts increased stereotyping, particularly of stigmatized minority groups that often are accused of conspiracy formation (e.g., Jewish people) as well as of powerful groups that are common actors in conspiracy theories (politicians, managers, capitalists, and so on). Finally, conspiracy beliefs are rooted in System 1 thinking, not in System 2 thinking. In particular, belief in conspiracy theories is associated with lower education levels, a tendency to perceive complex societal issues as simple, an increased illusion of explanatory depth in one’s understanding of political issues, and reduced analytic thinking. Instead, intuitive thinking and negative emotions increase belief in conspiracy theories.

    The findings reviewed here are hence consistent with a model suggesting that the decision to reject official readings of impactful events, and to subsequently embrace conspiracy theories, is often made reflexively instead of reflectively. Once accepted, a conspiracy theory is highly resilient to change as believers engage in epistemic processes that are tainted by motivated reasoning and the confirmation bias: Believers selectively embrace evidence and expert testimonies that support their suspicions, and reject evidence and expert testimonies that disconfirm them (Brotherton, 2015). The net result is an extensive theory that appears well elaborated and supported by a wealth of evidence. But while such theories may seem articulate, the decision to accept far-fetched conspiracy theories as true is actually rooted in gullibility.

    Two important observations need to be clarified in light of this conclusion. First, one might reason that people who believe conspiracy theories are critical and skeptical, but specifically about official readings of events and legitimate powerholders. Second, one might note that conspiracy theories can be quite rational from time to time. Corruption does occur in politics, business, and science, and there are many examples of conspiracy theories that eventually turned out to be true (e.g., Watergate; see Wright & Arbuthnot, 1974). I do not dispute either of these observations, and would like to clarify here that true skepticism is different from gullibly accepting whatever policy-makers propose. A “healthy” critical mindset includes constructively scrutinizing the actions of power holders, and expressing concern whenever one suspects malpractice or bad policy. But true skepticism also implies critically assessing the evidence for accusations of conspiracy formation, and recognizing when such accusations are implausible (see also Fiedler, Chapter 7 this volume). Put differently, what true skepticism does not entail is uncritically accepting bizarre conspiracy theories such as that the Earth is flat, that human beings never landed on the Moon, or that on 9/11 the impact of two passenger airplanes – flying at high speed and full of kerosene – had absolutely nothing to do with the Twin Towers collapsing shortly thereafter.

    While conspiracy beliefs are rooted in gullibility, this does not mean that conspiracy beliefs necessarily originate from closed-mindedness. In fact, studies found positive correlations between belief in conspiracy theories and the personality variable openness to experience (Swami et al., 2011; Swami et al., 2013). An interesting distinction here is between reflexive versus reflective open-mindedness (Pennycook et al., 2015). Reflexive open-mindedness refers to an intuitive mindset that is open to any new experience or information. Reflective open-mindedness, in contrast, refers to a critical mindset that is open to, yet also critically analyzes, new opportunities or ideas. Integrating these insights with the evidence presented in this chapter, it is possible that people who believe conspiracy theories are much like skeptics and scientists in their curiosity about, and openness to, novel ideas; but unlike skeptics and scientists, they evaluate these novel ideas through an intuitive, reflexive mindset instead of through a reflective mindset.

    One limitation of the current analysis, and a challenge for future research, pertains to sampling. I started out this chapter with the notion that people who are active on conspiracist websites perceive themselves as “critical freethinkers” (Harambam & Aupers, 2017). But while people who actively propagate conspiracy theories in online focus groups can be included in qualitative analyses, it is unclear at best if, and in what numbers, they took part in the quantitative studies that formed the basis of the current analysis. Put differently, there may be structural differences between the presumably small group of citizens that actively comes up with, and publishes online, novel conspiracy theories, as opposed to the large group of citizens that passively reads, believes, and spreads them. Based on the present analysis it is impossible to exclude the possibility that coming up with novel conspiracy theories, and successfully disseminating them among a large audience, is a creative process that requires analytic skills. Future research might therefore focus on differences in rationality versus gullibility between people who actively and successfully create new conspiracy theories versus people who passively accept them.

    To conclude, in the present chapter I compared the two opposing ideas that (a) belief in conspiracy theories originates from rational skepticism versus (b) belief in conspiracy theories originates from gullibility. The studies reviewed here unequivocally support the second idea. The mental processes that characterize rational skepticism fuel disbelief in most conspiracy theories. While conspiracy theorists appear to have much faith in their beliefs, on average one may question the accuracy of their self-perception as “critical freethinkers” (see also Dunning, Chapter 12 this volume). To return to the observations that motivated the current contribution: While some conspiracist websites are keen on persuading citizens who disbelieve conspiracy theories to think more critically, the present chapter suggests that these “sheep” in the end may not be the gullible ones.

