One noteworthy feature of conspiracy thinking is that it is framed in a way that immediately puts opponents on the wrong foot. In this example, the claim that scientists are concealing the truth means that opponents must prove a negative -- that scientists are not concealing the truth -- which is much harder to do than proving affirmative facts. A related trap is the "false dilemma": the suggestion that there are only two options, in this case total credulity about official accounts or suspicious disbelief. It might be the case (and probably is) that scientists are mostly right but also mistaken or short-sighted about some things. Conspiracy theories rely on an overriding sense of suspicion about official accounts. The more official or conventionally acceptable the source of information, the more strongly the conspiracy theorist will insist that it is spreading deliberate lies. Conversely, the more fringe or suspect the source, the more strongly the conspiracy theorist will gravitate toward it because it is "telling you the things they don't want you to hear." A more mainstream view -- that science is our best guess at the time and progressively corrects its own faults -- is too nuanced for conspiracy thinking to accommodate.
Another common rhetorical move by conspiracy theorists is to question the motivations of people who present evidence against their views. C.S. Lewis gives a classic example of this strategy: One person says "it's clear that two plus two equals four," and his opponent responds "you only say that because you are a mathematician." The irony in Lewis's example is clear, but it's less obvious when Donald Trump says that his accusers are questioning his actions "because they are Democrats." Both are clear examples of ad hominem arguments -- attacks "on the man" presenting the argument rather than on the argument itself -- an approach that has been recognized as logically invalid since the days of ancient Greek rhetoricians. The Conspiracy Theory Handbook points out that ad hominem attacks are often based on an assumption of nefarious intent. Conspiracy theorists would never assume that the scientists are simply mistaken, or that they have helpful or protective motives for concealing the monster's existence. Instead, they assume that scientists are malicious and deliberate, disseminating misinformation in order to conceal an important truth that they want to keep from others. A related logical error is "whataboutism," which brings up unrelated information in a way that distracts from the truth and puts one's opponent on the defensive -- e.g., "yes, the study about monsters that I cited is flawed, but the study that you cited isn't perfect either!" A "what about ..." argument often rests on false parallels, such as my study being based on a single monster sighting by a single fisherman at dusk, while your study is based on high-tech sonar depth readings of Loch Ness reported with a 1% statistical margin of error.
Another common narrative in conspiracy theories involves the idea that the conspiracy theorist is a victim or is experiencing persecution. The conspiracy theorist takes on the role of an iconoclastic hero who is trying to free others from delusion or manipulation. This can be personally gratifying, because the conspiracy theorist gets to see him- or herself as smarter than others; when a commitment to conspiracies leads to negative social interactions or other consequences, the conspiracy theorist can even come to see him- or herself as a martyr for a cause. "Only I am telling you the truth about the Loch Ness monster," says the conspiracy theorist. "I have done the research, I'm not fooled by the mainstream media, and I know what's really going on in that lake."
A simple version of the persecution narrative is the idea that "something must be wrong": If people aren't happy, or if life doesn't make sense, then the conspiracy theorist concludes that someone is actively causing those conditions. For instance, the fact that the Loch Ness monster is rarely seen and never convincingly photographed proves, to the conspiracy theorist, that evidence is being actively suppressed. It makes little sense that a large monster would go unseen and undocumented in a small community, so the conspiracy theorist concludes that scientists must be running an active campaign to suppress the overwhelming evidence that must in fact exist. Indeed, the cognitive dissonance that is naturally produced when one encounters evidence against one's position might be misinterpreted by conspiracy believers as a sense that they are being deceived.
The "something is wrong" belief also contributes to one of the most pernicious aspects of conspiracy theories, the fact that they are invulnerable to contradiction by evidence. Like all of us, people who believe in conspiracies are vulnerable to confirmation bias, the tendency to seek out information that agrees with our own pre-existing views. This is exacerbated by in-group communication and by social media channels that primarily give people content that agrees with their pre-existing views. (In the case of actual cults, the practice of "love bombing" involves giving a new group member extreme positive attention, and then threatening to withdraw it if they go against the group). But conspiracy thinking goes beyond that by actively rejecting competing evidence. For example, the conspiracy theorist might note that Loch Ness contains too small a volume of water and too few fish to support a genetically viable population of sea monsters for thousands of years. Nevertheless, they might argue that this fact actually proves "something is going on," because why would there be so many stories of monsters in a setting where such a thing is biologically impossible? A conspiracy theorist might argue that "the fact that people make so many arguments to disprove monsters' existence means that there really is something to hide." (This argument has in fact been made in support of ivermectin as a treatment for COVID-19: Each disconfirming scientific trial is taken as more evidence that "there's really something there" being hidden, and one talk show host summarized the evidence with the statement "even if it's fake, that's usually the warning").
Even worse than the re-interpretation of negative evidence as positive evidence is the conspiracy theorist's re-interpretation of random occurrences as positive data. Some random noise is inherent in life -- sometimes a wave will look like a serpent's neck, a cloud will look like a dark form crouching by the lake, or a piece of driftwood will look like a monster's head. Rather than applying Occam's Razor and assuming that the simple, mundane explanation is the most likely, the conspiracy theorist will reach instead for an exotic explanation that supports their belief. Another method for resisting evidence is the "thought-terminating cliché," a statement that seems to the Narrative mind like a compelling retort but doesn't actually contribute any new information to the conversation. For example, a Loch-Ness-monster believer might respond to factual arguments against his or her view by telling an opponent to "do your homework," which implies that there are other relevant facts to consider without actually stating any.
Finally, a committed conspiracy theorist can even hold beliefs that are mutually contradictory. The conspiracy theorist, for example, might believe both that evidence of the Loch Ness monster was fabricated by townspeople looking for tourist attention and that contemporary scientists are concealing the evidence that proves a real monster exists. In their view, the historical monster hoax does not disprove the belief that a real monster is being concealed, because both involve "official" explanations and therefore could have been designed as distractions. The conspiracy theorist's commitment to a "hidden truth" is so absolute that even direct contrary evidence cannot dissuade them.
Daniel Kahneman has suggested that Narrative thinking is the way to avoid errors in thinking, and that most false beliefs come from Intuitive-level heuristics and biases. But these examples from conspiracy theories show that false beliefs can also come from purely Narrative-level sources, as a preferred story overrides all other possible explanations. In these cases, the Intuitive mind can actually serve as a check on some of these tendencies toward error, and several strategies for countering conspiracy beliefs draw on Intuitive-level factors rather than facts alone. Source-based debunking involves questioning the motives of the people who spread conspiracy theories, essentially encouraging the conspiracy theorist to believe that "these people who you think are your friends are actually out to get you too." Given that the conspiracy theorist is likely already inclined toward suspicion, such a tactic might succeed better than fact-based arguments. Empathy-based debunking involves calling attention to the experiences of people who are harmed by conspiracy theories: for example, the hardworking scientists who are being harassed over their studies of the biology of Loch Ness. Calling on most people's empathic sense of social connectedness is an important strategy for reducing conspiracy thinking. Cognitive empowerment involves giving people a greater sense of control, for example by teaching them about the procedures used by scientists and encouraging them to use some of those methods to generate their own direct knowledge. Finally, fact-checking relies on both empowerment and social relatedness by giving people access to a reputable source like Snopes that investigates rumors and provides evidence for their truth or falsity. In this case, the facts may matter less than the feeling of having true knowledge not available to others, which seems to be a major motivation for conspiracy beliefs in the first place.
False narratives may be resistant to change because they arise from emotional sources (i.e., the Intuitive mind). Yet they are maintained as Narrative-level beliefs, and they involve clear logical errors such as over-interpreting randomness or assuming bad motives by others. Fighting these narratives with other narratives is unlikely to be successful, and may even have harmful effects if repeating the misinformation makes it more widely known. Instead, Intuitive-level factors like social reinforcement and empowerment may be stronger strategies for fighting these Narrative-mind mistakes.