
If "Reason" is Unreasonable, How Do We Know Anything?


Daniel Kahneman and Amos Tversky's behavioral economics work led to a 2002 Nobel Prize based on the fundamental idea that people's decision-making is not entirely rational. Two Minds Theory builds on and expands this notion by suggesting (a) that our actual behaviors are produced by the intuitive mind rather than the narrative mind, and (b) that our narratives themselves can be flawed based on factors like moral judgments or strong desire for a particular outcome. In one sense this doesn't seem surprising, but it sometimes leads to a fundamental question about human nature: If we can't trust reason to give us the right answers, how can we actually know anything?

The classical distinction between reason and emotion suggested that people should be guided by thoughts and not by feelings; indeed, much of our legal system is based on this distinction. Economics dating back at least as far as the utilitarianism of Jeremy Bentham and John Stuart Mill was based on the idea that people were rational actors who would make decisions based on their own self-interest; Kahneman's Nobel Prize was given for upending this assumption. In philosophy, premises based on reason have been given special privilege; this dates back to the skeptical approach of René Descartes and the argument that one cannot be sure of anything other than the existence of one's own thinking (cogito ergo sum). And going back even further, Aristotle defined human nature based on the "rational principle" that seems to be unique to our way of thinking. His teacher Plato went a step further, arguing that reason provides us access to the "world of forms" that is even more true, universal, and enduring than the world we can directly observe with our senses. Medieval scholastic thought used the idea of humans as "rational animals" to argue that humans have special status in the world of living things, marrying this with an idea from the Book of Genesis (1:26) that humans are divinely appointed to have dominion over the earth and all its creatures. Reason is still taken by many people as prima facie evidence that humans are special and distinct from other animals, and even among people who might reject that sense of specialness there is still a sense that one can arrive at right living through the diligent exercise of logic and reason.

If decision-making that appears to be "rational" is affected by factors that should not theoretically affect one's decisions, what does this mean for our ability to know anything at all? C. S. Lewis makes the following argument for the primacy of reason in our attempts to know the world: When someone questions your beliefs, he says, "you must ask them whether any reasoning is valid or not. If they say no, then their own doctrines, being reached by reasoning, fall to the ground. If they say yes, then they will have to examine your arguments and refute them on the merits" (Pilgrim's Regress, p. 62). 

Lewis, writing in 1933, was arguing mainly with Freud's conception of the unconscious, which was among the first modern theories to point out the failings of human reason. Lewis's fellow Oxford professors in the ordinary language school of philosophy went so far as to argue that common sense should be the arbiter of all disputes, based on the idea that different people using reason independently would always come to basically the same conclusions. In the legal profession, this comes to us as the "reasonable man standard" that asks what a typical, cautious, prudent person might think in a particular situation. If we don't believe that reason is capable of producing truth, these traditional ways of resolving arguments no longer seem open to us. Even the logical positivist assumptions underlying most empirical science rely on the idea that an average well-informed scientist will come to the same interpretation if presented with the same empirical data; the arbiter of truth in this view is supposed to be one's direct observation of the world, but this ignores the possibility of people drawing different conclusions about what those observations mean.

In our own time we are seeing the ultimate failure of logical positivism as groups of people insist not only on their own opinions but also on their own carefully curated set of facts. This would seem to be the inevitable result of past findings that undercut human reason as the fundamental basis for scientific thinking and knowledge about the world. Without it we seem to be left in a zone of relativism, where anyone can label anyone else's views as "fake news" or else (in one of the most popular rhetorical moves of our current historical moment) say that they only think what they do because of who they are -- e.g., a liberal, a Republican, a minority group member, or a military veteran. Reason provides a defense against ad hominem attacks of this kind: Arguing that "you say that Black Lives Matter only because you are a Democrat" is analogous to arguing that "you say that two plus two equals four only because you are a mathematician." (The mathematics example is from Lewis again.) I hope we can all still agree that the second one of these statements is ludicrous, but the first may no longer seem that way to many people even though they are fundamentally the same type of argument. Traditionally, such arguments could be handled by an appeal to logic and reason. Once we have dethroned reason, once we do not think that our own minds can be trusted to produce accurate and consistent results, how can we counter such arguments?

Discovering that reason is fallible doesn't necessarily leave us adrift in a world of relativism. First, Kahneman provides evidence that Narrative-system thinking is still more often accurate than Intuitive-system thinking when we take the time to think things through carefully. And Narrative-system thinking can be further augmented by adopting formal rules like "ad hominem arguments aren't admissible" that rule out the most common strategies for undercutting Narrative logic. Of course, we all have to agree to play by the rules for this to work, which is an argument in favor of following certain structures and protocols in our public discourse. I have argued elsewhere that the great success of science comes from the fact that all members of the scientific community have agreed to play by the same rules in determining what evidence is admissible or not.

Second, we can become more aware of the places where Narrative-system logic tends to break down, as in the examples from Haidt where people conclude that something is "just wrong" on moral grounds even though they can't articulate plausible logical grounds for saying so. We can commit to allowing other people to review and challenge our arguments, because they might be more able to see the flaws in our thinking than we are. (Peer review practices are another one of the great strengths of contemporary scientific approaches). We can commit not to draw conclusions until we are ourselves convinced -- Lewis argues that premature conclusions are a major source of errors in thought, and that we need to be more comfortable with uncertainty: "The people who live [in the city of Lewis's allegorical tale] have to give an opinion once a week or once a day, or else Mr. Mammon would soon cut off all their food. But out here in the country you can walk all day and all the next day with an unanswered question in your head; you need never speak until you have made up your mind" (Pilgrim's Regress, p. 61). Not drawing conclusions until we have some evidence (and perhaps taking a walk in nature while we ruminate) is an excellent practice to avoid errors in thinking. And we can adopt disciplines like routinely questioning our assumptions and seeking out contradictory information, which are not intuitively appealing but which we know are likely to help us catch some of the failures of Narrative thought.

Finally, we can take advantage of Intuitive-level processes that help us arrive at the truth. Sometimes we might come to a logical-seeming conclusion that nevertheless "feels wrong"; rather than trying to force the issue, we might take this as a prompt to continue our search for more information or for flaws in our reasoning. Formal practices like the courtroom or the academic lecture hall can seem stifling and exclusionary, but they also have a purpose in making us take things seriously; former U.S. Attorney Preet Bharara suggests that the pursuit of legal justice is not purely an exercise in reason, and that some of these rituals and trappings of authority are beneficial in avoiding false conclusions.

Acknowledging that reason isn't infallible is not the same thing as abandoning all hope of finding out the truth. It does mean that we need to be more humble than past generations that thought reason conferred on them special status among all the animals, or that rational people had better answers than those who paid attention to things like intuition and emotion. (In fact, pure reason that ignores all emotion has its own drawbacks as a strategy for making important decisions in life). A combination of strategies drawn from both the Narrative and Intuitive systems is most likely to generate helpful solutions to complex problems, and to provide us with a view of the world closest to objective truth.
