A colleague sent me this video from the UK's Royal Society, about overcoming bias in the review process for scientific grants: https://royalsociety.org/topics-policy/publications/2015/unconscious-bias/. I particularly like the depiction of bias as Alice falling down the rabbit hole, although I'm not sure the Mad Hatter is the best representative for rational thought!
The video's main argument is that bias is primarily a function of the Intuitive Mind, which uses fast-and-cheap heuristics to make snap judgments about the world. Unfortunately, Intuitive-level heuristics sometimes lead us astray, for example resulting in fewer successful applications from minority scientists, or from people who didn't attend top-tier schools. The video suggests that the Narrative Mind can provide an antidote to this kind of flawed thinking, which is also Daniel Kahneman's primary message in the book that popularized the idea of two mental systems, Thinking, Fast and Slow. Narrative-mind strategies are the most commonly proposed remedy for errors produced by Intuitive thinking: The argument is that if we could only slow down and think through our decisions more logically and less emotionally, we would produce better decisions overall.
Two of the solutions proposed by the Royal Society fit squarely with a Narrative-mind-enhancing approach: (1) slow down decision-making, and (2) reconsider the reasons for decisions. These time-tested methods are part of the reason that grant review panels exist at all. The process brings together reviewers with relevant expertise but no personal stake in the proposed research, and uses formal written criteria and a process of discussion and debate to draw out the strong and weak points of each application. The idea is that funding decisions will be made entirely on the merits of the proposed research, without any unwanted contamination from biases, assumptions, or stereotypes. At the extreme, Narrative-mind thinking implies that the decision-making process could be reduced entirely to algorithms: Just get the inputs and the formulas right, and you will be protected from any mistakes.
As the video notes, the Intuitive mind is indeed vulnerable to making judgments based on a potential grant recipient's background, culture, or personal experiences. But unfortunately, that also extends to judgments about the "quality," the "significance," or the "level of innovation" of a proposed research study, which are all things that grant funding agencies would like to prioritize. If it were sufficient to simply tell reviewers "base your decisions solely on the application's merits," then we wouldn't have a problem in the first place. It's therefore highly unlikely that the Royal Society's first two suggestions can get us out of our predicament, no matter how careful and deliberative we ask our Narrative minds to be. Their suggestion number (3) is also Narrative-mind-oriented, but it has a different target. This strategy is to actively question cultural stereotypes, and it might be a more effective use of the Narrative mind. Essentially, this turns the Narrative mind's logical and diagnostic tendencies loose on its own decision-making process. It provides an additional type of information -- potential for bias -- that can then be considered alongside the usual information streams about innovation, quality, and significance, and having this additional type of information might lead to reconsideration of reviewers' ratings on those dimensions. It takes some courage for reviewers to actively question their own biases and stereotypes, but this strategy is more likely to be helpful than just asking the reviewers to make more careful decisions using the same process they have always used.
The Royal Society's final suggestion includes both Narrative and Intuitive elements: (4) Monitor other panel members for unconscious bias. This is in some ways a simpler assignment for the Narrative mind than examining its own biases, because it's easier to notice flaws in someone else's thinking than in our own. This solution also taps into a powerful Intuitive mechanism for behavior change, which is to consider how someone else might be perceiving our actions or intentions. The video doesn't explicitly tell reviewers to "remember that other panel members will be looking for flaws in your thinking too," but that implication is there. The Intuitive mind is very tuned-in to this type of social judgment, which can create a countervailing pressure that does make the Narrative mind more attentive to its own mistakes or omissions.
The video makes one other point worth considering, around 1:45 (image of cards on a table). The finding that people want to be part of an in-group, and to avoid association with an out-group, is based in a different Intuitive-mind system than the one that jumps to conclusions in logic problems like those described starting at 0:35 in the video. It is the part of the Intuitive mind that engages in social comparisons, that monitors whether others are friends or foes, and that unconsciously ranks the relative power of people in a group. Even more than our internal biases and stereotypes, the relative power of panel members or social groupings can have an effect on our decision-making. As with the concern that other people might see us as biased, our sense that powerful people will dislike us can motivate us to reach certain types of decisions. Certainly, knowing that peers will listen to my critique of a research application is a strong part of my motivation when I spend hours going through a 200-page grant with a fine-toothed comb, rechecking my facts, and questioning my own conclusions. I'm not doing all of that for the token honorarium. Instead, I don't want to look like an idiot in front of my peers! This social aspect of panels suggests that it might be especially important for the most experienced or influential members of a review panel to explicitly state their commitment to questioning bias, to note their own assumptions, and to make an effort to correct their mistakes. Modeling is a very effective form of social influence that can free up other group members to do the same.
In a different video about decision-making, the Royal Society tackles some of the interpersonal dynamics of peer review, looking at the parts of the process that are really about power rather than attitudes and bias: https://royalsociety.org/topics-policy/publications/2018/making-better-decisions-in-groups. Suggestions in this video are even more strongly targeted at the workings of the Intuitive mind: For instance, the Royal Society suggests that we should defer to the judgment of people with content expertise regardless of their social standing in our scientific community, actively seek out perspectives from people who aren't like us, and use hierarchy-disrupting communication strategies like fixed speaking times for everyone or anonymous feedback methods. Some U.S. grant review panels have deliberately included patients or family members among their grant reviewers, in order to disrupt the power of the presumably better-informed research experts and gain a broader perspective on the importance of any proposed research. The ideas in this second video are mainly about structuring a situation to deliberately draw out competing views.
One drawback of power-based methods is that they are often less comfortable for higher-status people in the group who might not be accustomed to so much dissent. Extra efforts have to be made to ensure that lower-status people are safe from retribution if they do contradict the more powerful members of the group. And there may be a significant need to correct misconceptions: For example, much of the current furor over Critical Race Theory (CRT) is based on the false idea that it says certain groups (such as White people) have racist attitudes; instead, it says that certain groups benefit from positions of power that were achieved through historical racist practices and that are at least partially maintained through ongoing unequal treatment of people. When critics claim that CRT treats them as "inherently racist," they are confusing the attitudes-and-biases argument with the power-hierarchies argument. And in fact, powerful people have solid motivation to not see the ways in which power practices and hierarchies contribute to their own elevated social standing. They therefore often fall back on the original "objective" Narrative-mind methods that have been leading to poor results. Methods that attend to social context may be critiqued as being based on "emotion" rather than reason (although emotion is only one subsystem of the Intuitive mind). But there's no reason not to use deliberative, Narrative-level strategies to make a final decision once the potential for bias has been carefully considered. Methods that attend to power dynamics face a tougher barrier: The possibility of active resistance from powerful people who don't want to acknowledge any Intuitive-level influences on their decision-making. Openly discussing power dynamics in groups is a first step towards resolving that problem.
Despite these challenges, research shows that groups with a broader range of perspectives and a flatter decision-making hierarchy tend to produce better quality decisions. The reason for doing all of this is not out of a desire to be "woke" or politically correct. Instead, it's because we face hard problems in science and we need to get the solutions right. As in other areas of life, such as the justice system, peer review of scientific articles, and clinical decision-making, the best way to get to a high-quality solution is to use the strengths of our Narrative and Intuitive minds in combination. Ultimately, our Narrative minds can produce better decisions when Intuitive-level factors like unexamined heuristics, social perception, and power are also taken into account.