Rooting for the Null Hypothesis: Key Strategies for Avoiding Bias in Clinical Decision-Making

      The greatest of faults, I should say, is to be conscious of none. — Thomas Carlyle

      A man travels many miles to consult the wisest guru in the land. When he arrives, he asks the wise man, “Oh, wise guru, what is the secret of a happy life?” “Good judgment,” says the guru. “But oh, wise guru,” says the man, “how do I achieve good judgment?” “Bad judgment,” says the guru. — Tavris and Aronson (2007, 213)
      My father passed away in 2012 of end-stage chronic obstructive pulmonary disease (COPD). The last year of his life was fraught with exacerbations, but he found considerable relief with the use of chronic oral prednisone. It started with prednisone bursts but eventually led to a daily dose of 10 mg. He tried multiple times to reduce the dose or taper off altogether, but each time he found it harder to breathe; so he, and eventually I, concluded that the benefits more than outweighed the risks. What I did not realize at the time was that I was forming a bias for the use of oral prednisone in patients with COPD: the strong emotional connection to how it had helped my father’s quality of life influenced my perception.
      Bias can be defined simply as a belief that sways our judgment one way or another, which in turn can compromise our objectivity and our fair treatment of others. The effects of bias can be profound in the clinical setting, where medical providers’ decisions directly affect the lives of others.
      As clinicians, our first impressions — which become part of our belief system and can lead to bias — have a major impact on diagnosis and decision-making. These impressions may come from patients’ appearance, their medication list, their allergies, their medical history, or simply whether they remind us of other patients we know. First impressions are the snap judgments or stereotypes that help us simplify the world and reduce the amount of processing we must do when taking in new information. However, from the moment of our first impression, we may be subject to confirmation bias: we begin to seek information that confirms our initial judgments, and we dismiss or downplay information that opposes them (Med Decis Making 2017;37:9–16).
      When bias leads to a decision that causes harm, it can be very difficult to admit to our mistakes. Often, if we see ourselves as good, highly educated, experienced professionals who care about our patients, we will experience cognitive dissonance when we realize we have made a mistake. We will subconsciously work to reduce this dissonance through self-justification, reasoning, and defensiveness, which can lead to blaming others or stubbornly continuing an incorrect course of action (Med Educ 2019;53:1178–1186; Schmalenbach Bus Rev 2014;66:191–222).
      Take, for example, my bias for the use of chronic oral prednisone in all patients with advanced COPD. For a long time, I justified this prescribing habit, reasoning that my patients had such a poor quality of life that it was worth trying, even if it did little to improve their symptoms. I downplayed the risks, and I became defensive when well-meaning colleagues and medical directors challenged its use. When interpreting the effects, my bias seized on any sign of positive results, no matter how small, and downplayed or ignored the negative ones. I did not become aware of this blind spot until, as a researcher participating in a study in nursing homes, I reviewed the latest COPD treatment data and the regimens of dozens of patients who were not my own. The uncomfortable realization of my blind spot led to a change in my treatment of patients with COPD and planted the seed for discovering more areas of bias in my decision-making.
      Admitting mistakes is especially difficult in the practice of medicine for fear of lawsuits, losing our patients’ trust, showing vulnerability, or dealing with shame. Yet the decisions of health care providers can affect a large number of lives. The consequences are thus greater when health care providers cannot recognize their biases and admit fault. The good news is that in reality we tend to avoid lawsuits, gain our patients’ trust, and diminish shame by admitting our fallibility and showing remorse [C. Tavris and E. Aronson, Mistakes Were Made (but Not by Me): Why We Justify Foolish Beliefs, Bad Decisions, and Hurtful Acts, Houghton Mifflin Harcourt, 2008]. Such vulnerable actions allow us to better relate to our patients by showing that we are human.
      So how can we become aware of our biases, reduce the potential for making bad decisions, and acknowledge and even learn from our mistakes? Fortunately, there are several strategies health care providers can use.

      Key Strategies for Avoiding Bias

      One of the most effective strategies is inviting other people’s perspectives into the decision-making process. To make this strategy effective, remain open-minded and present yourself in a way that invites differing opinions. For this process, our colleagues working in post-acute and long-term care communities are our best resource. I personally collaborate with a small group of physicians whom I consult regularly; our intention is to allow disagreement and offer constructive criticism. If we are open to it, this process can often reveal blind spots.
      Another strategy is to separate the act of decision-making from our sense of identity. A major reason we become defensive and rationalize our decisions is that we often see those decisions as defining who we are as a person. Attaching our decisions to our identity can insidiously turn into arrogance; and if a decision turns out to be harmful, this same attachment can make us feel like a bad person. The path to better decision-making lies in separating the doing from the being: recognizing the distinction between what I did and who I am (Peter Attia, MD [blog], Nov. 14, 2021, https://bit.ly/3g1QREq). Being a good provider isn’t defined by never making mistakes, but by how you manage your mistakes and learn from them.
      Because cognitive dissonance is uncomfortable, a good gauge of whether bias is shaping your response to new information is whether that information feels good or uncomfortable. We can improve our decisions by seeking out this discomfort and learning to accept it (Harv Bus Rev 2015;93(5):64–71). I like to think of this as always rooting for the null hypothesis by asking questions such as “What else could this be?” or “If I am wrong, what harm will this do?” or “What evidence is there that I may be wrong?” We want to have passionate beliefs that give our lives color and meaning — but to hold them lightly enough that we can admit when we are wrong. When we find evidence that conflicts with our beliefs, it can at least plant a seed of doubt that moves us toward a new belief system.

      Making Use of Trip Wires

      The movement toward value-based care offers opportunities to design trip wires into the decision-making process that can halt an overly determined commitment to a harmful practice. For instance, someone other than the clinician who originally made a decision about a treatment or chosen practice can take responsibility for deciding whether to continue it. Such pre-established trip wires can account for adverse events and sunk-cost bias when evaluating a provider’s process in both diagnosis and treatment decisions (BMJ 2017;356:j437).
      Following pre-established standards of care or algorithms can also reduce dissonance. Although care should be individualized, there are many situations where using algorithms or meeting certain criteria will improve outcomes and reduce the risk of bias. At a minimum, it is helpful for providers to document their decision-making process, including their rationale for not following pre-established criteria or well-accepted algorithms (Harv Bus Rev 2015;93(5):64–71). The act of putting decisions into writing highlights the discomfort of cognitive dissonance and can be a catalyst for increased awareness of when bias is playing a major role in decision-making.
      Being aware of bias leads to refined judgment and a more discriminating eye for excellence — or, conversely, for flaws in the process. Typically, the more we understand, the more we realize how little we understand. When bias is kept in check, hubris is replaced by humility, and real confidence emerges from recognizing the limits of our own perceptions. Becoming more educated about bias and cognitive dissonance, finding people willing to disagree with you and inviting their opinions, using pre-established trip wires, following algorithms, and generally rooting for the null hypothesis by questioning our own decision-making are the cornerstones of critical thinking in medicine — all leading to better collaboration and outcomes.
      Mr. Neill is a physician assistant who has worked in PALTC for over 10 years. He teaches at the University of Colorado PA program, serves on the executive board of CMDA — The Colorado Society for PALTC Medicine, is an assistant medical director for multiple nursing facilities, and contributes regularly to the Colorado Geriatric Journal Club.