The Vaccine Blog

"I want it that way" - Confirmation Bias

Confirmation Bias



What is confirmation bias?

Confirmation ("myside") bias is the tendency to notice, favour, and remember information that confirms or strengthens our existing beliefs or values. At the same time, we disregard or place less weight on evidence contradicting them, and we certainly won't seek it out. It is extremely difficult, bordering on impossible, to remove, particularly when a person has held the bias for a long time. It is also evident when opposing parties are both highly confident in their positions. It can therefore shape the dynamic of discussions around especially sensitive issues, including vaccines and vaccine hesitancy. It is in this context that confirmation bias applies to the vaccine debate, and why it is vital to understand.


To see how it works, consider a practical example. Someone believes that all brown-eyed people are good at maths. Whenever they encounter somebody with brown eyes who happens to be good at maths, they interpret it as "evidence" for their claim, and they are more likely to remember these people. They might also put themselves in situations where people are likely to be good at maths anyway, such as maths competitions, and keep returning to those situations, creating still more "evidence". Meanwhile, they disregard people with brown eyes who struggle with maths. The specifics are not important here; it's just an analogy. The psychologist Jeremy E. Sherman summarised it in an excellent phrase: "We want to keep out ideas that demote us and let in ideas that promote us".



To understand the significance of this, we need to put it into historical context. As with many mental models, it was first described by the Ancient Greeks. 


In The History of the Peloponnesian War, the Athenian historian and general Thucydides notes that it "is a habit of humanity to entrust to careless hope what they long for, and to use sovereign reason to thrust aside what they do not fancy."


The influential English psychologist Peter Wason studied confirmation bias in 1960 using a method now known as "Wason's Rule Discovery Test". Participants were shown the triple 2, 4, 6 and told it satisfied a rule the experimenter had in mind. They then proposed triples of their own, receiving feedback on whether each one fit, before announcing what they thought the rule was. Many participants hypothesised that the numbers increased by two, and proposed triples such as 8, 10, 12. However, this was incorrect: the rule was simply that the numbers increased. People were looking to confirm a pre-existing hypothesis rather than trying to falsify it: confirmation bias. However, there is now speculation about how valid these original results were.
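The logic of the 2-4-6 task can be sketched in a few lines of code. This is an illustration, not a reconstruction of the study: the particular triples and the "increasing by two" hypothesis stand in for whatever a participant might propose.

```python
# Sketch of why positive testing fails in Wason's 2-4-6 task.
# The experimenter's true rule and the participant's hypothesis differ,
# but every triple chosen to FIT the hypothesis also fits the true rule,
# so confirming tests can never expose the mistake.

def true_rule(t):
    """The experimenter's actual rule: any strictly increasing triple."""
    return t[0] < t[1] < t[2]

def hypothesis(t):
    """A participant's guess: numbers increasing by two."""
    return t[1] - t[0] == 2 and t[2] - t[1] == 2

positive_tests = [(8, 10, 12), (20, 22, 24), (1, 3, 5)]  # all fit the guess
negative_test = (1, 2, 4)  # deliberately violates the guess

# Every positive test is confirmed, so the wrong hypothesis survives:
print(all(true_rule(t) for t in positive_tests))            # True
# Only a triple that breaks the hypothesis is diagnostic:
print(true_rule(negative_test), hypothesis(negative_test))  # True False
```

The point of the sketch is that only the negative test separates the two rules; a strategy of proposing confirming triples, however many, leaves the participant certain and wrong.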


Other studies

In a second study, in 1968, Wason used a four-card problem to demonstrate confirmation bias, a variation on the experiment from eight years earlier. Four cards are shown, each with a colour on one side and a number on the other. Participants are asked which cards must be turned over to test the claim that every even-numbered card is red on the reverse. The correct answer is the even-numbered card and the card that is not red, since only those two can falsify the claim. Fewer than 10% of participants found the solution, a result that was replicated in 1993.
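The selection logic can be sketched in code. The card faces below are illustrative (not the ones used in the original study): a flip is only worth making if some hidden face could falsify the rule.

```python
# Sketch of the four-card (Wason selection) task.
# Each card has a number on one side and a colour on the other.
# Rule under test: "if a card shows an even number, its other side is red".

def hidden_candidates(visible):
    """Possible back faces: a colour behind a number, a number behind a colour."""
    return ["red", "brown"] if isinstance(visible, int) else [1, 2]

def violates_rule(number, colour):
    """The rule fails only for an even number paired with a non-red colour."""
    return number % 2 == 0 and colour != "red"

def is_informative(visible):
    """A flip is worth making only if some hidden face could falsify the rule."""
    for hidden in hidden_candidates(visible):
        if isinstance(visible, int):
            number, colour = visible, hidden
        else:
            number, colour = hidden, visible
        if violates_rule(number, colour):
            return True
    return False

cards = [3, 8, "red", "brown"]
print([c for c in cards if is_informative(c)])  # [8, 'brown']
```

Turning over the red card feels like confirmation, but nothing on its back can break the rule; the even card and the non-red card are the only diagnostic choices.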


This idea was explored further by Griggs and Cox in 1982, this time framed in a familiar setting: a bar whose staff strictly follow the law on underage drinking. Participants were asked which two cards to turn over to test the claim "if you are drinking alcohol, then you are over 19". In contrast to Wason's original experiments, 73% of participants chose correctly, meaning the original confirmation bias was no longer evident.


The hypothesis of humans as confirmers was further disputed in 1987 by Klayman and Ha, who argued that the behaviour is better described as a "positive test strategy", one that is often reasonable under uncertain and changing circumstances. In an article, Dr. Gary Klein, Senior Scientist at MacroCognition LLC, posed an interesting question: how would we know what information to search for if we did not have beliefs to frame our search and put it into context? Confirmation bias is therefore not always dysfunctional. Like everything in science, our understanding of it evolves and changes over time.



Why does it happen?

But why do people do this? The brain's unconscious processing power has been estimated at the equivalent of 11 million bits of information per second; our conscious mind handles only 40-50 bits. So we need mental shortcuts (heuristics) to cut down the amount of information we process. We don't have the time or energy to consider every possible variable in a scenario, so discarding seemingly irrelevant data lets us make the quickest, best-informed decision we can. That gives us a sense of security and stability in our choices, solidifies what we believe and value, and promotes a sense of wellbeing.


Feeling that we are more accurate than others also supports mental health: everyone wants to feel good about themselves, and feeling inaccurate or incorrect has exactly the opposite effect. This is particularly true for deeply held and valued beliefs. We may have made irrevocable decisions because of them: perhaps we raised our children with these beliefs, or formed genuinely fulfilling friendships, relationships, and professional connections through them. A really important part of our social lives may rest on these beliefs. Letting go of a false belief then means changing our self-identity, which is extremely painful, even devastating, depending on how deep those relationships ran. This is a key reason why confirmation bias happens.




What is the evolutionary basis of this?

Confirmation bias is generally seen as epistemically problematic. It can lead us to disregard accurate information and engage only with information confirming our opinion, leaving little to no motivation even to consider that our beliefs might be wrong, let alone examine or correct mistaken ones. However, as I'll discuss, confirmation bias has persisted over time and remains extremely prevalent today. Many scientific and philosophical studies have therefore suggested that it is evolutionarily adaptive, and that it is an unavoidable part of the psychology of humans and nonhuman animals alike.


It has been suggested that it evolved as a way to integrate ourselves into a tribe, which offered a stable source of food, mates, and other resources; ancestral humans were unlikely to be accepted into a tribe if they were too unlike the other members of the group. It also enables quick decision-making in dangerous or otherwise stressful situations. Those who decided quickly were more likely to escape predators and survive to reproduce, so there was huge selective pressure for mental shortcuts.


We are therefore left with a large proportion of the modern human population showing confirmation bias, and multiple studies confirm this. It has been found in areas as diverse as psychiatry, medical decision-making (for example, estimating blood loss and amniotic fluid volume), epidemiological research, and child sexual abuse interviews. So we are all vulnerable to confirmation bias. Vaccines affect everyone, so confirmation bias affects everyone's perceptions of vaccines and other medical procedures. That makes it hugely influential in the antivaccine movement, which is why it is worth discussing, and why it is important that it is understood.



Types of confirmation bias


There are several steps to answering a question: searching for information about it, interpreting that information, and remembering the results at a later point. Cognitive biases can affect one or all of these stages, amplifying the overall effect. So while we think we have done a thorough investigation, all we may really have done is strengthen the bias we started with.


Language use can reflect bias when we ask questions. Consider a question with two variables: searching "Why is Man U better than Liverpool" will return data on Man U wins and their strengths. Conversely, switching the order of the variables and searching "Why is Liverpool better than Man U" returns data on Liverpool wins. Using affirmative language can therefore skew the data you get towards your hypothesis, even if it isn't true. Further, studies show that people formulate questions that are more likely to receive a "yes" answer, a tendency known as the "congruence heuristic".


Interpretation bias involves inappropriately analysing or misinterpreting data to support a pre-existing belief. For instance, a vivid report of a significant adverse event following vaccination (such as anaphylaxis or the death of a child) will be used as evidence to support the narrative of vaccine-hesitant parents. News reports on side effects and deaths also influence public perception and thereby harm vaccination campaigns, much as historical antivaccine campaigns did, such as the leaflets criticising smallpox vaccination in the early 19th century or the disruption to the polio eradication campaign in Nigeria in 2003. Such coverage makes the idea of vaccine side effects less abstract, because parents regularly watch and read the media and interact with the kinds of communities it portrays. If the harm of infectious disease is not communicated in the same way, parents will come to see vaccination, a benign intervention, as hostile. This is specifically called negative interpretation bias. As pandemics bring isolation and uncertainty, it is unsurprising that this bias arises when new pandemic vaccines are developed and released.


The final type is memory bias, a heuristic that makes memories either easier or more difficult to recall, and which can itself be subdivided into several types. The specific memories we recall tend to match our current emotional state: stressful moods bring up stressful memories, calm moods calm ones. A parent may be emotionally primed by watching a news report or documentary about adverse events following vaccination, such as Vaccine Roulette (1982, discussed in the previous post), particularly if their own child or family members experienced something similar. That programme showed images of children in fear of vaccination and in pain after it, as well as children left disabled following vaccination. When significant time and energy is invested in raising a child, it can be very difficult when their life does not go according to how the parent envisioned it.


Therefore, everything linked to vaccination, including the word itself, and images of needles, vials, and doctors, is likely to trigger the emotional trauma the parent has experienced. This emotional attachment to the memory of vaccination makes it easier to recall. This is called "availability bias", or the availability heuristic. The ease with which traumatic memories of vaccination can be recalled means that vaccine-hesitant patients overestimate both how likely an adverse event is and how commonly such events occur, regardless of the actual data on adverse event rates.


How can we mitigate confirmation bias?

There are therefore many ways in which confirmation bias can manifest itself, so we likely need a multifaceted solution to the problem. People's circumstances vary to a huge degree, so it is vital to look at each vaccine-hesitant patient individually. The key is to tackle the underlying mistrust. Misinformation and antivaccination campaigns are symptoms of mistrust towards the healthcare authorities who recommend and administer vaccines, the policymakers, the manufacturers, and governments. For this reason, information alone rarely changes the mind of a vaccine-hesitant patient. We need to build trust, which means breaking the issue down into ways of building it.


First, a clear communication style is critical. Excess medical jargon should be avoided; short sentences and clear, precise language are key. Active listening is also vital in patient interactions, for instance by mirroring the patient's communication style and body language. Another key element is transparency. Some believe that admitting mistakes makes people trust them less. That is false: making mistakes is not in itself the biggest killer of trust. Lying about them is. Covering them up is. That can end a career, as we saw in the case of Andrew Wakefield. So if errors are made, it is critical to apologise to the patient first and foremost, to explain how the error will be corrected, and to describe what systems will be put in place to prevent this and similar errors in future. This restores the patient's trust as far as possible. The patient needs to be informed and engaged throughout treatment. And above all, the patient is the ultimate decision-maker about their own health. They have the right to request or refuse treatment, and it is important for healthcare professionals to accept those decisions, whether or not they align with their own personal values.


The common denominator in all of these is compassion. It's not enough to have empathy; we need to act on it. Basic respect in healthcare should be a right, not a privilege. I genuinely appreciate you taking the time to read this article.






References

  1. The Curious Case of Confirmation Bias | Psychology Today, part of a collection of essays called "Snapshots of the Mind" (2022)
  2. Confidence drives a neural confirmation bias | PMC (under a Creative Commons licence)
  3. Confirmation Bias and the Wason Rule Discovery Test
  4. What Is the Function of Confirmation Bias? | SpringerLink
  5. Confirmation Bias: Going Beyond Our Personal Narrative | TechTello
  6. Availability Heuristic and Decision Making | Simply Psychology