Confirmation Bias
Confirmation bias is a Cognitive Bias: the tendency to search for, interpret, favour, and recall information in a way that confirms or supports one's prior beliefs or values.
Often referred to as confirmatory bias, myside bias, or congeniality bias, it involves selecting information that supports existing views while ignoring contrary information or interpreting ambiguous evidence as supporting current attitudes.
The effect is strongest for desired outcomes, emotionally charged issues, and deeply entrenched beliefs.
This bias operates at a subconscious level, making it difficult for individuals to recognise or prevent. It results from automatic, unintentional strategies rather than deliberate deception.
Mechanisms and Associated Effects
Confirmation bias arises from both cognitive and motivational mechanisms.
Cognitive explanations focus on the limited human capacity to process information and the use of shortcuts known as heuristics, such as the availability heuristic or the positive test strategy.
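The positive test strategy can be illustrated with a simulation in the spirit of Wason's 2-4-6 task. The rule and hypothesis below are hypothetical examples, not taken from any specific study: a tester who only proposes triples their hypothesis predicts should pass never encounters falsifying evidence, so a wrong hypothesis survives indefinitely.

```python
# The hidden rule (unknown to the tester): any strictly ascending triple.
def hidden_rule(triple):
    a, b, c = triple
    return a < b < c

# The tester's (wrong) hypothesis: each number doubles the previous one.
def hypothesis(triple):
    a, b, c = triple
    return b == 2 * a and c == 2 * b

# Positive test strategy: only propose triples the hypothesis endorses,
# e.g. (n, 2n, 4n). Every one of them passes, so the hypothesis is
# "confirmed" every time and is never revised.
positive_tests = [(n, 2 * n, 4 * n) for n in range(1, 20)]
print(all(hidden_rule(t) for t in positive_tests))  # every test confirms

# A negative test -- a triple the hypothesis says should FAIL -- is what
# actually discriminates: (1, 2, 3) violates the hypothesis yet satisfies
# the hidden rule, falsifying the doubling hypothesis in one trial.
print(hypothesis((1, 2, 3)), hidden_rule((1, 2, 3)))
```

Only tests chosen to disconfirm the hypothesis can reveal that it is too narrow; tests chosen to confirm it never can.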
Motivational explanations involve the effect of desire on belief, such as wishful thinking or a drive for consistency to avoid Cognitive Dissonance.
The bias is used to explain several specific effects, including attitude polarization, where a disagreement becomes more extreme despite parties being exposed to the same evidence. It also contributes to belief perseverance, which occurs when beliefs persist even after the evidence for them is shown to be false.
Other associated outcomes include the irrational primacy effect, characterised by a greater reliance on information encountered early in a series, and illusory correlation, where an association between two events is falsely perceived.
Manifestations in Professional Practice
In clinical medicine, confirmation bias is a significant factor in decision-making that can lead to diagnostic errors and skewed treatment decisions.
A practitioner may prematurely focus on a particular disorder early in a diagnostic session and subsequently seek only confirming evidence. Within the legal field, cognitive biases affect investigations, judicial proceedings, and contract formation. A detective may identify a suspect early in an investigation and then largely seek supporting evidence while downplaying falsifying evidence.
In finance, the bias can lead investors to be overconfident and ignore evidence that their strategies will lose money. Scientific research is also susceptible, as researchers may rate studies reporting findings consistent with their prior beliefs more favourably than those reporting inconsistent findings.
Impact of Digital Media and Automated Systems
In social media, confirmation bias is amplified by filter bubbles and algorithmic curation, which show individuals mainly information they are likely to agree with while excluding opposing views.
This environment encourages the spread of misinformation, as users are more likely to believe and share unproven claims that align with their biased perspectives. Large language models employed in investment analysis have been shown to exhibit model-specific preferences that can harden into confirmation bias, causing the models to cling to initial judgements despite the presence of counter-evidence.
Furthermore, automated decision-making systems are vulnerable to feedback loops, where decisions have dynamic feedback effects on the system itself that can perpetuate and exacerbate existing biases over time. These loops can affect sampling processes, individual characteristics, or the data sets used to retrain machine learning models.
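A minimal sketch of such a feedback loop, under assumed conditions (two identical regions, inspections allocated in proportion to past discoveries): because discoveries scale with how hard the system looks rather than with any true difference, a tiny initial imbalance is fed back into the allocation and can perpetuate itself. The scenario and numbers are illustrative, not drawn from a real system.

```python
import random

random.seed(0)

TRUE_RATE = {"A": 0.1, "B": 0.1}   # both regions are truly identical
counts = {"A": 11, "B": 10}        # tiny initial imbalance in past discoveries
BUDGET = 100                       # inspections available per round

for step in range(50):
    total = counts["A"] + counts["B"]
    # Feedback: allocate inspections in proportion to past discoveries.
    alloc = {r: round(BUDGET * counts[r] / total) for r in counts}
    for r in counts:
        # Discoveries depend on how many inspections a region gets,
        # so the region inspected more "confirms" its higher count.
        found = sum(random.random() < TRUE_RATE[r] for _ in range(alloc[r]))
        counts[r] += found

share_A = counts["A"] / (counts["A"] + counts["B"])
print(f"share of discoveries attributed to A after 50 rounds: {share_A:.2f}")
```

Allocating by estimated *rate* (discoveries per inspection) instead of raw counts would break the loop, since the rate estimate does not reward extra attention by itself.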
Debiasing and Mitigation Techniques
Debiasing refers to the process of minimising or eliminating the effects of cognitive biases. Cognitive strategies for individual decision-making include considering the opposite, which requires asking for reasons why an initial judgement might be wrong to assist in focusing attention on contrary evidence.
Taking an outsider’s view involves imagining the decision from an external perspective to counter the distortion caused by strong internal beliefs. Jonathan Haidt's research suggests that left-wingers and liberals in general tend to have a "blind spot" regarding certain moral perspectives, which makes it difficult for them to understand or consider "outsider" views, particularly those of conservatives. See Moral Foundations Theory.
In organisational contexts, building a critical thinking culture can encourage the disclosure of dissenting views rather than prioritising social cohesion.
Structured techniques for groups include conducting a premortem, in which a team imagines a future failure and identifies its likely causes in order to reduce optimism bias. Another approach is appointing a red team to act as a devil's advocate, identifying fundamental flaws and stress-testing a team's plan from a competitor's perspective.
In technical systems, fine-tuning search engines to surface opposing viewpoints or incorporating fairness guarantees as constraints can help mitigate algorithmic bias.
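One way such a constraint might look, sketched with hypothetical data and a made-up `rerank` helper: results are ordered by relevance as usual, but a quota reserves slots in the top-k for documents tagged as opposing the user's prior stance. This is an illustrative design, not a description of any actual search engine.

```python
# Hypothetical results: (relevance score, stance relative to the user's prior).
docs = [
    (0.95, "agree"), (0.92, "agree"), (0.90, "agree"),
    (0.88, "oppose"), (0.85, "agree"), (0.80, "oppose"),
]

def rerank(docs, k=4, min_opposing=2):
    """Greedy re-ranking: fill the top-k by relevance, but reserve slots
    so that at least `min_opposing` opposing-view documents appear."""
    ranked = sorted(docs, key=lambda d: d[0], reverse=True)
    # Take the best-scoring opposing documents first to fill the quota.
    opposing = [d for d in ranked if d[1] == "oppose"][:min_opposing]
    rest = [d for d in ranked if d not in opposing]
    # Fill the remaining slots with the best of everything else.
    top = rest[: k - len(opposing)] + opposing
    return sorted(top, key=lambda d: d[0], reverse=True)

top = rerank(docs)
print(top)  # top-4 results now include at least two opposing-view documents
```

Without the quota, a purely relevance-ordered top-4 here would contain only one opposing document; the constraint trades a small amount of relevance for guaranteed exposure to contrary information.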