The short answer is yes – they are, and they have a consequence.

You see, interpretation and assessment errors in daily clinical care occur for many reasons, some of which are rooted in cognitive biases. These arise from limited perspectives, faulty mental shortcuts, or unconscious assumptions, and practitioners are usually unaware they exist.

Clinical practices such as Nutritional Therapy and other lifestyle-orientated approaches are subject to interpretation and application errors because of elevated case complexity (despite the extended consultation time available) and the need to interpret multiple points of information rapidly.

Although the risks associated with these errors are generally low, all practitioners should constantly aim to refine their interpretive and diagnostic skills, minimising errors and the associated loss of optimal outcomes.

Cognitive errors

Let’s cover this off in more detail, as cognitive bias is an important concept to recognise in understanding error and the influence this can have on our decision-making.

Cognitive biases stem from ‘heuristics’ – cognitive (mental and emotional) shortcuts used to aid our decision-making. A heuristic can be thought of as a cognitive ‘rule of thumb’ or guideline that you subconsciously apply to a complex situation to make decision-making easier and, hopefully, more efficient. In effect, it is a process that confers benefit, refined by experience towards enhanced outcomes. But heuristics are also highly prone to creating patterns that facilitate error.

Most, if not all, clinical decision-makers are at risk of error due to bias; it appears to be a ubiquitous phenomenon and does not correlate with intelligence or any other measure of cognitive ability. Rather ironically, a lack of insight into one’s own bias is common.

The causes of bias are varied, and include learned or innate biases, social and cultural biases, a lack of appreciation for statistics and mathematical rationality, and even simply environmental stimuli competing for our attention.

Confirmation bias is one type of cognitive error and one that we all need to be aware of. See if you recognise anything of your own workflow in this thought process:

“Once supporting information is found for a ‘diagnosis’, the search for information to rule out the diagnosis stops.”

In other words, practitioners tend to interpret the information gained during a consultation to fit their preconceived diagnosis, rather than the converse.

Example:

Suspecting the patient has a gastrointestinal dysbiosis and treating the change in bacterial species in the stool test as proof, rather than asking, ‘I wonder why the bacterial composition is altered, and what other findings or related changes are there?’

Of course, correcting for the biases that pervade our decision-making requires us first to recognise that they exist, and this poster suggests that there are at least 24 of them!

Improving our understanding and awareness of our own dominant biases is a sensible first step in enhancing our understanding of clinical decision-making. Identifying, exploring and rationalising the patterns in our thinking – especially those relating to heuristics – that may diminish our analytical and critical thinking skills is an ongoing challenge.

But by taking this developmental challenge on, we can improve our skills, establish greater confidence in decision-making and translate this into better patient/client care. Outcomes can then inform future decision-making, support research and equip practitioners with greater analytical skills for the cognitive rigours of clinical life.

To take a pragmatic approach, it helps to remember a few important points about applying this in clinical life and to follow some suggested, tested rules for good decision-making, such as the eight below.

  1. Download the free poster set or donate. Review the summary comments, consider further exploration.
  2. Slow down – take time to reflect on your decision-making flow.
  3. Be aware of base rates for your determinations (i.e. check that the underlying incidence rates of conditions, or population-based knowledge, are not being ignored as if they do not apply to the patient in question).
  4. Consider what data is truly relevant – use a time-line to pull out key events.
  5. Actively seek alternative diagnoses/explanations – test your first thoughts.
  6. Ask questions to disprove your hypothesis.
  7. Remember you are often wrong.
  8. Consider the immediate implications of this.
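Rule 3 is where a little arithmetic earns its keep. As a minimal sketch – with illustrative numbers invented for the example, not drawn from any clinical study – Bayes’ theorem shows how a seemingly strong finding can still leave a diagnosis unlikely when the underlying condition is rare:

```python
# Illustrative only: these figures are invented for the example, not clinical data.
# Bayes' theorem: P(condition | positive finding)
#   = sensitivity * base_rate / P(positive finding overall)

base_rate = 0.02        # condition present in 2% of the relevant population
sensitivity = 0.90      # finding present in 90% of true cases
false_positive = 0.10   # finding also present in 10% of people without the condition

# Overall probability of seeing the finding, in cases and non-cases combined
p_positive = sensitivity * base_rate + false_positive * (1 - base_rate)

# Post-test probability that the condition is actually present
ppv = sensitivity * base_rate / p_positive

print(f"Post-test probability: {ppv:.0%}")  # prints "Post-test probability: 16%"
```

Despite the finding being “90% sensitive”, the post-test probability is only about 16%, because the rare condition is swamped by false positives from the much larger unaffected group – exactly the base-rate neglect that rule 3 warns against.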

CE Forum (ceforum.info), a peer-led discussion site that is free to professional members, provides extensive case analysis as well as direct support from experienced practitioners. It is an ideal place to safely test your current thinking against that of others. To conclude, a useful quote to muse over…

“The human brain is a complex organ with the wonderful power of enabling man to find reasons for continuing to believe whatever it is that he wants to believe.” – Voltaire
