Doctor, Heal Thyself - Five Pitfalls in Decisions about Diagnosis and Prescribing

The consequences of medical misdiagnosis are often catastrophic. Some psychologists hold that doctors are guided by heuristics-based cognitive processes that allow them to make constant on-the-spot decisions. Associate Professor of Marketing and doctor of social psychology Jill Klein outlines five of the most common cognitive biases that can, however unwittingly, thwart doctors' best-laid plans.

by Jill Klein
Last Updated: 23 Jul 2013

Inaccurate medical diagnoses can have catastrophic consequences. Some psychologists maintain that the constant, often immediate decisions doctors must make are aided by "heuristics": mental strategies, supported by professional knowledge, that provide shortcuts for decisions that must be made very quickly. But many psychologists have also concluded that such heuristics can, entirely subconsciously, lead to faulty judgements.

In this article - Five Pitfalls in Decisions About Diagnosis and Prescribing - published in the British Medical Journal in April 2005, Associate Professor of Marketing and doctor of social psychology Jill Klein outlines five of the most common forms of cognitive bias that can affect the decision-making processes of medical professionals, and offers suggestions for avoiding them.

The often tragic irony underlying such misjudgements is that doctors, like many other highly trained professionals, not only tend to think themselves immune from such pitfalls, but are also prone to cognitive biases that lead them to regard themselves as exceptional decision makers.

Klein details five of the most common - and potentially serious - heuristic pitfalls befalling doctors:

- The representativeness heuristic: the assumption that something resembling other members of a certain category must itself belong to that category.

- The availability heuristic: a cognitive overemphasis on information that comes to mind easily, usually because it is readily remembered or was recently encountered.

- Overconfidence: most educated people are poor at assessing the gaps in their knowledge, tending to overestimate both how much they know about a subject and how reliable that knowledge is.

- Confirmatory bias: the tendency to seek out, notice and remember whatever information best fits one's pre-established expectations.

- Illusory correlation: the inclination to accept events as causally related when the actual connection between them is coincidental or non-existent.

The author provides both everyday and medical examples of such heuristic-based misconceptions and long-term biases. For example, recent research discloses a tendency for many doctors to overestimate the likelihood of addiction when prescribing opioid analgesics. This has often led to undertreating patients, even those in severe pain.

In regard to confirmatory bias, doctors tend to ask questions that reinforce their earliest diagnostic judgements. Worse, they may stop asking important questions altogether, having already reached conclusions skewed by that earlier analysis. This can also greatly hamper ongoing treatment. And while acknowledging that in everyday practice such biases may not fit any of these five categories neatly, Klein also offers straightforward guidelines for better decision making.

British Medical Journal, April 2005
