The pop culture conception is that Emergency Medicine is a profession of endless action and doing - CPR, intubations, chest tubes, resuscitating patients from the brink of death. The reality of the job is considerably more banal. We are far more often engrossed in eliciting a history from patients with vague or contradictory symptoms, performing physical exams, and mentally working through an array of diagnostic possibilities from the benign to the life-threatening. This is where the true challenge of Emergency Medicine lies. Learning to place a chest tube, intubate a patient, or put in an arterial line is a veritable walk in the park compared with learning to make clinical decisions that are accurate, that reflect a wary understanding of potentially life-threatening disease processes, and that are made in an environment full of distractions and interruptions. The reality of the Emergency Department is that not everybody is sick, but every patient could be sick. Finding the sick patients among the non-sick is far more challenging than it may appear, and the diagnostic process is far more fraught with potential sources of error than one would like.
There are a host of resources out there that explore the complexities of clinical decision making in the Emergency Department.
Understanding the common biases that are present in our clinical decision making is an important first step in detecting and countering their potentially negative effects. Below is a list of common biases that are encountered in the Emergency Department. Definitions are adapted from Croskerry (2002).
+ Anchoring
Anchoring is the tendency to fixate on specific features of a presentation too early in the diagnostic process, and to base the likelihood of a particular event on information available at the outset (i.e., the first impression gained on first exposure, the initial approximate judgement). This may often be an effective strategy. However, this initial impression exerts an overly powerful effect in some people and they fail to adjust it sufficiently in the light of later information.
+ Premature Closure
Physicians typically generate several diagnoses early in their encounter with a clinical problem. Premature closure occurs when one of these diagnoses is accepted before it has been fully verified. The tendency to apply closure to the problem-solving process can result from vivid presenting features that may be convincing for a particular diagnosis, or by anchoring on to salient features early in the presentation.
+ Representativeness Restraint
Representativeness underlies the medical maxim: if it looks like a duck, walks like a duck, and quacks like a duck, it is a duck. The patient’s signs and symptoms are matched against a physician’s mental templates for their representativeness. Representativeness restraint is the error that occurs when the physician fails to consider a particular diagnosis because the patient is not sufficiently representative of the class.
+ Search Satisficing
Search satisficing is the tendency to call off a search once something is found. This includes calling off the search for an alternative diagnosis and calling off the search for additional diagnoses.
+ Sutton’s Slip
Sutton’s law is a clinical law based on the diagnostic strategy of “going for where the money is.” Sutton’s law is also characterized by Occam’s razor, the principle of parsimony in philosophy and psychology, and by the popular acronym KISS (keep it simple, stupid). Applications of Sutton’s law, Occam’s razor, and KISS may often be successful and may avoid costly, time-delaying diagnostic tests. However, whenever they are used there should be an awareness of the associated pitfalls. Sutton’s slip is the error that occurs when possibilities other than the obvious are not given sufficient consideration.
+ Triage-Cueing
Triage-cueing is a phenomenon mostly restricted to the ED, but it potentially exists wherever there is a triage process between the patient and the point of care at which definitive assessment is made. Thus, the geographical disposition of patients from triage predetermines how they may be seen and diagnosed by the physician and, importantly, how their destiny might unfold.
+ Unpacking Principle
The judged likelihood of a particular event or possibility increases when a more detailed or explicit description is available. The more specific the description we receive, the more likely we judge an event to be. If all the various possibilities in a problem space are not specified (are not unpacked), we have a tendency to ignore them. Unpacking is a strategy to improve the mental availability of all possibilities and/or events. Not surprisingly, alternate descriptions of the same event, situation, or possibility may lead to different judgements about their likelihood of occurrence. Failing to unpack all relevant possibilities may result in delayed or missed diagnoses.
+ Vertical Line Failure
Much of our cognitive activity in the ED is straightforward. Many problems encountered require a vertical, straightforward approach that leads to a clear diagnosis and management. This approach emphasizes economy, efficacy, and utility, and is invariably rewarded. Thus, the presentation of a patient with flank pain, nausea, and hematuria will inevitably lead to a presumptive diagnosis of ureteral colic. However, this orthodox approach is so well reinforced that it may become ingrained and lead to reduced flexibility in those situations that require some lateral thinking. Rigidity and inflexibility in the approach to clinical problems may lead to important diagnoses being delayed or missed.
+ Playing the Odds
Playing the odds refers to the process by which the physician, consciously or otherwise, decides that the patient does not have the disease on the basis of an odds judgment; i.e., the decision has been primarily determined by inductive thinking (the physician’s perception of the odds), rather than by objective evidence that has ruled out the disease.
+ Omission Bias
Omission bias is the tendency toward inaction, or reluctance to treat. Inaction is preferred over action through fear of being held directly responsible for the outcome.
+ Confirmation Bias
Confirmation bias is reflected in a tendency to look for confirming evidence to support the hypothesis, rather than look for disconfirming evidence to refute it. In difficult cases, confirming evidence feels good, whereas disconfirming evidence undermines the hypothesis and means that the thinking process may need to be restarted.
+ Overconfidence Bias
In general, we usually think we know more than we do, often without having gathered sufficient information, and generally place too much faith in our own opinions. Those who are overconfident tend to spend insufficient time accumulating evidence and synthesizing it before action. They are more inclined to act on incomplete information and hunches. When overconfident people believe that their involvement might have a significant impact on outcomes (whether it actually does or not), they tend to believe strongly that the outcome will be positive. Thus, they disproportionately value their contribution.
+ Psych-Out Error
A psych-out error is any error involving psychiatric patients, who appear to be particularly vulnerable to the errors in this list; notably, serious medical conditions may be overlooked when symptoms are attributed to a psychiatric cause.
+ Diagnosis Momentum
Diagnosis momentum refers to the tendency for a particular diagnosis to become established without adequate evidence. It has some similarities with, but differs from, premature closure (see above). Premature closure occurs when a physician adopts a particular diagnosis without adequate verification, whereas diagnosis momentum may involve several intermediaries including the patient. Typically, the process starts with an opinion, not necessarily a medical one, of what the source of the patient’s symptoms might be. As this is passed from person to person (e.g., friend or relative to patient to paramedic to nurse to physician), the diagnosis gathers momentum to the point that it may appear almost certain by the time the patient sees a physician.
Written by Jeffery Hill, MD MEd