The Psychology of Error

What do your patient care errors have in common with financial mistakes that may compromise your retirement? Both have their underpinnings in the psychological strategies and tendencies we call heuristics.

The word derives from the Greek heuriskein, “to discover,” but in the medical context we usually think of heuristics as mental shortcuts. They allow us to act quickly despite the bewildering complexity and uncertainty we encounter in the world, but they also lay the groundwork for disaster when they lead us astray. Let’s examine two mistakes and look at what they have in common: one that led to a drubbing in the stock market and one that cost a patient his life.

A Market Misadventure

During the height of the market boom, a young internist purchased shares of an exciting new biotech company poised at the forefront of tailored medical therapy based on genetic sequencing.

The stock nearly doubled, but as he rode the wild ride of the market’s fluctuations, it became evident that the overall trend had changed. Almost daily monitoring of press releases from the dynamic CEO reinforced his decision to hold the stock even after a dizzying drop turned a strong gain into a significant loss. Finally, after waiting months for the ticker to nudge back up to his entry point, he was glumly forced to face the loss.

The field of behavioral finance suggests humans are subject to cognitive predispositions that lead to predictable errors. The first heuristic failure demonstrated by the unfortunate internist in our example is anchoring.

The initial impression of the company’s value, or the particular price at which he purchased the stock, had significance to him but became completely irrelevant to the company’s value once events and profit prospects changed. Thus, when new information about the company came to light, the focus should have been exclusively on the future valuation, without regard to the past. That didn’t happen in this case. Our hapless investor had become anchored to the original price and refused to sell as the stock plummeted, in the vain hope that it would rise again despite the absence of evidence that this was likely.
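
A toy calculation makes the anchoring failure concrete. In this Python sketch, all prices and probabilities are invented for illustration; the point is simply that a rational hold-or-sell decision depends on expected future value alone, so the purchase price never appears in the arithmetic.

# Hypothetical numbers only; not investment advice.
purchase_price = 40.00      # the anchor: what the investor paid
current_price = 22.00       # what the stock trades at after the bad news

# Invented outlook: 30% chance of recovery to $30, 70% chance of sliding to $10.
expected_future_value = 0.30 * 30.00 + 0.70 * 10.00   # = 16.00

decision = "sell" if expected_future_value < current_price else "hold"
print(f"expected future value ${expected_future_value:.2f} -> {decision}")
# Note that purchase_price is never used: $40 is an anchor, not an input.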

Anchoring bias affects all of us and is as true in medicine as it is in the markets. The first diagnosis that seems likely as we hear a case described can be surprisingly hard to shake, even when the facts on the ground have changed.

A second human tendency we see leading to both financial and medical calamity is the desire to be right. A strong self-image (and many physicians have a strong one, indeed) is bolstered by seeking information that confirms prior beliefs.

Unfortunately, this confirmation bias can also cause us to overvalue positive press about a company we are invested in and to discount, or not read at all, information that might change our minds. Back in the clinical environment, examples abound in which a physician becomes fixed on a diagnosis and orders tests designed to confirm the initial impression but fails to explore alternatives. The more invested in a diagnosis we become, the more selectively we tend to seek and interpret data to reinforce our convictions.

Key Points

  • Heuristics are ubiquitous and help us function despite the bewildering complexity and ambiguity in medicine.
  • Heuristics function as shortcuts that serve us well most of the time but lead us astray in predictable circumstances.
  • Cognitive forcing strategies help to guard against heuristic failures. Examples include deliberate use of the differential diagnosis and including diagnostic uncertainty as part of checkout at transitions.
  • Meta-cognition is the process of conscious attention to our own decision-making. A moment spent to reflect on how you came to a diagnosis may be time well spent.

The Bottom Line

Human psychology creates predictable tendencies to error. Awareness of the particular cognitive traps that befall the hospitalist allows the clinician to guard against being led astray.

Higher Stakes

Years later and hundreds of miles away, a nocturnist gets a call from the emergency department (ED) about the seventh new admission of the night.

“I’ve got another rule-out myocardial infarction (MI) for you,” said the ED physician, who briefly relayed his assessment that the patient was low risk, with negative cardiac enzymes, chest X-ray, and electrocardiogram.

The nocturnist noted the atypical severity of the pain, a systolic blood pressure of more than 200 mm Hg, and a history of cocaine use. But this did not alter the plan as the patient was passed from the ED physician to the nocturnist and then to the hospitalist who assumed care the next morning. Unfortunately, it took a severe increase in tearing pain radiating to the patient’s back during an exercise stress test to prompt the discovery of his ascending aortic dissection. The patient died on the operating room table, leaving all three physicians wondering how they could have missed a diagnosis that seemed so obvious in retrospect.

Present the same clinical scenario at grand rounds and even third-year medical students could tell you dissection should have been considered. How did three smart, experienced physicians all make the same fatal mistake?

This case demonstrates a number of heuristic failures. Availability bias is a form of pattern recognition; it arises from our habit of perceiving things we see often as more likely than things we have not seen or thought about recently. Hoofbeats in Kentucky, as they say, are usually not a herd of zebras. ED physicians see what at times seem like hordes of patients with low-risk chest pain, the vast majority of whom lack a life-threatening etiology. Thus, we can become complacent in assuming that the next admission for chest pain reflects the same cause as the seven before.

Pattern recognition serves a vital role. Most expert physicians rely on it more than on classic deductive reasoning, and far more than on formal Bayesian analysis. Casino operators exploit our tendency to see false patterns, to their profit, by installing displays that show the last 10 to 20 results above the roulette table. However, just as each turn of the roulette wheel is not influenced by prior spins, each patient is unique. One must beware the misleading power of the availability bias.
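
The paragraph above name-checks Bayesian analysis, so a minimal worked example may help. The Python sketch below uses entirely hypothetical probabilities to show how a single strong finding (tearing pain radiating to the back) can pull a rare diagnosis such as dissection from negligible to front-of-mind.

# A minimal Bayesian update; every probability here is invented for illustration.
prior_dissection = 0.003              # hypothetical base rate among chest-pain admissions
p_finding_given_dissection = 0.60     # hypothetical rate of the finding in dissection
p_finding_given_other = 0.005         # hypothetical rate of the finding in other etiologies

# Bayes' theorem: P(dx | finding) = P(finding | dx) * P(dx) / P(finding)
numerator = p_finding_given_dissection * prior_dissection
evidence = numerator + p_finding_given_other * (1 - prior_dissection)
posterior = numerator / evidence
print(f"P(dissection | finding) = {posterior:.2f}")   # about 0.27 with these inputs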

Once the initial misdiagnosis had been made, anchoring bias and confirmation bias continued the cascade of events, turning a temporary error into a disaster. The phrase “chest pain rule-out MI” not only encourages the physician to minimize the potential severity of the symptom via the framing effect but also telegraphs the anchoring phenomenon by fixing on a single disease for a symptom whose etiologies are legion.

However, even accepting that the initial diagnosis by the ED doctor was influenced by the availability bias, why was this not corrected by the nocturnist or by the hospitalist on the next day? The answer lies in diagnosis momentum.

No physician evaluates the patient in isolation; each tends to incorporate the prior clinician’s assessment into his or her own decision-making. The more people who have seen the patient and agreed with the diagnosis, the higher the mental hurdle becomes to disagree and take the work-up in a different direction.
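
Diagnosis momentum resembles the information cascades described in behavioral economics. The toy Python simulation below (all parameters invented) illustrates the mechanism: the more prior clinicians have endorsed the working label, the more likely a privately doubting colleague is to conform rather than redirect the work-up.

import random

def cascade(n_physicians, p_private_doubt=0.3, peer_weight=0.6):
    """Simulate successive clinicians endorsing (True) or questioning (False)
    a working diagnosis. A doubting clinician conforms with a probability
    proportional to the fraction of prior endorsements."""
    endorsements = []
    for _ in range(n_physicians):
        doubts = random.random() < p_private_doubt
        if doubts and endorsements:
            conform_prob = peer_weight * sum(endorsements) / len(endorsements)
            agrees = random.random() < conform_prob
        else:
            agrees = not doubts
        endorsements.append(agrees)
    return endorsements

random.seed(0)
print(cascade(5))  # dissent grows rarer as endorsements accumulate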

What You Can Do

Does the mere existence of these many heuristics condemn the physician to a career of repeating potentially fatal errors? The obvious answer is no, but avoiding these mistakes requires a concerted effort on the part of the physician.

Step one is to recognize that many heuristics are essentially abbreviations of full conscious reasoning. Consider a physician who is tired, stressed, or inundated with multiple tasks. In an effort to organize the seemingly chaotic world of medicine, the mind seeks a crutch. These mental shortcuts allow us to process massive amounts of information quickly and come up with a reasonable plan that will be right most of the time.

When rushed, stressed, or distracted, we are most prone to use these shortcuts. These times of pressure are exactly when it is most important to pause and consider whether we are acting on gut feeling or on full consideration of the evidence. Awareness of the predictable circumstances that set up heuristic failures allows for a moment of reflection that can prevent a fall into one of these psychological traps. This process of deliberately examining our own decision-making is referred to as meta-cognition.

An additional familiar tool available to the physician is the differential diagnosis. It is essentially a cognitive forcing strategy designed to guard against the availability and anchoring biases. By deliberately creating a list of alternative possibilities, we become less prone to anchor on a single diagnosis.

By briefly reviewing the rare possibilities we have not seen recently and bringing them to the forefront of memory, we diminish the power of the availability bias. Spending a second or two considering the differential—even in seemingly routine cases—will loosen the hold of these particular heuristics.

Hospitalists, by the nature of our practice, tend to have multiple transitions in patient care. At times this offers a fresh perspective to correct mistakes, but it also offers the potential to compound them via diagnosis momentum.

We habitually convey diagnoses and treatment plans to our partners at handoffs. Including a level of uncertainty as part of checkout would cue the accepting physician and decrease the risk of this heuristic failure. One might imagine that the patient in the case above would have had a greater probability of survival had the nocturnist conveyed a diagnosis of “chest pain of uncertain etiology” to his partner rather than “chest pain rule-out MI.”
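
One concrete way to build that cue into checkout is to make uncertainty a required field of the handoff itself. The Python sketch below is purely illustrative; the record structure and category names are invented, not drawn from any published handoff standard.

from dataclasses import dataclass, field

@dataclass
class HandoffNote:
    working_diagnosis: str
    certainty: str                 # e.g. "confirmed", "probable", "uncertain"
    differential: list[str] = field(default_factory=list)
    open_questions: list[str] = field(default_factory=list)

note = HandoffNote(
    working_diagnosis="chest pain of uncertain etiology",
    certainty="uncertain",
    differential=["acute coronary syndrome", "aortic dissection",
                  "cocaine-induced vasospasm"],
    open_questions=["pain severity atypical for ACS",
                    "systolic BP > 200 mm Hg unexplained"],
)
print(note.certainty, "|", ", ".join(note.differential))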

As illustrated by the cases above, heuristics are not mistakes in and of themselves. They are the assumptions and pattern-recognition techniques that serve us well the majority of the time, in and out of medicine. Recognizing when you take one of these mental shortcuts, being aware of the circumstances that predispose to error, and evaluating your own decision-making process allow the astute physician to guard against the times when heuristics fail. Greater self-awareness of the process of your own cognition can make for a better clinician—and perhaps even make you a better investor. TH

Drs. Cumbler and Trosterman are assistant professors in the Section of Hospital Medicine at the University of Colorado.

Issue: The Hospitalist - 2007(11)