Crozet Annals of Medicine: Thinking about Thinking


We faculty don’t expect much from the new interns. They start in July after just having graduated from medical school. Almost everything in their job description is new to them. They have never had to make medical decisions or do complex procedures, never had to create and maintain a medical chart, never had to break bad news to a patient or family. So, we do not expect much from them this early in the year. We encourage them to go slowly and carefully. We do not rely on them to see many patients or care for very sick patients by themselves. 

That places a burden on the new second-year residents, who are expected to step up from the end of their own internship year and start “moving the meat,” as the colloquial phrase goes: seeing, treating, and dispositioning more patients more quickly. This they are usually able to do because we supervise them fairly closely. They never really have to make the final medical decisions alone. By the end of the year they are pretty good at moving the meat, and they get pretty cocky.

It is the new third-year residents who face the greatest challenge in this transition month. At the end of this year they will be expected to make medical decisions independently, and so they have to start doing this now. They try, but this early in the year they are prone to make a lot of errors and it rattles them. The cockiness disappears, replaced by humility and earnest striving.  

As I watch the new third-years struggle with their mistakes, I try to analyze where they go astray. It is rarely a lack of medical knowledge. Some of them seem to know more stuff than I do.  It is in applying their vast stores of medical facts to actual cases that they can get mixed up. I try to identify the cognitive biases that influenced their thinking and led them from the right facts to the wrong conclusion. In short, I try to teach them how doctors think, to quote Jerome Groopman, MD, a leader in this field. 

Doctors rely on heuristics when making complex medical decisions. Heuristics are mental shortcuts that allow rapid decision making when faced with large amounts of uncertainty. Stereotyping people based on appearances is a type of heuristic that most people are familiar with. While it allows for rapid judgments, it is also prone to error. 

Because heuristics trade thoroughness for speed, they leave doctors open to cognitive biases and errors. I thought it might be instructive to examine some of the most common cognitive biases that lead doctors astray, because they are common in non-medical fields as well.

“Anchoring” is the most common cognitive bias. Anchoring means fixating on the first piece of information or first impression a person is exposed to. When buying a car, the initial sales price is the anchor: anything less than that may be perceived as a very good deal, even if it is not. Anchoring is worsened by “confirmation bias,” in which a person focuses only on the evidence that supports a decision and ignores the evidence that does not.

Residents anchor all the time and for very good reasons. Most residents read their patients’ charts extensively before they see them. They read past medical notes and current nurses’ notes all before meeting the patients and before asking them what is wrong. This usually saves some time and helps guide the discussion, but obviously can lead to anchoring. 

I watched a resident this week very carefully and thoroughly work up a middle-aged patient for her chief complaint of headache. He first found out in her past medical records that she had seen several doctors for this headache and their conclusion was that she had a migraine but needed a CAT scan to be sure it wasn’t something else.

In the ER the resident did a thorough neurologic exam, ordered a CAT scan of her head and gave her a “migraine cocktail” of medicines specific for her headache. She seemed to be doing well and her head CT was negative. This confirmed the resident’s anchor that her headache was a migraine. 

She had also had a little chest pressure, but she hadn’t mentioned it prominently to the resident and he hadn’t explored it. Fortunately, the triage nurse, who hadn’t read up on the patient and so had to ask a million questions, had elicited the chest pressure and ordered a cardiac enzyme blood test. It was positive. She was having a heart attack. 

Heart disease can present very differently in middle-aged women, and I doubt this resident will ever overlook it again. The resident had anchored too early on the diagnosis of migraine, focused on the evidence that supported his diagnosis, such as the negative CAT scan, and ignored any evidence that did not fit, such as her chest pressure. This is confirmation bias.

I almost never read anything about the patients before I see them for this very reason, but I, too, have anchoring bias. My anchor is my strong initial first impression of what is wrong with the patient. I ask a few questions to exclude other things, but I am often anchored. My antidote to my myopia is to try to listen to the resident’s, and even more so the medical student’s, long and fanciful list of possible diagnoses and consider each one at least a little.

There are numerous other cognitive biases that I am working on recognizing, but as a patient you can use the lessons from these two common cognitive errors to guide your interactions with your doctors. When given a diagnosis, especially a serious one, ask your doctor, “What else could this be?” (anchoring bias) and “What supports this diagnosis, and what doesn’t fit?” (confirmation bias). Help us help you.
