a compliment to lj

Monday, January 29, 2007

How Doctors Are Taught to Think

WHAT'S THE TROUBLE?
by JEROME GROOPMAN
How doctors think.
Issue of 2007-01-29
Posted 2007-01-22
On a spring afternoon several years ago, Evan McKinley was hiking in the woods near Halifax, Nova Scotia, when he felt a sharp pain in his chest. McKinley (a pseudonym) was a forest ranger in his early forties, trim and extremely fit. He had felt discomfort in his chest for several days, but this was more severe: it hurt each time he took a breath. McKinley slowly made his way through the woods to a shed that housed his office, where he sat and waited for the pain to pass. He frequently carried heavy packs on his back and was used to muscle aches, but this pain felt different. He decided to see a doctor.
Pat Croskerry was the physician in charge in the emergency room at Dartmouth General Hospital, near Halifax, that day. He listened intently as McKinley described his symptoms. He noted that McKinley was a muscular man; that his face was ruddy, as would be expected of someone who spent most of his day outdoors; and that he was not sweating. (Perspiration can be a sign of cardiac distress.) McKinley told him that the pain was in the center of his chest, and that it had not spread into his arms, neck, or back. He told Croskerry that he had never smoked or been overweight; had no family history of heart attack, stroke, or diabetes; and was under no particular stress. His family life was fine, McKinley said, and he loved his job.
Croskerry checked McKinley's blood pressure, which was normal, and his pulse, which was sixty and regular—typical for an athletic man. Croskerry listened to McKinley's lungs and heart, but detected no abnormalities. When he pressed on the spot between McKinley's ribs and breastbone, McKinley felt no pain. There was no swelling or tenderness in his calves or thighs. Finally, the doctor ordered an electrocardiogram, a chest X-ray, and blood tests to measure McKinley's cardiac enzymes. (Abnormal levels of cardiac enzymes indicate damage to the heart.) As Croskerry expected, the results of all the tests were normal. "I'm not at all worried about your chest pain," Croskerry told McKinley, before sending him home. "You probably overexerted yourself in the field and strained a muscle. My suspicion that this is coming from your heart is about zero."
Early the next evening, when Croskerry arrived at the emergency room to begin his shift, a colleague greeted him. "Very interesting case, that man you saw yesterday," the doctor said. "He came in this morning with an acute myocardial infarction." Croskerry was shocked. The colleague tried to console him. "If I had seen this guy, I wouldn't have gone as far as you did in ordering all those tests," he said. But Croskerry knew that he had made an error that could have cost the ranger his life. (McKinley survived.) "Clearly, I missed it," Croskerry told me, referring to McKinley's heart attack. "And why did I miss it? I didn't miss it because of any egregious behavior, or negligence. I missed it because my thinking was overly influenced by how healthy this man looked, and the absence of risk factors."
Croskerry, who is sixty-four years old, began his career as an experimental psychologist, studying rats' brains in the laboratory. In 1979, he decided to become a doctor, and, as a medical student, he was surprised at how little attention was paid to what he calls the "cognitive dimension" of clinical decision-making—the process by which doctors interpret their patients' symptoms and weigh test results in order to arrive at a diagnosis and a plan of treatment. Students spent the first two years of medical school memorizing facts about physiology, pharmacology, and pathology; they spent the last two learning practical applications for this knowledge, such as how to decipher an EKG and how to determine the appropriate dose of insulin for a diabetic. Croskerry's instructors rarely bothered to describe the mental logic they relied on to make a correct diagnosis and avoid mistakes.
In 1990, Croskerry became the head of the emergency department at Dartmouth General Hospital, and was struck by the number of errors made by doctors under his supervision. He kept lists of the errors, trying to group them into categories, and, in the mid-nineties, he began to publish articles in medical journals, borrowing insights from cognitive psychology to explain how doctors make clinical decisions—especially flawed ones—under the stressful conditions of the emergency room. "Emergency physicians are required to make an unusually high number of decisions in the course of their work," he wrote in "Achieving Quality in Clinical Decision Making: Cognitive Strategies and Detection of Bias," an article published in Academic Emergency Medicine, in 2002. These doctors' decisions necessarily entail a great deal of uncertainty, Croskerry wrote, since, "for the most part, patients are not known and their illnesses are seen through only small windows of focus and time." By calling physicians' attention to common mistakes in medical judgment, he has helped to promote an emerging field in medicine: the study of how doctors think.
There are limited data about the frequency of misdiagnoses. Research from the nineteen-eighties and nineties suggests that they occur in about fifteen per cent of cases, but Croskerry suspects that the rate is significantly higher. He believes that many misdiagnoses are the result of readily identifiable—and often preventable—errors in thinking.
Doctors typically begin to diagnose patients the moment they meet them. Even before they conduct an examination, they are interpreting a patient's appearance: his complexion, the tilt of his head, the movements of his eyes and mouth, the way he sits or stands up, the sound of his breathing. Doctors' theories about what is wrong continue to evolve as they listen to the patient's heart, or press on his liver. But research shows that most physicians already have in mind two or three possible diagnoses within minutes of meeting a patient, and that they tend to develop their hunches from very incomplete information. To make diagnoses, most doctors rely on shortcuts and rules of thumb—known in psychology as "heuristics."
Heuristics are indispensable in medicine; physicians, particularly in emergency rooms, must often make quick judgments about how to treat a patient, on the basis of a few, potentially serious symptoms. A doctor is trained to assume, for example, that a patient suffering from a high fever and sharp pain in the lower right side of the abdomen could be suffering from appendicitis; he immediately sends the patient for X-rays and contacts the surgeon on call. But, just as heuristics can help doctors save lives, they can also lead them to make grave errors. In retrospect, Croskerry realized that when he saw McKinley in the emergency room the ranger had been experiencing unstable angina—a surge of chest pain that is caused by coronary-artery disease and that may precede a heart attack. "The unstable angina didn't show on the EKG, because fifty per cent of such cases don't," Croskerry said. "His unstable angina didn't show up on the cardiac-enzymes test, because there had been no damage to his heart muscle yet. And it didn't show up on the chest X-ray, because the heart had not yet begun to fail, so there was no fluid backed up in the lungs."
The mistake that Croskerry made is called a "representativeness" error. Doctors make such errors when their thinking is overly influenced by what is typically true; they fail to consider possibilities that contradict their mental templates of a disease, and thus attribute symptoms to the wrong cause. Croskerry told me that he had immediately noticed the ranger's trim frame: most fit men in their forties are unlikely to be suffering from heart disease. Moreover, McKinley's pain was not typical of coronary-artery disease, and the results of the physical examination and the blood tests did not suggest a heart problem. But, Croskerry emphasized, this was precisely the point: "You have to be prepared in your mind for the atypical and not be too quick to reassure yourself, and your patient, that everything is O.K." (Croskerry could have kept McKinley under observation and done a second cardiac-enzyme test or had him take a cardiac stress test, which might have revealed the source of his chest pain.) When Croskerry teaches students and interns about representativeness errors, he cites Evan McKinley as an example.
Doctors can also make mistakes when their judgments about a patient are unconsciously influenced by the symptoms and illnesses of patients they have just seen. Many common infections tend to occur in epidemics, afflicting large numbers of people in a single community at the same time; after a doctor sees six patients with, say, the flu, it is common to assume that the seventh patient who complains of similar symptoms is suffering from the same disease.
Harrison Alter, an emergency-room physician, recently confronted this problem. At the time, Alter was working in the emergency room of a hospital in Tuba City, Arizona, which is situated on a Navajo reservation. In a three-week period, dozens of people had come to his hospital suffering from viral pneumonia. One day, Blanche Begaye (a pseudonym), a Navajo woman in her sixties, arrived at the emergency room complaining that she was having trouble breathing. Begaye was a compact woman with long gray hair that she wore in a bun. She told Alter that she had begun to feel unwell a few days earlier. At first, she said, she had thought that she had a bad head cold, so she had drunk orange juice and tea, and taken a few aspirin. But her symptoms had got worse. Alter noted that she had a fever of 100.2 degrees, and that she was breathing rapidly—at almost twice the normal rate. He listened to her lungs but heard none of the harsh sounds, called rhonchi, that suggest an accumulation of mucus. A chest X-ray showed that Begaye's lungs did not have the white streaks typical of viral pneumonia, and her white-blood-cell count was not elevated, as would be expected if she had the illness.
However, a blood test to measure her electrolytes revealed that her blood had become slightly acidic, which can occur in the case of a major infection. Alter told Begaye that he thought she had "subclinical pneumonia." She was in the early stages of the infection, he said; the virus had not yet affected her lungs in a way that would show up on a chest X-ray. He ordered her to be admitted to the hospital and given intravenous fluids and medicine to bring her fever down. Viral pneumonia can tax an older person's heart and sometimes cause it to fail, he told her, so it was prudent that she remain under observation by doctors. Alter referred Begaye to the care of an internist on duty and began to examine another patient.
A few minutes later, the internist approached Alter and took him aside. "That's not a case of viral pneumonia," the doctor said. "She has aspirin toxicity."
Immediately, Alter knew that the internist was right. Aspirin toxicity occurs when patients overdose on the drug, causing hyperventilation and the accumulation of lactic acid and other acids in the blood. "Aspirin poisoning—bread-and-butter toxicology," Alter told me. "This was something that was drilled into me throughout my training. She was an absolutely classic case—the rapid breathing, the shift in her blood electrolytes—and I missed it. I got cavalier."
Alter's misdiagnosis resulted from the use of a heuristic called "availability," which refers to the tendency to judge the likelihood of an event by the ease with which relevant examples come to mind. This tendency was first described in 1973, in a paper by Amos Tversky and Daniel Kahneman, psychologists at the Hebrew University of Jerusalem. For example, a businessman may estimate the likelihood that a given venture could fail by recalling difficulties that his associates had encountered in the marketplace, rather than by relying on all the data available to him about the venture; the experiences most familiar to him can bias his assessment of the chances for success. (Kahneman won the Nobel Prize in Economics in 2002, for his research on decision-making under conditions of uncertainty.)
The diagnosis of subclinical pneumonia was readily available to Alter, because he had recently seen so many cases of the infection. Rather than try to integrate all the information he had about Begaye's illness, he had focussed on the symptoms that she shared with other patients he had seen: her fever, her rapid breathing, and the acidity of her blood. He dismissed the data that contradicted his diagnosis—the absence of rhonchi and of white streaks on the chest X-ray, and the normal white-blood-cell count—as evidence that the infection was at an early stage. In fact, this information should have made him doubt his hypothesis. (Psychologists call this kind of cognitive cherry-picking "confirmation bias": confirming what you expect to find by selectively accepting or ignoring information.)
After the internist made the correct diagnosis, Alter recalled his conversation with Begaye. When he had asked whether she had taken any medication, including over-the-counter drugs, she had replied, "A few aspirin." As Alter told me, "I didn't define with her what 'a few' meant." It turned out to be several dozen.
Representativeness and availability errors are intellectual mistakes, but the errors that doctors make because of their feelings for a patient can be just as significant. We all want to believe that our physician likes us and is moved by our plight. Doctors, in turn, are encouraged to develop positive feelings for their patients; caring is generally held to be the cornerstone of humanistic medicine. Sometimes, however, a doctor's impulse to protect a patient he likes or admires can adversely affect his judgment.
In 1979, I treated Brad Miller (a pseudonym), a young literature instructor who was suffering from bone cancer. I was living in Los Angeles at the time, completing a fellowship in hematology and oncology at the U.C.L.A. Medical Center. "You look familiar," Brad said to me when I introduced myself to him in his hospital room as the doctor who would be overseeing his care. "I see you running with two or three friends around the university," he said. "I'm a runner, too—or, at least, was."
I told Brad that I hoped he would be able to run again soon, though I warned him that his chemotherapy treatment would be difficult.
About six weeks earlier, Brad had noticed an ache in his left knee. He had been training to run in a marathon, and at first he thought that the ache was caused by a sore muscle. He saw a specialist in sports medicine, who examined the leg and recommended that he wear a knee brace when he ran. Brad followed this advice, but the ache got worse. The physician ordered an X-ray, which showed an osteosarcoma, a cancerous growth, around the end of the femur, just above the knee.
Several years earlier, the surgical-oncology department at U.C.L.A. had devised an experimental treatment for this kind of sarcoma, involving a new chemotherapy drug called Adriamycin. Oncologists had nicknamed Adriamycin "the red death," because of its cranberry color and its toxicity. Not only did it cause severe nausea, vomiting, mouth blisters, and reduced blood counts; repeated doses could injure cardiac muscle and lead to heart failure. Patients had to be monitored closely, since once the heart is damaged there is no good way to restore its pumping capacity. Still, doctors at U.C.L.A. had found that giving patients multiple doses of Adriamycin often shrank tumors, allowing them to surgically remove the cancer without amputating the affected limb—the standard approach in the past.
I began administering the treatment that afternoon. Despite taking Compazine to stave off vomiting, Brad was acutely nauseated. After several doses of chemotherapy, his white-blood-cell count dropped precipitately. Because his immune system was weakened, he was at great risk of contracting an infection. I required visitors to Brad's room to wear a mask, a gown, and gloves, and instructed the nurses not to give him raw food, in order to limit his exposure to bacteria.
"Not to your taste," I said at the end of the first week of treatment, seeing an untouched meal on his tray.
"My mouth hurts," Brad whispered. "And, even if I could chew, it looks pretty tasteless."
I agreed that the food looked dismal.
"What is to your taste?" I asked. "Fried kidney?"
I had told Brad when we met that I had studied "Ulysses" in college, in a freshman seminar. The professor had explained the relevant Irish history, the subtle references to Catholic liturgy, and a number of other allusions that most of us in the class would otherwise not have grasped. I had enjoyed Joyce's descriptions of Leopold Bloom eating fried kidneys.


Brad was my favorite patient on the ward. Each morning when I made rounds with the residents and the medical students, I would take an inventory of his symptoms and review his laboratory results. I would often linger a few moments in his room, trying to distract him from the misery of his therapy by talking about literature.
The treatment called for a CAT scan after the third cycle of Adriamycin. If the cancer had shrunk sufficiently, the surgery would proceed. If it hadn't, or if the cancer had grown despite the chemotherapy, then there was little to be done short of amputation. Even after amputation, patients with osteosarcomas are at risk of a recurrence.
One morning, Brad developed a low-grade fever. During rounds, the residents told me that they had taken blood and urine cultures and that Brad's physical examination was "nonfocal"—they had found no obvious reason for the fever. Patients often get low fevers during chemotherapy after their white-blood-cell count falls; if the fever has no identifiable cause, the doctor must decide whether and when to administer a course of antibiotics.
"So you feel even more wiped out?" I asked Brad.
He nodded. I asked him about various symptoms that could help me determine what was causing the fever. Did he have a headache? Difficulty seeing? Pressure in his sinuses? A sore throat? Problems breathing? Pain in his abdomen? Diarrhea? Burning on urination? He shook his head.
Two residents helped prop Brad up in bed so that I could examine him; I had a routine that I followed with each immune-deficient patient, beginning at the crown of the head and working down to the tips of the toes. Brad's hair was matted with sweat, and his face was ashen. I peered into his eyes, ears, nose, and throat, and found only some small ulcers on his inner cheeks and under his tongue—side effects of his treatment. His lungs were clear, and his heart sounds were strong. His abdomen was soft, and there was no tenderness over his bladder.
"Enough for today," I said. Brad looked exhausted; it seemed wise to let him rest.


Later that day, I was in the hematology lab, looking at blood cells from a patient with leukemia, when my beeper went off. "Brad Miller has no blood pressure," the resident told me when I returned the call. "His temperature is up to a hundred and four, and we're moving him to the I.C.U."
Brad was in septic shock. When bacteria spread through the bloodstream, they can damage the circulation. Septic shock can be fatal even in people who are otherwise healthy; patients with impaired immunity, like Brad, whose white-blood-cell count had fallen because of chemotherapy, are at particular risk of dying.
"Do we have a source of infection?" I asked.
"He has what looks like an abscess on his left buttock," the resident said.
Patients who lack enough white blood cells to fight bacteria are prone to infections at sites that are routinely soiled, like the area between the buttocks. The abscess must have been there when I examined Brad. But I had failed to ask him to roll over so that I could inspect his buttocks and rectal area.
The resident told me that he had repeated Brad's cultures and started him on broad-spectrum antibiotics, and that the I.C.U. team was about to take over.
I was furious with myself. Because I liked Brad, I hadn't wanted to add to his discomfort and had cut the examination short. Perhaps I hoped unconsciously that the cause of his fever was trivial and that I would not find evidence of an infection on his body. This tendency to make decisions based on what we wish were true is what Croskerry calls an "affective error." In medicine, this type of error can have potentially fatal consequences. In the case of Evan McKinley, for example, Pat Croskerry chose to rely on the ranger's initial test results—the normal EKG, chest X-ray, and blood tests—all of which suggested a benign diagnosis. He didn't arrange for follow-up testing that might have revealed the source of the ranger's chest pain. Croskerry, who had been an Olympic rower in his thirties, told me that McKinley had reminded him of himself as an athlete; he believed that this association contributed to his misdiagnosis.
As soon as I finished my work in the lab, I rushed to the I.C.U. to check on Brad. He was on a respirator and opened his eyes wide to signal hello. Through an intravenous line attached to one arm, he was receiving pressors, drugs that cause the heart to pump more effectively and increase the tone of the vessels to help maintain blood pressure. Brad's heart was holding up, despite all the Adriamycin he had taken. His platelet count had fallen, as often happens with septic shock, and he was receiving platelet transfusions. The senior doctor in the I.C.U. had told Brad's parents, who lived nearby, that he was extremely ill. I saw his parents sitting in a room next to the I.C.U., their heads bowed. They had not seen me, and I was tempted to avoid them. But I forced myself to speak to them and offered a few words of encouragement. They thanked me for my care of their son, which only made me feel worse.
The next morning, I arrived before the residents to review my patients' charts. Rounds lasted an hour longer than usual, as I insisted on double-checking each bit of information that the residents offered about the patients in our care.
Brad Miller survived. Slowly, his white-blood-cell count increased, and the infection was resolved. After he left the I.C.U., I told him that I should have examined him more thoroughly that morning, but I did not explain why I had not. A CAT scan showed that his sarcoma had shrunk enough for him to undergo surgery without amputation, but a large portion of his thigh muscle had to be removed along with the tumor. After he recovered, he was no longer able to run, but occasionally I saw him riding his bicycle on campus.


Medical education has not changed substantially since Pat Croskerry and I were trained. Students are still expected to assimilate large amounts of basic science and apply that knowledge as they are taught practical aspects of patient care. And young physicians still learn largely by observing more senior members of their field. ("See one, do one, teach one" remains a guiding maxim at medical schools.) This approach produces confident and able physicians. Yet the ideal it implies, of the doctor as a dispassionate and rational actor, is misguided. As Tversky and Kahneman and other cognitive psychologists have shown, when people are confronted with uncertainty—the situation of every doctor attempting to diagnose a patient—they are susceptible to unconscious emotions and personal biases, and are more likely to make cognitive errors. Croskerry believes that the first step toward incorporating an awareness of heuristics and their liabilities into medical practice is to recognize that how doctors think can affect their success as much as how much they know, or how much experience they have. "Currently, in medical training, we fail to recognize the importance of critical thinking and critical reasoning," Croskerry told me. "The implicit assumption in medicine is that we know how to think. But we don't."