Craig Bowron and Michael K. Cummings (2017) Heuristic Thinking Isn’t Heretical: How Should Doctors Think? Journal of the Minneapolis Heart Institute Foundation: Fall/Winter 2017, Vol. 1, No. 2, pp. 94–96.
Craig Bowron, MD, FACP; Michael K. Cummings, MD
Abbott Northwestern Hospitalist Service, Minneapolis, MN
Address for correspondence:
Craig Bowron, MD, FACP
800 E 28th St
Minneapolis, MN 55407
Doctors are intelligent people, but are we good thinkers, and how should we think? The evidence-based medicine movement harnessed the power of computers and the electronic medical record to champion a highly analytical approach to individual patient care. As an unintended consequence, the movement came to depict heuristic thinking—intuitive “rule of thumb” thinking—as a folksy and imprecise way of dealing with our patients. Yes, analytical thinking is deliberate, deductive, and rule-following, but it is not always as “logical” as we suppose it to be, and it is not intellectually superior to intuitive thinking. Rather, it is intellectually complementary to heuristic thinking, which allows experienced clinicians to collapse unmanageable complexity and uncertainty into categories that allow for pragmatic decisions. To provide our patients with the best possible care, we’ll have to overcome our blind faith in anything with a P value of less than 0.05, and our disdain for the idea of “acting on a hunch.”
Keywords: analytical thinking, heuristic thinking, heuristics, evidence-based medicine, randomized controlled trial
Doctors are intelligent people, but are we good thinkers? And how should we think?
There are two basic kinds of thinking: analytical and intuitive (and maybe good and bad, so that’s four). Within medicine, analytical thinking can perhaps be best exemplified in the evidence-based movement, which began in the early 1990s. It was a gilded age, full of promise, and bolstered by the reality that computers would give physicians instant access to the most thoroughly researched standards of care. Within our specialty of internal medicine, we watched the sacred texts of medical wisdom—Cecil, Harrison’s, and Scientific American—get leapfrogged by electronic medical resources like UpToDate.
“A new paradigm for medical practice is emerging. Evidence-based medicine de-emphasizes intuition, unsystematic clinical experience, and pathophysiologic rationale as sufficient grounds for clinical decision making and stresses the examination of evidence from clinical research.”1
The beauty and allure of the new paradigm—practicing evidence-based medicine—was its elegant simplicity: We’ll do what works, and we won’t do what doesn’t work.
This accent on data and analytical thinking was a boon to younger doctors, who were more computer savvy and had little to no clinical experience (intuitive thinking) to fall back on. With individual patients being morphed into data points and cohorts, computers also made it possible to measure a physician’s clinical outcomes. How much simpler could it get? Round patients would be put into the round hole; square ones into square holes. It had never been easier to be a young and inexperienced doctor. Older physicians who groused about these changes could be seen as backward Luddites, in love with a romantic and antiquated notion of the doctor-patient relationship. William Osler was dead.
But evidence-based medicine has its problems, including that it’s only as good as the evidence it’s based on, and much of that comes from randomized controlled trials (RCTs). Behold the RCT: the dispenser of all clinical knowledge, our most powerful tool in the quest to practice evidence-based medicine, and perfect in so many ways—and yet imperfect.
We study groups but we treat individuals. On the face of it, when an RCT concludes that a particular drug reduced cardiovascular mortality by 20%, we assume—most of us at least—that the majority of the people in the study benefited from taking the drug, and that most of them had a 20% reduction in mortality. But RCTs suffer from a heterogeneity of benefit: oftentimes a small minority of high-risk patients accounts for most of the positive outcomes seen in the trial. Many participants in the trial may have had no benefit. Some may have even been harmed.
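The arithmetic behind this heterogeneity is worth making concrete. The toy calculation below (every number is invented for illustration and comes from no actual trial) shows how a headline 20% relative risk reduction can emerge almost entirely from a small high-risk subgroup while the large low-risk majority gains almost nothing:

```python
# Hypothetical trial of 10,000 patients per arm (all numbers invented).
n = 10_000

# Control arm: 1,000 high-risk patients with 20% mortality,
# 9,000 low-risk patients with 1% mortality.
control_deaths = 1_000 * 0.20 + 9_000 * 0.01       # 290 deaths

# Treatment arm: the drug mainly helps the high-risk subgroup
# (mortality 20% -> 14.7%) and barely moves the low-risk group
# (1% -> 0.95%).
treated_deaths = 1_000 * 0.147 + 9_000 * 0.0095    # 232.5 deaths

relative_risk_reduction = 1 - treated_deaths / control_deaths
print(f"Overall relative risk reduction: {relative_risk_reduction:.0%}")
# The headline figure is ~20%, yet 90% of participants saw
# almost no benefit at all.
```

In this sketch the aggregate result is real, but nearly all of it is supplied by the 10% of participants at highest baseline risk, which is exactly the pattern the paragraph above describes.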
Take something as “simple” as sodium intake. Once we established that higher sodium intake was associated with a higher risk of hypertension and stroke, good old #11 on the periodic table became element non grata. But a more sophisticated view of the data eventually revealed that although some people are indeed quite sodium sensitive, most are insensitive, and some are quite sensitive to a lack of dietary sodium—which revs up their renin system and eggs on the very physiology we were trying to avoid—while their taste buds suffer.
And it isn’t just this heterogeneity of benefit that hobbles the RCT. A team at the University of Oxford’s Centre for Evidence-Based Medicine recently began monitoring clinical trials for switched outcomes.2 When the team reviewed 67 trials published in the top five medical journals in 2015 and 2016, it found 58 of them to be methodologically flawed. Of the trials reviewed, 40% had negative outcomes that the investigators chose not to report. On average, each trial quietly included five clinical outcomes that it was not designed to measure.
Of course, it would be ridiculous to suggest that we don’t need data or analytical thinking. But what makes the practice of medicine so unique is that we are attempting to apply objective science to our very human patients. And each patient is his or her own art form—both in terms of their unique physiology and their unique psychology (that is, how they express their illness).
And so, physicians must also rely on heuristic thinking—from the Greek word heuriskein, meaning “to discover.” A heuristic is a mental shortcut, a way of quickly and intuitively organizing disparate clues into something we can recognize and work with. If we are to embrace heuristic thinking, we’ll have to overcome our blind faith in anything with a P value of less than 0.05, and our disdain for the idea of “acting on a hunch.” Yes, analytical thinking is deliberate, deductive, and rule-following, but it is not always as “logical” as we suppose it to be, and it is not intellectually superior to intuitive thinking. Rather, it is intellectually complementary.
Heuristic thinking is not for dummies. It is an unconscious, context-sensitive, associative process that rapidly makes connections. It allows experienced clinicians to collapse unmanageable complexity and uncertainty into categories that allow for pragmatic decisions. It helps us conclude that something is wrong even before we know what that something is. It is a highly skilled way of thinking, and it takes practice.
Yes, heuristic thinking can be flawed. Psychologist Daniel Kahneman won a Nobel Prize in economics for his research into how humans think, and he has catalogued the various cognitive biases to which intuitive thinking is vulnerable.3 The “gambler’s fallacy” demonstrates our tendency to see independent events (“luck”) as streaky: we feel that five “heads” in a row means a “tails” is overdue, even though each coin toss is independent of the others. Diagnosing your first pheochromocytoma might lead you to think that you’ll never see one again, or it might lead you to overestimate the prevalence of the condition because, having finally seen a case, the diagnosis now seems more “real” and therefore more probable in your mind.
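The coin-toss version of the fallacy is easy to check numerically. This quick simulation (our own sketch, not drawn from Kahneman’s work) estimates the probability of heads immediately after a run of five heads; independence means it stays at 50%, no matter how long the streak:

```python
import random

# After a run of five heads, is tails "overdue"? Simulate many
# six-toss sequences and look at toss 6 whenever the first five
# were all heads.
random.seed(0)  # fixed seed so the sketch is reproducible

next_after_streak = []
for _ in range(200_000):
    tosses = [random.random() < 0.5 for _ in range(6)]
    if all(tosses[:5]):                  # first five were all heads
        next_after_streak.append(tosses[5])

p_heads = sum(next_after_streak) / len(next_after_streak)
print(f"P(heads after five heads) = {p_heads:.2f}")  # hovers near 0.50
```

The streak carries no memory; each toss is a fresh 50/50, which is precisely what makes the intuition of an “overdue” tails a bias rather than an inference.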
We all want to be good at what we do, and yet the pace of discovery in medical science continues to quicken. As writer/poet/farmer Wendell Berry pointed out, “The radii of knowledge have only pushed back—and enlarged—the circumference of our mystery.”4
The more we know, the more we realize how much we don’t know. And so as researcher Deepika Mohan, MD, noted in a recent Viewpoint article in JAMA, expert clinicians will have to have “… an unparalleled ability to parse complexity and sift through uncertainty.”5 To do that, we’ll have to embrace both rational/analytical and intuitive/associative thinking. We’ll need our computers and our brains.
“Gosh, it would be awful pleasin’, to reason out the reason, for things I can’t explain,” sang Dorothy’s friend and confidant, the Scarecrow. He wanted what we want: “I’d unravel every riddle, for any individ’al, in trouble or in pain.”
Craig Bowron is a hospitalist at Abbott Northwestern Hospital. Michael Cummings is an internist and codirector of Abbott Northwestern General Medicine Associates. His presentation “Critical thinking, heuristic wisdom and the practice of medicine” is available at the Minneapolis Heart Institute Foundation’s Cardiovascular Grand Rounds website.6
1. Evidence-Based Medicine Working Group. Evidence-based medicine: a new approach to teaching the practice of medicine. JAMA. 1992;268:2420–2425.
2. Goldacre B, Drysdale H, Powell-Smith A, et al. The COMPare Trials Project. Available at: http://compare-trials.org/. Accessed October 31, 2017.
3. Nobel Media AB. Daniel Kahneman—biographical. Available at: http://www.nobelprize.org/nobel_prizes/economic-sciences/laureates/2002/kahneman-bio.html. Accessed October 31, 2017.
4. Berry W. Life Is a Miracle: An Essay Against Modern Superstition. Washington, DC: Counterpoint Press; 2001:135.
5. Mohan D, Schell J, Angus DC. Not thinking clearly? Play a game, seriously! JAMA. 2016;316:1867–1868.
6. Cummings MK. Critical thinking, heuristic wisdom and the practice of medicine. February 27, 2017. Available at: https://mplsheart.org/news-event/past-grand-rounds/. Accessed October 31, 2017.