Barnes-Jewish Hospital | Washington University Physicians
POLICY | how health care works

GATHERING EVIDENCE

Originally published May 2018

BY ANDREA MONGLER
ILLUSTRATION FROM NLM/SCIENCE SOURCE/SCIENCE PHOTO LIBRARY

For thousands of years, bloodletting was a widely accepted and commonly used treatment for a variety of illnesses. The practice stemmed from the belief that disease was caused by an imbalance of the four bodily fluids, or “humors”: phlegm, black bile, yellow bile and blood. Doctors and their patients believed that restoring health required a rebalancing of those humors through the draining of “excess” blood from the body.


Then, in the early 1800s, the French physician Pierre Louis performed a study to evaluate the effectiveness of bloodletting. Other researchers followed suit, and as evidence accumulated that bloodletting was not effective, the practice declined. Today we know that it wasn’t just ineffective; it was often harmful and sometimes even deadly.

This is just one example of why evidence-based medicine — or considering and applying the current best available evidence when making treatment decisions — is so important for caregivers and those being cared for.

But even with high-quality evidence, there is never a one-size-fits-all treatment or care plan. A review published in The Lancet in 2017 explains it this way: Evidence-based medicine has “progressed to recognise limitations of evidence alone, and has increasingly stressed the need to combine critical appraisal of the evidence with patients’ values and preferences through shared decision making.”

Sometimes, one of those “limitations of evidence” is simply that there just isn’t much evidence available. Other times, the evidence that is available doesn’t apply to some people. “If a trial from 10 years ago studied patients between 30 and 50 years old without multiple comorbidities, it may not be easy for a physician to know how the drug or treatment studied in those patients will work in an elderly person with multiple medical problems,” says Victoria Fraser, MD, chair of the Department of Medicine at Washington University School of Medicine. “In addition to following evidence-based guidelines, physicians have to use critical thinking to assess the quality of the evidence and how it relates to their individual patient’s characteristics and conditions.”

Although the term “evidence-based medicine” was coined in 1991, the idea that experiments can be performed to prove whether a treatment works is not a new one, as the bloodletting example illustrates. Another example comes from the 1700s, when James Lind demonstrated that citrus fruits could cure scurvy, a debilitating disease we now know is caused by vitamin C deficiency. During his research, Lind also tried giving people vinegar, cider and seawater to cure scurvy, none of which helped. And in the mid-19th century, the Hungarian physician Ignaz Semmelweis, after testing several other hypotheses, figured out that hand-washing decreased the rate of childbed fever.

In part because of these historical figures and many others like them, Kenneth Ludmerer, MD, says the term “evidence-based medicine” is actually a misnomer: “It suggests that this attention to evidence is something new, but from the dawn of scientific medicine, doctors have always demanded evidence.” Ludmerer is Washington University’s Mabel Dorn Reeder Distinguished Professor of the History of Medicine.

As the burden of disease has shifted from acute to chronic conditions, Ludmerer adds, the kind of research needed, and the methods used to analyze its data, have also changed. As he puts it, today we need a “much more sophisticated architecture of clinical epidemiology.”

Still, though medical research itself is nothing new, study findings are not always quickly embraced. Lind and Semmelweis, for example, have more in common than having made important scientific discoveries: Almost no one listened to them at first. The British Navy didn’t issue an order for the distribution of lemon juice to sailors until 42 years after Lind’s discovery. And Semmelweis’ hand-washing revelation angered doctors who didn’t want anyone to think they were responsible for childbed fever. Semmelweis ended up losing his job.

Even today, despite government-funded research, high-quality clinical trials and medical education that teaches doctors how to analyze and interpret studies, it can be difficult to translate good, solid medical and scientific evidence into clinical practice.

For example, Fraser says, “Just because research demonstrates that optimum control of diabetes and high blood pressure reduces the risk of heart attack and stroke doesn’t mean it is easy to ensure that all patients in all settings get access to and can adhere to medical regimens to control their diabetes and blood pressure.” And, she adds, “additional research is needed to determine how to speed the translation of research findings into practice and understand the barriers to implementing best practices in different settings and patient populations.”

That said, physicians have access to a wealth of information they can consider when making treatment decisions for their patients. This includes systematic reviews of medical literature and clinical-practice guidelines. Both assess large bodies of evidence and synthesize the information, resulting in conclusions and recommendations that practicing clinicians can use as references.


In addition, staying up-to-date on the latest research is much easier for physicians than it used to be. “Once upon a time, medical students had to go to Index Medicus and do literature searches by hand. Now they can do it digitally,” Ludmerer says. “At Washington University School of Medicine, for medical students and residents, we have increased our focus on learning to analyze studies critically — not just to review findings but also to review the methods used and to look at the study in a very rigorous fashion.”

This ability to critically analyze studies is crucial for recognizing flawed research methods or questionable conclusions. At the same time, the rapid growth in the number and variety of clinical-practice guidelines means that, for patients with multiple comorbidities, many separate guidelines may apply at once, which can make them difficult to put into practice. But Fraser and Ludmerer agree that research is constantly improving — and will continue to do so — as will a physician’s ability to interpret it and use it to make treatment decisions for individual patients.

“The creation of modern medicine and every step forward since then has occurred because of evidence,” Ludmerer says. “And the continued ability of physicians to think critically and to determine whether a research conclusion is justified — and whether it applies to their patients — may trigger ideas about how to make the medicine of tomorrow even better.”

