Wednesday, January 19, 2011

Analysis: Medical 'Best Practice' Often Not Based on Scientific Evidence

Even when following medical guidelines to the letter, doctors often use treatments that have little or no scientific support, U.S. researchers said Monday.

They found that only one in seven treatment recommendations from the Infectious Diseases Society of America — a society representing healthcare providers and researchers across the country — was based on high-quality data from clinical trials.

By contrast, more than half the recommendations relied solely on expert opinion or anecdotal evidence.

"Despite tremendous research efforts, there is still a lot of uncertainty as to what is the best patient care," said Dr. Ole Vielemeyer, an expert in infectious diseases at Drexel University College of Medicine in Philadelphia and one of the study's authors.

"A cookie-cutter approach, where you just have a set of guidelines that you apply no matter what, can be dangerous."

For example, he said, 2003 guidelines advising that patients with pneumonia get antibiotics within hours of seeing the doctor ended up producing an increase in mistaken pneumonia diagnoses and treatment, with no apparent benefits. Indeed, prescribing antibiotics unnecessarily exposes people to the drugs' side effects and could help breed resistant bacteria.

The new analysis, published Monday in the Archives of Internal Medicine, is based on more than 4,200 recommendations made by IDSA between 1994 and 2010.

"These data reinforce that absolute certainty in science or medicine is an illusion," an editorial in the journal notes. "Rather, evaluating evidence is about assessing probability."

Vielemeyer said the findings are likely to apply to other areas of medicine as well, and mentioned an earlier study that found similar results in cardiology.

Doctors across the world look to guidelines when deciding how to treat patients, and insurance companies may use them in coverage decisions.

Because they are drafted by leading experts in the field, they are generally understood to reflect the best medical knowledge available. "People commonly associate guidelines with practicing evidence-based medicine," said Vielemeyer.

But often the relevant clinical studies simply haven't been done. In the absence of evidence, the recommendations end up depending largely on who's on the guideline-drafting panel and any assumptions or opinions they may bring to the process.

In 2008, for instance, then-attorney general of Connecticut Richard Blumenthal sued IDSA for stacking a panel with experts who didn't believe in the controversial "chronic" version of Lyme disease, a tick-borne illness.

While the guidelines were upheld by an independent review panel last April, Blumenthal still expressed concern over "improper voting procedures."

"We are operating on a lot of bias," acknowledged Dr. Larry Baddour, who chairs the division of infectious diseases at Mayo Clinic College of Medicine in Rochester, Minnesota, and has been on several IDSA panels. He recently published findings similar to those of the current study in the IDSA's journal Clinical Infectious Diseases.

"We struggle with this, even as experts," he said. "We recognize we have bias, but it's impossible to eliminate when there is a dearth of data."

Diana Olson, spokesperson for IDSA, pointed out that all recommendations list the degree of evidence on which they are based.

"Clinicians understand when there is really rock-hard evidence behind our recommendations and when there isn't," she told Reuters Health, adding that some guidance is better than none at all.

"The public can have absolute confidence in these guidelines," she added. "They remain the best that science and medicine have to offer."

Vielemeyer agreed guidelines offer valuable help to busy doctors.

"As long as you understand that they are just guidelines, I don't think they are detrimental to patients," he said. "But if they are considered as dogma, that is when they can become dangerous to patients."
