Patricia, I guess one may argue that what you refer to is similar to the precautionary principle (a variation of it), and one can then argue that EBM/EBP, while using the 'best' most updated evidence combined with clinical judgement, is a type of precautionary-principle approach... for the instances where there is no SR or no strong RCT, and the only evidence is one consult, one case study, or a small case series with no controlled existing evidence, there is a social responsibility to protect the public and to do one's best based on what's available 'at that time'... so with no focus only on harm, and in the absence of scientific certainty, we still must move forward and make the best informed decisions...
so the question remains: short of a well-conducted Cochrane-level SR, what will suffice? Can one strong RCT (large sample size, well powered, well conducted, internal validity addressed, etc.) be sufficient versus several smaller, weaker, poorly conducted or uncontrolled RCTs? Can one strong observational study, well powered and adjusted procedurally and statistically for confounders, supplant 3-4 small studies with disparate populations and settings?

     Best,
Paul E. Alexander 


--- On Sat, 4/14/12, Patricia Anderson <[log in to unmask]> wrote:

From: Patricia Anderson <[log in to unmask]>
Subject: Re: Distorting the Evidence & its impact on EBM/EBP
To: [log in to unmask]
Received: Saturday, April 14, 2012, 3:02 PM

This was a prominent theme at this week's TEDMED conference, with
speakers focused less on retractions and more on positive-findings
bias in the published peer reviewed literature, with examples
presented illustrating how that bias endangers the public. EBM/EBHC
can only be as strong as the data it analyses. We make intense efforts
to show that we, as systematic review researchers, try to be
comprehensive in our searches and avoid bias in our analysis, but if
the research data isn't published because of bias at the level of the
publishers and editors of the journals, well, how much is our work
worth?

There was a parallel presentation on the topic that the scientific
method is broken in our emerging big data environment. Briefly, the
question was whether the strategy of asking questions first and
seeking data makes sense when data is so easy to come by. The thought
is that we need to turn the scientific method on its head and come up
with new strategies for analyzing data, visualizing data, and
generating questions that allow us to look at big data in useful ways.

Separately but also recently, I attended a presentation by a curator
and manager at the United States Holocaust Memorial Museum on the
topic of their Deadly Medicine exhibit. The focus of the exhibit is an
examination of how the 3rd Reich successfully persuaded the healthcare
and other helping professions to switch from "First, do no harm" to
genocide. The elements that most struck me were these two:
 - 1) a strong focus, possibly the first intentional systematic
government funded focus, on what would now be called evidence-based
practices;
 - 2) discovering sympathetic researchers in closely related but
slightly peripheral fields, and funding them like crazy to generate
research in the targeted question areas.

I know, I KNOW. I've heard the Soviet war prisoner story. I know that
Archie Cochrane first came up with the whole idea of systematic
reviews with a view towards freeing frontline clinicians from spending
so much time digging through research and allowing them to focus on
compassionate patient care. I worry how well that is being understood
by the profession at large, especially when I hear things like "no
treatment should ever be provided for a patient without a strong
systematic review in support of it;" "insurance companies should only
fund evidence based practices;" or "government should not waste its
money funding research into areas where there is not strong
evidence." Yes, I have really heard every single one of these, from
highly educated, informed, influential professional leaders.

My original mentor in evidence based methodologies and practice is
Amid Ismail, who has won many awards for his influence in bringing
evidence based methodologies and clinical practices to the profession
of dentistry. He repeatedly expressed concern over several years that
people were missing one particularly essential aspect of implementing
EBHC: the best AVAILABLE evidence needs to be combined with expert
CLINICAL JUDGMENT. He emphasized this, and often explained that in
cases of a rare condition or a complicated presentation, there may be
no systematic review and the best available evidence may very well be
a single case report or expert opinion. In that case, you use the best
available evidence, whatever that is (sometimes it means a consult)
and integrate that with your clinical judgment and with that
particular patient's needs and preferences. EBHC was never
intended to supplant or interfere with the doctor-patient relationship
and the clinical decisionmaking process that grows out of that
relationship.

I have been planning a blogpost on this topic, but perhaps it would be
just as well to post this message, if Jordan would give me permission
to post his original question? Thank you for bringing this up here. I
have been agonizing over this topic for a few months now, and I see it
getting to be a bigger question. It is certainly not going away. I
suspect that EBHC in its current form is either going to fade away
over the next twenty years, or will have to drastically transform
itself in the eyes of the professions and the public.

 - Patricia Anderson, [log in to unmask]

On Sat, Apr 14, 2012 at 9:22 AM, Jordan Panayotov <[log in to unmask]> wrote:
> Dear All,
>
> May I add an important detail that is missing in the discussion
> about EBP/EBM/EIP/EIM/EIDM.
>
> How reliable is the Evidence? What happens with the Evidence when, for
> example, 193 (one hundred ninety-three) papers are RETRACTED?
>
> See Retraction Watch here
>
> http://retractionwatch.wordpress.com/2012/04/10/193-papers-could-be-retracted-journal-consortium-issues-ultimatum-in-fujii-case/
>
> Countless number of practitioners and decision-makers around the world try
> to adhere to Evidence-Based Practice which is based on evidence, which is
> based on systematic reviews, which are based on peer reviewed publications
> (like Fujii’s papers).
>
> According to Microsoft Academic Search
>
>