Ruth Cronje writes:


> In 1980 in the Annals of Internal Medicine, DS Brody claimed
> that "If medical practice were based on truly objective
> scientific data, it would be very easy to make defendable decisions
> without any fear of self-reproach. Unfortunately, medicine is not
> an accomplished science -- there are tremendous gaps in
> scientific knowledge. Most of a physician's daily decisions do not
> involve situations that have been tested in double-blind,
> randomized trials -- for [estimated] 90% of medical conditions there is
> either no specific remedy or effectiveness of treatment is
> unknown" (p. 720).


> Does anyone happen to know of a more current reference
> (since 2000) that gives an updated estimate of the percentage of
> medical conditions for which there is no specific remedy or for
> which the effectiveness of treatments is unknown? Is it still
> 90%, or have we made some progress since 1980? A published
> estimate from the peer-reviewed literature would be most helpful,
> but I'd also be interested in people's own estimates, based on
> their familiarity with the literature.


An interesting starting point is some commentary by Robert Todd Carroll on his Skeptic's Dictionary website


with some additional comments at


It is worth noting that many proponents of alternative medicine cite a small figure, such as 10 or 15 percent, for the fraction of conventional medicine treatments supported by the evidence. The claim is intended to blunt criticism that most alternative medicine therapies are untested.


It is labor intensive, but you could get a good estimate of the proportion of treatments that are based on the evidence by following a busy clinician around for several days and noting each decision that this person makes. Better yet, follow several clinicians. Then review the literature on each decision and determine whether it was based on the evidence. There are definitional issues that make this research tricky, as some people have already noted, but the research can be and has been done. The proportion of decisions based on evidence will, of course, vary a lot by discipline and location.
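Once such an audit is done, the analysis is simple: the estimate is a binomial proportion, and it deserves a confidence interval because the number of observed decisions will be modest. Here is a minimal sketch, assuming entirely made-up audit numbers (109 decisions observed, 82 judged evidence-based) chosen purely for illustration; the Wilson score interval is used because it behaves better than the naive normal approximation at small sample sizes:

```python
import math

def wilson_ci(successes, n, z=1.96):
    """Wilson score interval for a binomial proportion (z=1.96 gives ~95%)."""
    if n == 0:
        raise ValueError("n must be positive")
    p = successes / n
    denom = 1 + z**2 / n
    center = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return center - half, center + half

# Hypothetical audit: 109 clinical decisions observed, 82 judged evidence-based
lo, hi = wilson_ci(82, 109)
print(f"proportion = {82/109:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```

With numbers of this size the interval spans roughly ten percentage points on either side, which is one reason audits from single clinics should be quoted with their uncertainty rather than as a bare percentage.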


I do not have a comprehensive list of these types of studies, but here are a few:


Oncology treatment recommendations can be supported only by 1-2% of high-quality published evidence. S. Vincent, B. Djulbegovic. Cancer Treat Rev 2005: 31(4); 319-22.


"Is my practice evidence-based?" T. Greenhalgh. British Medical Journal 1996: 313(7063); 957-8.



The evidence for evidence-based medicine. R. Imrie, D. W. Ramey. Complementary Therapies in Medicine 2000: 8(2); 123-6.



Surgical practice is evidence based. N. Howes, L. Chagla, M. Thorpe, P. McCulloch. Br J Surg 1997: 84(9); 1220-3.


I do not have a copy of the first article in this list, so I do not know whether it is relevant to the discussion.


By the way, there is another widely quoted statistic that appears to have a rather dubious basis. I have heard several people claim that "it takes an average of 17 years for research findings to be implemented in clinical practice." I wrote briefly about this on my weblog:


I'd be curious if anyone could provide additional references to support or debunk this statistic.


Steve Simon, [log in to unmask], Standard Disclaimer.

Look for my book "Statistical Evidence in Medical Trials"

newly published by OUP. For more details, see