On 14-Mar-2012 Jenny Morris wrote:
> Can anyone give me an easy-to-understand definition of hazard
> ratios and how they should be interpreted? This is for a
> group of pre-registration nursing students.
>
> Thanks.
> Jenny
A hazard ratio (or risk ratio) is simply the ratio of the chances
of an event under different conditions.
So, if a person of Type A has a 1 in 10 (0.1) chance of
experiencing an event, and a person of Type B a 3 in 10 (0.3)
chance of experiencing that event, then the hazard ratio (B:A)
is 0.3/0.1 = 3.0: a Type B is 3 times as likely as a Type A to
experience the event.
That means what it says, no more and no less. A hazard ratio
will be 3.0 whether it is 3 in 10 versus 1 in 10, 3 in 1000
versus 1 in 1000, or 3 in 1,000,000 versus 1 in 1,000,000.
"Hazard ratio = 3.0" says nothing (or very little -- see below)
about the absolute risks.
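That scale-invariance is easy to see numerically; a minimal sketch in Python (the numbers are the ones above, and the function name is just for illustration):

```python
def risk_ratio(risk_b, risk_a):
    """Ratio of two absolute risks (probabilities of the event)."""
    return risk_b / risk_a

# The ratio is 3.0 (up to floating-point rounding) at every absolute scale:
for b, a in [(0.3, 0.1), (0.003, 0.001), (3e-6, 1e-6)]:
    print(f"{b} vs {a}: ratio = {risk_ratio(b, a):.1f}")
```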
As someone once put it to me one evening at the pub:
I read this morning a newspaper article which said that
"eating processed meats doubles your risk of bowel cancer".
What -- from 1 in 10 million to 2 in 10 million? Or what?
Fair point. If I only had a 2 in 10,000,000 chance of
contracting bowel cancer then I'd carry on eating bacon
butties. But if, on the other hand, it raised my risk
from 1 in 1000 to 1 in 500, then I might feel moved to stop.
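The arithmetic behind that reaction is worth spelling out; a small sketch using the numbers above:

```python
# "Doubling" the risk means very different things in absolute terms.
tiny_before, tiny_after = 1 / 10_000_000, 2 / 10_000_000
large_before, large_after = 1 / 1000, 1 / 500

# Both changes are a risk ratio of 2.0 ...
print(tiny_after / tiny_before)    # 2.0
print(large_after / large_before)  # 2.0

# ... but the extra absolute risk differs by a factor of 10,000:
print(tiny_after - tiny_before)    # 1 extra case per 10,000,000 people
print(large_after - large_before)  # 1 extra case per 1,000 people
```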
For most people, a risk of 1 in 5,000,000 can be ignored.
People's perceptions of risks, and their willingness to take
risks, are very dependent on the absolute level of the risks.
Driving 100 miles on a busy motorway might well be, say, twice
as risky as making the same journey along rural roads. But
we undertake such journeys quite willingly, because the risk
of a fatal accident, in absolute terms, is so small. We would
be much less willing if the risks were higher and more
"perceptible".
There is, however, one extra item of information in a given
risk ratio.
If the ratio is, say, 10:1, so that a B risk is 10 times an
A risk, then the A risk cannot exceed 1 in 10 (10%), because
the B risk cannot exceed 100%. In general, given a risk ratio
for B:A of X (B is X times as likely as A to experience the
event), the absolute risk for A cannot exceed 1/X: since the
B risk is X times the A risk, an A risk greater than 1/X would
make the B risk greater than X*(1/X) = 1, which is impossible.
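That constraint is easy to check numerically; a sketch (the function name is illustrative only):

```python
def max_risk_a(ratio):
    """Largest absolute risk for A consistent with risk ratio B:A = ratio."""
    return 1.0 / ratio

# With a ratio of 10, the A risk can be at most 1 in 10:
print(max_risk_a(10))   # 0.1

# Anything larger forces an impossible B risk above 100%:
risk_a = 0.2
risk_b = 10 * risk_a
print(risk_b)           # 2.0 -- not a valid probability
```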
For the sorts of risks, and risk ratios, that are usually
encountered in medical applications, this is not often an
issue. Absolute risks are usually fairly small in that context,
and risk ratios are usually not huge, so the constraint
does not usually come into view. But it must be borne in
mind, because it is an intrinsic part of the interpretation
of a hazard ratio.
Hoping this helps!
Ted.
-------------------------------------------------
E-Mail: (Ted Harding) <[log in to unmask]>
Date: 14-Mar-2012 Time: 09:56:57
This message was sent by XFMail
-------------------------------------------------