> Although the formulas that Aubrey Blumsohn uses are quite correct, the
> conclusions are not entirely so. Since squares are always positive, the
> variance of a sum of variables will always be larger than any of the
> individual variances.
> To use this simple relation it is most important that the variables are
> uncorrelated.
> Anders Kallner
No, you are wrong, not me (:
Although the variance of a sum of variables will always be larger than any
of the individual variances (no argument about this), this does not mean
the CVs will be larger - they can be larger or smaller depending on the
sign of the values. Also, it is the errors that have to be uncorrelated,
not necessarily the actual values.
Example 1
=======
Value A - mean 10, CV 1% (variance 0.01)
Value B - mean 10, CV 1% (variance 0.01)
Calculated index (A+B)
mean 20
variance = 0.01 + 0.01 = 0.02
CV = sqrt(0.02)/20 = 0.141/20 = 0.7% (ie smaller than original)
Example 2
=======
Value A - mean +10, CV 1% (variance 0.01)
Value B - mean -5, CV 2% (variance 0.01)
Calculated index (A+B)
mean 5
variance = 0.01 + 0.01 = 0.02
CV = sqrt(0.02)/5 = 0.141/5 = 2.8% (ie bigger than original)
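The arithmetic in both examples can be checked with a short sketch. The helper `cv_of_sum` below is hypothetical (not from the original post); it assumes uncorrelated errors, so the variances simply add, while the signs of the means affect the combined CV:

```python
import math

def cv_of_sum(means_cvs):
    # Each entry is (mean, CV as a fraction). SD = mean * CV, so the
    # variance of each term is (mean * CV)**2; with uncorrelated errors
    # the variances add, and the combined CV divides by the summed mean.
    total_mean = sum(m for m, _ in means_cvs)
    total_var = sum((m * cv) ** 2 for m, cv in means_cvs)
    return math.sqrt(total_var) / total_mean

# Example 1: means 10 and 10, both CV 1% -> combined CV about 0.7%
print(cv_of_sum([(10, 0.01), (10, 0.01)]) * 100)

# Example 2: means +10 (CV 1%) and -5 (CV 2%) -> combined CV about 2.8%
print(cv_of_sum([(10, 0.01), (-5, 0.02)]) * 100)
```

Note that the combined variance (0.02) is identical in both examples; only the smaller summed mean in Example 2 makes its CV larger.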
Clearly, when working with MoMs or other multivariate models, all sorts
of error models can apply (including the multiplicative case discussed
earlier).
Aubrey Blumsohn
Directorate of Biochemical Medicine
Ninewells University Hospital, Dundee