
Dear Ged,

Thank you very much for your help :)

cheers,
Hans.


On Mon, Jul 20, 2009 at 4:39 AM, DRC SPM <[log in to unmask]> wrote:

> Dear Hans,
>
> Sorry it's taken me two weeks to reply...
>
> > Once you permute these transformed residuals and premultiply by U, you
> > get U*P*U'*y. Now these residuals have covariance sigma^2 U*P*U', where
> > sigma^2 is the true noise variance. If you permute the raw residuals,
> > you get the covariance sigma^2 P*U*U'*P'. Is this correct?
>
> Your second expression is correct: writing sigma^2 as v, the original
> residuals have covariance v*U*U', and if you permuted these, you would
> get v*P*U*U'*P', as you say. However, your first expression is wrong;
> the covariance of U*P*U'*y is:
>    U*P*U' * v * (U*P*U')'
>  = U*P*U' * v * U*P'*U'
>  = v * U*P*U'*U*P'*U'
>  = v * U*P*P'*U'
>  = v * U*U'
> since both U'*U and P*P' equal the identity; hence the
> back-transformed Huh-Jhun permuted residuals have the same (non-white)
> covariance as the original residuals. I think this is the best that
> one can hope for, since I believe Theil proved that you cannot find
> residuals that are linear in the data and unbiased, with scaled-identity
> covariance, without reducing the dimensionality. (And in case it's not
> obvious, the only design matrix I can think of where dimensionality
> reduction wouldn't screw it up is a plain one-sample t-test, which
> has no nuisance variables and admits an exact permutation test anyway.)
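The identity above is easy to check numerically. The sketch below (an editorial illustration, not from the original thread; the design matrix and sizes are made up) builds a residual-forming matrix R = I - X*pinv(X), takes U as an orthonormal basis of the residual space so that U'U = I and UU' = R, and verifies that U*P*U' times its transpose equals U*U' for a random permutation P, i.e. the back-transformed permuted residuals keep the covariance factor U*U'.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical small example: n observations, p nuisance regressors.
n, p = 8, 2
X = rng.standard_normal((n, p))

# Residual-forming matrix R = I - X (X'X)^{-1} X'.
R = np.eye(n) - X @ np.linalg.pinv(X)

# U: orthonormal basis of the residual space, so U'U = I and UU' = R.
# R is symmetric idempotent, so its eigenvalues are exactly 0 or 1;
# keep the eigenvectors with eigenvalue 1 (there are n - p of them).
w, V = np.linalg.eigh(R)
U = V[:, w > 0.5]
assert np.allclose(U.T @ U, np.eye(n - p))
assert np.allclose(U @ U.T, R)

# A random permutation P acting on the whitened residuals U'y.
P = np.eye(n - p)[rng.permutation(n - p)]

# Covariance factor of U*P*U'*y when Cov(y) = v*I:
#   (U P U') (U P U')' = U P U'U P' U' = U P P' U' = U U'
lhs = (U @ P @ U.T) @ (U @ P @ U.T).T
assert np.allclose(lhs, U @ U.T)
print("U*P*U' (U*P*U')' equals U*U': covariance v*U*U' is preserved")
```

The check is deterministic (no simulation needed) because the identity holds exactly for any permutation P and any orthonormal U.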
>
> By the way, if you're keen to read up more on this kind of thing, then
> another key paper (though a very difficult one, at least for me) is
> Welch's 1990 JASA paper, http://www.jstor.org/stable/2290004
>
> Best,
> Ged
>