Hi Stephen
I have had similar nightmares about first- and second-level errors, but
now there is a cure! The January issue of NeuroImage (pp. 244-252)
has an article describing how a mixed-effects analysis has been
implemented in the program spm_mfx.
Best
Torben
P.S. Remember to specify your nuisance regressors as covariates of no
interest.
Torben E. Lund
Danish Research Centre for MR
Copenhagen University Hospital
Kettegaard Allé 30
2650 Hvidovre
Denmark
email: [log in to unmask]
webpage: http://www.drcmr.dk
On 21 Dec 2004, at 17:56, Stephen J. Fromm wrote:
> On Tue, 21 Dec 2004 11:27:22 -0500, Thomas E Nichols
> <[log in to unmask]>
> wrote:
>
>> Stephen,
>>
>>> When modeling a single subject in, say, fMRI data analysis, there's a
>>> tradeoff when considering adding more regressors to the model (more
>>> flexible model, but more degrees of freedom eaten up). How does this
>>> tradeoff affect the result of a random effects model?
>>>
>>> That is, if one adds more regressors at the subject level, this has no
>>> impact, algorithmically, at the group level (aka second level). (I'm
>>> omitting consideration of the issue that it could lead to more
>>> contrasts being brought to the second level, leading to a more severe
>>> multiple comparison correction over the greater number of t-tests.)
>>> Does it somehow implicitly affect the variance estimate at the second
>>> level?
>>
>> If the extra covariates are orthogonal with respect to the covariate
>> of interest, then it makes no difference whether you include the extra
>> covariates or not. [The extra covariates can reduce the intrasubject
>> variance estimate, but, as you point out, this has no impact in the
>> SPM summary statistic approach to group modeling.]
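[Editorial aside: a minimal numerical sketch of Tom's point, in plain NumPy rather than SPM code, with a simulated design. With ordinary least squares, adding a covariate that is orthogonal to the regressor of interest leaves that regressor's estimated coefficient unchanged, even though the residual variance drops.]

```python
import numpy as np

rng = np.random.default_rng(0)
n = 120
x = np.sin(np.linspace(0, 6 * np.pi, n))   # regressor of interest
z = np.cos(np.linspace(0, 6 * np.pi, n))   # extra covariate, orthogonal to x on this grid
y = 2.0 * x + 0.5 * z + rng.normal(0, 1.0, n)   # simulated voxel time series

X_small = np.column_stack([x, np.ones(n)])      # model without the extra covariate
X_big = np.column_stack([x, z, np.ones(n)])     # model with it

beta_small, rss_small = np.linalg.lstsq(X_small, y, rcond=None)[:2]
beta_big, rss_big = np.linalg.lstsq(X_big, y, rcond=None)[:2]

print(beta_small[0], beta_big[0])   # same estimate for x in both models
print(rss_small[0], rss_big[0])     # but a smaller residual sum of squares with z
```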
>
> Thanks for answering my query.
>
> I did think of putting in a caveat that there's also the issue of
> orthogonality, but left it out. Below, assume all the added regressors
> are orthogonal to the covariate that's being brought to the second
> level.
>
> While it's clear there's no *explicit* effect on the summary statistic of
> changing the subject-level model by adding more regressors, I'm still
> wondering if it might affect things implicitly. Looking at equation (10)
> in Penny and Holmes, "Random-Effects Analysis,"
> http://www.fil.ion.ucl.ac.uk/spm/doc/books/hbf2/pdfs/Ch12.pdf
> the total variance of the estimator of the population coefficient is the
> sum of inter- and intra-subject variance. So what I'm thinking is that
> the latter would decrease if more error variance were explained by
> regressors added at the single-subject (i.e., first) level, and I don't
> see where the penalty in terms of DOF lies. Maybe, on the other hand, I'm
> not thinking about this right and your claim "If the extra covariates are
> orthogonal with respect to the covariate of interest, then it makes no
> difference whether you include the extra covariates or not." still
> stands.
>
> Cheers,
>
> S
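[Editorial aside: Stephen's question can be checked with a toy simulation, not SPM's actual implementation. In the summary-statistic approach only the per-subject betas are carried to the second level; since an orthogonal first-level covariate leaves each subject's beta unchanged, the group t-statistic is identical too, and the reduction in within-subject residual variance from the larger first-level model never enters the second-level computation.]

```python
import numpy as np

rng = np.random.default_rng(2)
n_sub, n_scan = 12, 120
t = np.linspace(0, 6 * np.pi, n_scan)
x = np.sin(t)   # regressor of interest (orthogonal to z on this grid)
z = np.cos(t)   # extra first-level covariate

def beta_of_interest(y, include_z):
    cols = [x, z, np.ones(n_scan)] if include_z else [x, np.ones(n_scan)]
    return np.linalg.lstsq(np.column_stack(cols), y, rcond=None)[0][0]

betas_small = np.empty(n_sub)
betas_big = np.empty(n_sub)
for s in range(n_sub):
    true_b = 1.0 + rng.normal(0, 0.3)   # subject's true effect (random effect)
    y = true_b * x + 0.4 * z + rng.normal(0, 1.0, n_scan)
    betas_small[s] = beta_of_interest(y, include_z=False)
    betas_big[s] = beta_of_interest(y, include_z=True)

def group_t(b):
    # One-sample t over subjects: the only variance used is the spread of
    # the betas themselves, which mixes inter- and intra-subject variance.
    return b.mean() / (b.std(ddof=1) / np.sqrt(len(b)))

print(group_t(betas_small), group_t(betas_big))   # identical up to rounding
```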
>
>>
>> If the extra covariates have non-zero correlation with the covariate
>> of interest, then they will change the estimate of the coefficient for
>> the covariate of interest. If the correlation is substantial,
>> indicating confounding, then including the extra covariates will
>> appropriately change the estimates of interest, adjusting for the
>> confounder.
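[Editorial aside: a matching sketch of the confounded case, again simulated rather than SPM code. When the extra covariate is correlated with the covariate of interest, omitting it biases the estimate of interest, and including it adjusts for the confounder.]

```python
import numpy as np

rng = np.random.default_rng(1)
n = 200
z = rng.normal(0, 1.0, n)                 # confounder
x = 0.8 * z + rng.normal(0, 0.6, n)       # covariate of interest, correlated with z
y = 1.0 * x + 2.0 * z + rng.normal(0, 1.0, n)   # true effect of x is 1.0

b_without = np.linalg.lstsq(np.column_stack([x, np.ones(n)]), y, rcond=None)[0][0]
b_with = np.linalg.lstsq(np.column_stack([x, z, np.ones(n)]), y, rcond=None)[0][0]

print(b_without)   # inflated: absorbs part of z's effect on y
print(b_with)      # close to the true value, adjusted for z
</imports>```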
>>
>>
>> So the best thing is to check for confounding. If there is none, you
>> can safely ignore the extra covariates; if there is confounding, you
>> should include them. I guess a 'safe' default would be to include
>> them.
>>
>> -Tom
>>
>>
>> -- Thomas Nichols
>> Department of Biostatistics, University of Michigan
>> http://www.sph.umich.edu/~nichols
>> [log in to unmask]
>> 1420 Washington Heights, Ann Arbor, MI 48109-2029