Jeff,
> I was wondering why one can't just globally scale each person's fMRI
> data around a grand mean of 1000 prior to stats? Is there a way to do
> this? Is the problem that spm's masking routine and/or way of counting
> voxels as within brain is sometimes flawed?
The problem is defining the grand mean; finding the grand mean across
space involves deciding what's brain and what's not brain. Defining
a grand mean across time is less problematic, but you get funny edge
effects as true intensity falls to zero.
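To make the "implicit mask" concrete, here is a rough sketch (in Python/NumPy, not SPM's actual MATLAB code) of the kind of heuristic spm_global uses: voxels brighter than a fraction of the whole-volume mean are treated as brain, and the global is the mean over just those voxels. The 1/8 cutoff is the commonly cited spm_global heuristic; treat the details as an approximation, not the exact implementation.

```python
import numpy as np

def spm_style_global(volume):
    """Sketch of an spm_global-like heuristic (approximation, not SPM's code):
    voxels brighter than 1/8 of the whole-volume mean are implicitly treated
    as 'brain', and the global is the mean over those voxels only."""
    v = np.asarray(volume, dtype=float).ravel()
    thresh = v.mean() / 8.0          # the implicit brain/non-brain cut
    return v[v > thresh].mean()

# Synthetic volume: 1000 'brain' voxels near 1000, 9000 background near zero.
vol = np.concatenate([np.full(1000, 1000.0), np.zeros(9000)])
g = spm_style_global(vol)           # background falls below mean/8, so g is ~1000
```

The point of the sketch: the threshold itself depends on how much non-brain is in the field of view, which is exactly why "the grand mean across space" is ill-defined without deciding what counts as brain.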
> If I understand you correctly, your concern is that the mask that spm
> uses is faulty sometimes?
The only "mask" at issue is the one implicit in calculating the
intracerebral mean.
> To make sure: the mask that spm uses for its statistical calculations
> is the same as the mask it uses for the beta0 normalization?
No; the "mask" used to determine intracerebral mean (in spm_global.m)
is different from the analysis mask, which can be explicit, implicit,
etc, as per the model setup options.
> One solution is to make sure I know brain from non-brain using a
> histogram and then use this mask in the stats step instead of spm's
> default mask.
Yes, the best/safest option is to compute your own global intensity,
apply the scaling yourself, and then disable SPM's scaling (which
requires carefully modifying the SPM structure, or the variables
within SPM.mat in SPM99).
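As a sketch of that "do it yourself" step (in Python/NumPy for illustration; the mask would come from your own histogram inspection, and the function name and target value of 1000 are illustrative assumptions, not anything SPM provides):

```python
import numpy as np

def scale_to_grand_mean(volume, brain_mask, target=1000.0):
    """Rescale a volume so its mean over a user-supplied brain mask equals
    `target`. `brain_mask` is a boolean array you derived yourself, e.g.
    by thresholding an intensity histogram -- not SPM's implicit mask."""
    g = volume[brain_mask].mean()    # your own global intensity
    return volume * (target / g)

# Example: a volume whose in-mask mean is 500 gets doubled.
vol = np.concatenate([np.full(1000, 500.0), np.zeros(9000)])
mask = vol > 0
scaled = scale_to_grand_mean(vol, mask)   # in-mask mean is now 1000
```

Having pre-scaled every subject's data this way, you would then tell SPM not to apply its own global scaling on top.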
-Tom
-- Thomas Nichols -------------------- Department of Biostatistics
http://www.sph.umich.edu/~nichols University of Michigan
[log in to unmask] 1420 Washington Heights
-------------------------------------- Ann Arbor, MI 48109-2029