> we are planning a VBM study (with VBM5 toolbox) with about 120 subjects.
> Much of the criticism of VBM is based on the normalization procedure. To
> counter this criticism, we intend to use high-dimensional normalization
> (e.g., a 5-10mm cutoff for warping) to avoid misregistration, and
> high-resolution images (0.5*0.5*0.5mm) to avoid partial volume effects.
>
> Is this a reasonable strategy?
If your computer has enough memory, then you could try it, although 0.5mm
voxels are likely to be a bit more noisy. The multiple Gaussians per class
used by SPM5 should be a bit better at dealing with partial volume than the
older versions.
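To see why multiple Gaussians per class help with partial volume, consider a voxel on a grey/white boundary: its intensity falls between the two pure-tissue means, so a single Gaussian per class assigns it low probability under both. A purely illustrative sketch (not SPM code; all intensities are made-up numbers) of this effect:

```python
# Illustrative only -- not SPM code. Shows why an extra Gaussian per
# tissue class can absorb partial-volume intensities.
import numpy as np

def gauss(x, mu, sigma):
    """Gaussian probability density."""
    return np.exp(-0.5 * ((x - mu) / sigma) ** 2) / (sigma * np.sqrt(2 * np.pi))

# Hypothetical mean intensities (arbitrary units) for grey and white matter.
gm_mu, wm_mu, sigma = 60.0, 100.0, 5.0
pv = 80.0  # a partial-volume voxel halfway between the two tissues

# One Gaussian per class: the PV voxel is implausible under both tissues.
p_single = max(gauss(pv, gm_mu, sigma), gauss(pv, wm_mu, sigma))

# Extra, broader component near the boundary (as SPM5's multiple-Gaussians
# option allows, with made-up parameters here) explains the PV voxel.
p_mixture = max(p_single, 0.3 * gauss(pv, 80.0, 10.0))

print(p_single < 1e-4)   # True: poorly explained by pure-tissue Gaussians
print(p_mixture > 1e-3)  # True: well explained once a PV component exists
```

The same intermediate intensities that a single-Gaussian model must misclassify are simply absorbed by the additional components.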
>
> If yes, what do we have to pay attention to with the other parameters, like
> - Gaussian per class
> - warping regularisation
> - bias regularisation
> - bias FWHM
> - sampling distance
These are questions that would really need empirical evidence from your data
to answer. Settings that are optimal for one person's data may not be optimal
for data from elsewhere. In principle, decreasing the sampling distance
should always improve accuracy (at the expense of taking ages). The other
parameters are likely to be more scan dependent, so you may wish to try
tweaking them to see which works best.
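The "try tweaking them" advice amounts to a small empirical sweep. A minimal sketch of one way to organise it, where the candidate values and the scoring function are placeholders (not SPM calls) to be replaced by whatever quality measure you trust, e.g. overlap with a manually traced mask:

```python
# Hedged sketch of an empirical parameter sweep. segmentation_score is a
# placeholder: in practice you would run the segmentation with each setting
# and score the result against your own criterion.
from itertools import product

bias_fwhm_options = [30, 60, 90]  # mm; hypothetical candidate values
sampling_options = [1, 2, 3]      # mm; hypothetical candidate values

def segmentation_score(bias_fwhm, sampling):
    """Dummy score for the sketch -- replace with a real evaluation."""
    return -abs(bias_fwhm - 60) - sampling

# Pick the best-scoring (bias FWHM, sampling distance) combination.
best = max(product(bias_fwhm_options, sampling_options),
           key=lambda params: segmentation_score(*params))
print(best)  # prints (60, 1) under the dummy score above
```

The point is only the structure: evaluate each candidate setting on your own data, rather than assuming defaults tuned on someone else's scanner.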
>
> Are there any other potential problems?
Difficult reviewers.
Best regards,
-John