Hello Shahrzad,
Between versions 4010 and 4667 there were some changes affecting the statistics, see https://www.jiscmail.ac.uk/cgi-bin/webadmin?A2=spm;c8f290dc.1209
In my personal experience, this resulted in only very small differences.
Your different findings more likely arise from different preprocessing. As you note, in your first analysis you used non-linearly modulated data. In your second analysis, you used the SPM8 "New segment" option with "modulation". This modulation corrects for non-linear as well as linear warping.
With non-linearly modulated VBM8 data, you have already corrected for different brain sizes during preprocessing. So you don't need a covariate for TIV or GM volume in your model, nor is there any need for "global normalisation" (i.e. "no" for overall grand mean scaling and "none" for normalisation).
With SPM8 modulated data, you still have to correct for different brain sizes in your model. Thus add the individual TIV values as a covariate (again, no changes to the default values in the "global normalisation" section). This should lead to results that are at least roughly comparable to the VBM8 model.
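In case it helps to see the idea concretely: entering TIV as a nuisance covariate just means it becomes an extra column in the GLM design matrix, so the group effect is estimated after TIV-related variance is accounted for. A minimal sketch in Python/NumPy with made-up numbers (SPM itself does this in MATLAB; subject values here are purely illustrative):

```python
import numpy as np

# Hypothetical data: GM value at one voxel for 6 subjects (3 per group),
# plus each subject's total intracranial volume (TIV) in ml.
gm = np.array([0.62, 0.58, 0.65, 0.55, 0.52, 0.57])
tiv = np.array([1450.0, 1380.0, 1520.0, 1400.0, 1350.0, 1480.0])
group = np.array([1, 1, 1, 0, 0, 0])  # group indicator

# Design matrix: intercept, group effect, mean-centred TIV covariate
# (mean-centring keeps the intercept interpretable).
X = np.column_stack([np.ones_like(gm), group, tiv - tiv.mean()])

# Ordinary least squares, as in a GLM:
beta, *_ = np.linalg.lstsq(X, gm, rcond=None)
# beta[1] is now the group difference adjusted for TIV.
print(beta)
```

The same model would be fitted at every voxel; in SPM you get this by entering the TIV values in the "Covariates" section of the factorial design.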
Another option for SPM8 modulated data would be to scale the volumes to the same overall value within the model, either with 1) "Global normalisation" / "Overall grand mean scaling" -> "yes" (in this case only one value must be given, e.g. the default 50, NOT the individual values; the different volumes will then all be scaled to the very same arbitrary value), or with 2) "Overall grand mean scaling" -> "no" and "Normalisation" set to either "Proportional" or "ANCOVA". The different "Global normalisation" options are rather poorly documented in the context of VBM, though, so it may be better not to use them.
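For what proportional scaling to a grand mean actually does, here is a tiny sketch with invented global values (the target of 50 is SPM's default; everything else is illustrative, not SPM's own code):

```python
import numpy as np

# Hypothetical per-subject global values (e.g. mean voxel intensity).
globals_ = np.array([612.3, 580.1, 655.7])

# Proportional scaling: each subject's whole image is multiplied by one
# factor so that its global value equals the same arbitrary target.
target = 50.0
scale = target / globals_

# After scaling, overall size/intensity differences between subjects
# are removed: every subject ends up with the same global value.
scaled_globals = globals_ * scale
print(scaled_globals)  # all equal to 50.0
```

This is why only the single target value is entered in the batch, never the individual subject values: the per-subject factors are derived from the images themselves.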
Concerning your first question: there should be some differences between SPM8 and VBM8 volumes simply because the algorithms differ. But right now I'm not sure what these automatically reported values actually reflect (e.g. volumes in original space or in modulated space), so SPM8 and VBM8 may also differ because the reported values mean different things.
Hope this helps,
Gabor