Esa,
> I was doing a SnPM analysis with split-plot-design (2 groups,
> 2 conditions, 1 scan per subject) with a total of 30 PET scans.
> Supra-threshold statistics were collected, 5000 permutations were used,
> variance smoothing of 10x10x0 mm was applied, the analysis threshold
> was proportional 0.3 and SnPM was not working volumetrically. I am using
> Matlab 6.0.0.88 Release 12 and Windows NT 4.
>
> The size of cluster statistics file SnPM_ST.mat was huge (approx. 4 GB)
> when the following error message occurred during the computation step
> of SnPM:
[...]
> Plane 18: 21'35" ( 63% Sm)
> Plane 19: 21'45" ( 62% Sm)
> Plane 20: ??? Error using ==> spm_append_96
> Incompatible sizes
>
> Error in ==> c:\snpm99b\spm_snpm.m
> On line 483 ==> spm_append_96('SnPM_ST',[
> ...
>
> ??? Error while evaluating uicontrol Callback.
I think the problem is that the SnPM_ST file is too big. I believe that
32-bit systems have a maximum file size of 4 GB.
The reason that SnPM_ST is so large is that it is saving the "mountain
tops" of each of the 5,000 statistic images. It does this so you
can specify the primary threshold (the cluster-defining threshold)
post hoc, after the compute step has finished.
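To make the "mountain tops" idea concrete, here is a minimal sketch in plain Python (not SnPM code; the function names are hypothetical). Saving only the voxels above a loose save-threshold still lets you count suprathreshold voxels for any stricter primary threshold chosen afterwards:

```python
def save_mountain_tops(stat_img, save_thresh):
    """Keep only one permutation's suprathreshold 'mountain tops':
    (index, value) pairs of voxels exceeding the save threshold."""
    return [(i, v) for i, v in enumerate(stat_img) if v > save_thresh]

def count_suprathreshold(tops, primary_thresh):
    """Count voxels above a primary threshold chosen post hoc.
    Valid as long as primary_thresh >= the save threshold used above."""
    return sum(1 for _, v in tops if v > primary_thresh)
```

Lowering the save threshold stores more voxels per permutation (hence, over 5,000 permutations, the gigantic SnPM_ST.mat); raising it stores fewer, at the cost of restricting how low the post hoc primary threshold can later go.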
The solution is to save fewer of the mountain tops. Raise the elevation,
er, threshold at which the image data is saved: in spm_snpm_defaults,
increase STalpha (for t images) or STprop (for pseudo t).
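For concreteness, the relevant lines in spm_snpm_defaults.m look roughly like this (the values shown are purely illustrative, not recommendations; check your installed defaults file for the actual names and settings):

```matlab
% In spm_snpm_defaults.m -- illustrative values only
STalpha = 0.01;   % save threshold for t images; raise to save fewer tops
STprop  = 0.5;    % save threshold for pseudo-t images; raise likewise
```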
In subsequent versions of SnPM we're going to change this so that
the primary threshold is applied to each statistic image on
the fly. This will preclude the post-compute tweaking of the
primary threshold, but it will eliminate the gigantic SnPM_ST.mat.
-Tom
-- Thomas Nichols -------------------- Department of Biostatistics
http://www.sph.umich.edu/~nichols University of Michigan
[log in to unmask] 1420 Washington Heights
-------------------------------------- Ann Arbor, MI 48109-2029