Steven,
> I dropped one subject so that I had an even number of subjects. I could
> then successfully generate a SnPM_cfg.mat file specifying two conditions
> (A/B) and 512 iterations, and defaults on all other options.
>
> However, I got an "Out of Memory" error during the calculation phase. I
> tried again using only 256 iterations and got the out of memory error
> again.
The number of iterations, that is, permutations, doesn't have much impact
on memory usage.
The important variables that determine memory usage in SnPM are the number
of scans and whether you are working "volumetrically". In volumetric mode
the entire dataset is loaded into memory; this is more efficient in that
less disk I/O is needed, but obviously very memory intensive. The
alternative is to work plane-by-plane, where only one plane of data is
loaded at a time. The limitation of plane-by-plane mode, however, is that
no variance smoothing in the z direction can be done.
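For a rough sense of the difference, here is a back-of-envelope sketch in
Python; the grid dimensions, scan count, and 8-byte voxel storage are all
illustrative assumptions, not SnPM defaults:

    # Rough memory estimate for the two modes (all numbers hypothetical).
    nx, ny, nz = 64, 64, 32   # voxel grid (illustrative)
    n_scans = 24              # number of scans in the analysis
    bytes_per_voxel = 8       # assuming double-precision storage

    volumetric = n_scans * nx * ny * nz * bytes_per_voxel
    plane_by_plane = n_scans * nx * ny * bytes_per_voxel  # one plane held at a time

    print(f"volumetric:     {volumetric / 1e6:.1f} MB")       # ~25.2 MB
    print(f"plane-by-plane: {plane_by_plane / 1e6:.1f} MB")   # ~0.8 MB per plane

The point is simply that volumetric memory scales with the full number of
planes, so it grows quickly with the dataset, while plane-by-plane does not.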
If you specify a single number for variance smoothing then isotropic
smoothing is assumed, and hence you will be forced to use volumetric
mode. To work non-volumetrically, enter no z variance smoothing,
e.g. "12 12 0"; SnPM will then ask whether you want to work
volumetrically, and you can answer "no".
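To make the rule concrete, here is an illustrative sketch (not SnPM's
actual code) of the logic just described: a single number is treated as
isotropic smoothing, and any nonzero z component forces volumetric mode:

    # Illustrative only; mirrors the rule stated above, not SnPM internals.
    def must_work_volumetrically(fwhm):
        """fwhm: a single number, or [x, y, z] smoothing widths in mm."""
        if isinstance(fwhm, (int, float)):   # e.g. 12 -> isotropic [12, 12, 12]
            fwhm = [fwhm, fwhm, fwhm]
        return fwhm[2] != 0                  # z smoothing needs all planes in memory

    print(must_work_volumetrically(12))            # True:  isotropic, volumetric forced
    print(must_work_volumetrically([12, 12, 0]))   # False: plane-by-plane allowed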
This isn't a great solution, but it is a reasonable workaround. With more
than ~20 degrees of freedom the variance can be estimated well and you
shouldn't need any variance smoothing, so this won't be a problem for
large datasets. It is for small datasets (especially df < ~10) that
variance smoothing matters most.
Hope this helps.
-Tom
--
Thomas Nichols
Department of Statistics, Carnegie Mellon University
5000 Forbes Avenue, Pittsburgh, PA 15213
http://www.stat.cmu.edu/~nicholst
[log in to unmask]