Hi,
after having "played" successfully with some 2nd level fMRI statistics and
snpm I tried another larger dataset. This consists of 18 subjects and
whole-brain functional images.
After running out of disk space several times, I think my problem is not the
number of subjects but the whole-brain images. The combination of 10 subjects,
whole-brain images, pseudo t-statistics (and thus the volumetric evaluation),
and collecting supra-threshold statistics with only about 200 permutations
leads to an SnPM_ST.mat that can no longer be loaded (the file is about 1 GB,
and load SnPM_ST.mat fails).
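(Rough arithmetic, under the assumption that SnPM keeps something like voxel
coordinates plus a statistic value as doubles for every supra-threshold voxel
in every permutation: a whole-brain mask of roughly 200,000 voxels times 200
permutations times 4 doubles of 8 bytes each is, in the worst case, about
1.3 GB, which would match the file size I see.)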
Is it just me, or is this reproducible? What are the rules of thumb for
dataset sizes that can still be handled?
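For anyone who wants to poke at such a file: MATLAB can at least list the
stored variables and load them one at a time instead of all at once. A minimal
sketch (the variable name 'SnPMt' is only a placeholder, as I do not know what
SnPM actually stores in SnPM_ST.mat):

    % list the variables in the file without loading it
    whos('-file', 'SnPM_ST.mat')

    % then load a single variable at a time instead of the whole file;
    % replace 'SnPMt' with whatever names whos reports
    S = load('SnPM_ST.mat', 'SnPMt');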
By the way, the whole calculation works for another dataset with 10 subjects
and roughly half-brain images, or with the whole-brain images if I do not
choose the volumetric calculation. And of course it is only a problem for
cluster statistics.
This happened on a Debian sarge system with 2 GB RAM and 2 GB swap space,
under both MATLAB 6.5 and 7.
Thanks for any help,
Roland
--
Dr. Roland Marcus Rutschmann <[log in to unmask]>
Institute for Experimental Psychology, University of Regensburg
Universitätsstraße 31, 93053 Regensburg, Germany
Tel: +49 941 943 2533, Fax: +49 941 943 3233
http://www.psychologie.uni-regensburg.de/Rutschmann