Roland,
We're just about done with a new version of SnPM that will free you
from the huge SnPM_ST.mat files. You'll have to specify your
cluster-defining threshold before you "Compute", but you should be
able to do cluster-size inference on arbitrarily large images.
If you (or anyone else) would like to beta-test these revisions,
please email me.
-Tom
-- Thomas Nichols
Department of Biostatistics, University of Michigan
1420 Washington Heights, Ann Arbor, MI 48109-2029
http://www.sph.umich.edu/~nichols
[log in to unmask]
On Mon, 20 Jun 2005, Satoru Hayasaka wrote:
> Hi,
>
> This SnPM_ST file is a collection of voxel values and locations for high
> t-scores (the tip of the t-image iceberg) from all the permutations, to be
> used in a cluster-size test later on. In my experience, using variance
> smoothing (i.e., pseudo-t) seems to accelerate the growth of SnPM_ST,
> especially when there are no large activation areas. A possible remedy for
> this problem is to change the variable called "SnPMdefs.STprop" in
> snpm_defaults.m. The default is 0.10, but try a smaller number (e.g., 0.05,
> 0.025, and so on) until SnPM_ST becomes a manageable size. Alternatively,
> you can try SnPM with a regular t-image instead of a pseudo-t image.
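>
> A minimal sketch of that edit (only the variable name and its 0.10 default
> come from the description above; the rest of snpm_defaults.m and the exact
> meaning of STprop are assumptions):
>
>   % In snpm_defaults.m -- assumed context: SnPMdefs.STprop (default 0.10)
>   % governs how much supra-threshold data SnPM keeps per permutation for
>   % the later cluster-size test, so a smaller value shrinks SnPM_ST.mat.
>   SnPMdefs.STprop = 0.05;   % try 0.05, then 0.025, etc., until SnPM_ST.mat is manageable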
>
> Good luck!
> -Satoru
>
> Satoru Hayasaka ==============================================
> Post-Doctoral Fellow, MR Unit, UCSF / VA Medical Center
> Email: shayasak_at_itsa_dot_ucsf_dot_edu   Phone: (415) 221-4810 x4237
> Homepage: http://www.sph.umich.edu/~hayasaka
> ==============================================================
>
> At 09:37 AM 6/20/2005 +0200, Roland Marcus Rutschmann wrote:
>> Hi,
>>
>> after having "played" successfully with some 2nd-level fMRI statistics and
>> SnPM, I tried another, larger dataset. It consists of 18 subjects and
>> whole-brain functional images.
>>
>> After running out of disk space several times, I think my problem is not
>> the number of subjects but the whole-brain images. So the combination of
>> 10 subjects, whole-brain images, pseudo-t statistics (and thus the
>> volumetric evaluation), and collecting supra-threshold statistics with
>> only about 200 permutations leads to an SnPM_ST.mat file which cannot be
>> loaded anymore (the size is about 1 GB, and "load SnPM_ST.mat" fails).
>>
>> Is that just me, or is this reproducible? What are the rules of thumb for
>> data sets that can still be handled?
>>
>> By the way, the whole calculation works for another data set with 10
>> subjects and roughly half-brain images, or with the whole-brain images if
>> I do not choose the volumetric calculation. And it is of course only a
>> problem for the cluster statistics.
>>
>> This happened on a Debian sarge system with 2 GB RAM and 2 GB swap space,
>> under both MATLAB 6.5 and 7.
>>
>> Thanks for any help,
>>
>> Roland
>>
>> --
>> Dr. Roland Marcus Rutschmann
>> <[log in to unmask]>
>> Institute for Experimental Psychology, University of Regensburg
>> Universitätsstraße 31, 93053 Regensburg, Germany
>> Tel: +49 941 943 2533, Fax: +49 941 943 3233
>> http://www.psychologie.uni-regensburg.de/Rutschmann