Dear Richard
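Some quick arithmetic suggests why the data-space limit bites (these
are my numbers, so treat this as a sketch). With 8 subjects x 2
sessions x 180 scans, spm_spm is working with 2880 x 2880 scan
covariance matrices, and the line that fails forms several temporaries
of that size at once:

    nScan = 8*2*180;       % 2880 scans in the full design
    bytes = nScan^2 * 8;   % one double-precision matrix: ~66MB
    % V.*V, V.*MV and MV'.*MV each need a further ~66MB temporary,
    % so that one line can easily exceed a ~250MB data-space limit.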
If you have 8 subjects each contributing 2 fMRI sessions, you could create
adjusted mean images for each condition (using the random effects model).
This would reduce your data points to 8 x the number of conditions. The
analysis will run faster and you shouldn't run out of memory.
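As a concrete (and untested) sketch of what I mean, assuming the
session- and condition-specific effects have already been written out
as Analyze images (the filenames below are made up, so substitute your
own), you could average them with SPM's image I/O routines:

    % Average the two session images per subject for one condition.
    for s = 1:8
        V1 = spm_vol(sprintf('subj%02d_sess1_cond1.img', s));
        V2 = spm_vol(sprintf('subj%02d_sess2_cond1.img', s));
        Y  = (spm_read_vols(V1) + spm_read_vols(V2)) / 2;
        Vo = V1;            % reuse the header of the first image
        Vo.fname = sprintf('subj%02d_mean_cond1.img', s);
        spm_write_vol(Vo, Y);
    end

The 8 mean images per condition then go into a second-level model. The
ImCalc button will do the same voxel-wise averaging through the
interface, if you prefer.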
Best wishes
Cathy
>When estimating the parameters in an fMRI analysis using SPM99b
>we're consistently running out of memory. The analysis
>involves 8 subjects, each with two sessions of 180 scans.
>
>The error is:
>
>??? Error using ==> .*
>Out of memory. Type HELP MEMORY for your options.
>
>Error in ==> /usr/local/spm99b/spm_SpUtil.m
>On line 470 ==> trRVRV = sum(sum(V.*V)) - 2*sum(sum(V.*MV)) + sum(sum(MV'.*MV));
>
>Error in ==> /usr/local/spm99b/spm_spm.m
>On line 897 ==> [xX.trRV,xX.trRVRV] ...  %-Variance expectations
>
>Unfortunately the process churns for about 7 hours before dying --
>after saying it's 100% complete, but without writing out its results.
>
>We're using a DEC Alpha with 256MB RAM and another 1GB of swap.
>From watching the swap space it's clear that we're not running
>out of swap, so the problem appears to be elsewhere. In fact,
>it looks like it's related to the process limits set in the
>kernel. Currently our maximum data space is set to ~250MB, but
>perhaps this isn't sufficient? I don't understand why this
>should be a problem -- aren't people running similar analyses
>on Sun Ultras with only 128MB of RAM? Could the problem be
>something else?
>
>Any help would be greatly appreciated.
>
>Richard Russell
>Joe Devlin
>
>Centre for Speech and Language
>Cambridge University