Dear Christian,
> Dear SPM, Dr. Gitelman, I implemented the hack described below and unfortunately it still didn't
> solve my memory problems. I am trying to run a large fixed-effects analysis
> with 32 subjects. This comes to 24,000 images, and at 130,560 bytes per image
> that's over 3 GB of data. I received the following error message at the start
> of the analysis:
>
> --------------------------------------------------------------------------
> Saving SPMstats configuration : ...SPMcfg.mat
> saved
>
> Design reporting :
> ...done
>
> SPM99: spm_spm (v2.37) 03:25:34 -
> 26/06/2001
>
> =======================================================================
>
> Initialising design space :
> ...computing
> ??? Error using ==> *
> Out of memory. Type HELP MEMORY for your options.
>
> Error in ==> /pkg/spm99/spm_filter.m
> On line 140 ==> y = K{s}.KL*y;
>
> Error in ==> /pkg/spm99/spm_spm.m
> On line 440 ==> V = spm_filter('apply',xX.K,KVi');
>
> Error in ==> /pkg/spm99/spm_fmri_spm_ui.m
> On line 619 ==> spm_spm(VY,xX,xM,F_iX0,Sess,xsDes);
>
> ??? Error while evaluating uicontrol Callback.
>
> -------------------------------------------------------------
>
> This isn't surprising, since I am using so many images. I am trying to figure
> out if I have configured something wrong or if I have truly hit a memory
> limitation. I am running MATLAB Version 5.3.0.10183 (R11) on Solaris 8 on a
> SunBlade 1000 machine. It has 1 GB of physical memory and I've given it 8 GB
> of virtual (swap) memory. Aside from SPM, I tried to see how much memory MATLAB
> could use, and it could only access 2 GB even though I have plenty of virtual
> memory left. It's a 64-bit processor, so I should be able to address more than
> 2 GB, and as far as I can tell MATLAB can use as much memory as the system it
> runs on can provide. If anyone has any insight into this problem, it would be
> greatly appreciated.
Although I can't solve your problem, I can provide you with some
insight. What SPM99 was trying to do when it crashed was multiply
two sparse but rather huge matrices. The problem arises in the
second of these two lines in spm_spm:

KVi = spm_filter('apply',xX.K, Vi);
V = spm_filter('apply',xX.K,KVi');

In the first line, SPM filters the estimated (sparse) covariance matrix
Vi (usually just the identity matrix) with your bandpass filter matrix
K; the result is the matrix KVi. I simulated this with typical filter
values: given 24,000 images, the sparse identity matrix Vi uses only
384,004 bytes, which is fine, but KVi uses more than 200 MB, and while
computing KVi MATLAB needed well over 200 MB of working memory. I
couldn't determine the space needed for V, because the necessary
memory seems to be simply enormous.
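To see why Vi stays tiny while KVi and V grow, here is a small sketch (in Python/SciPy rather than MATLAB, purely as an illustration; the banded matrix K below is a made-up stand-in for SPM's filter matrix, and n is scaled down from 24,000). The sparse identity stores one nonzero per scan, but each filtering step widens the band of nonzeros:

```python
import numpy as np
from scipy import sparse

n = 2000          # number of scans (scaled down from 24,000)
bandwidth = 50    # made-up half-width; stands in for SPM's filter matrix

# Vi: sparse identity -> only n stored nonzeros
Vi = sparse.identity(n, format="csr")

# K: a banded "filter" matrix (hypothetical values, not SPM's KL)
offsets = list(range(-bandwidth, bandwidth + 1))
data = np.full((len(offsets), n), 1.0 / len(offsets))
K = sparse.dia_matrix((data, offsets), shape=(n, n)).tocsr()

KVi = K @ Vi       # first spm_filter call: K * I = K, still banded
V = K @ KVi.T      # second call: band roughly doubles -> many more nonzeros

print("nnz(Vi)  =", Vi.nnz)
print("nnz(KVi) =", KVi.nnz)
print("nnz(V)   =", V.nnz)
```

Each multiplication by the banded matrix widens the band, so the stored nonzeros (and hence memory) jump by orders of magnitude between Vi and V, which mirrors the 384 KB versus 200+ MB figures above.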
The problem arises from the not-so-sparse nature of the 24,000 x 24,000
matrices KVi and V. For smaller matrices (fewer scans) everything seems
to work fine, but the memory requirement grows quadratically with the
number of scans, and 24,000 is presumably more than one can handle with
a few gigabytes of memory. Note that even if your computer used the
8 GB of swap space, this wouldn't be a good solution, because your
machine would take forever trying to fake 9 GB of main memory instead
of just one. I recall that a good working ratio is 2:1, i.e. twice as
much swap space as main memory, but I might be wrong here. Another
point is that even if you were able to complete your analysis, your
result files would be huge and any operation on them would be tedious.
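The quadratic growth is easy to put in numbers. As a back-of-the-envelope bound, a fully dense n x n matrix of 8-byte doubles needs 8*n^2 bytes (the filtered matrices are not fully dense, but they head in that direction):

```python
# Worst-case memory for a dense n x n matrix of 8-byte doubles
for n in (500, 2000, 24000):
    gib = 8 * n ** 2 / 2 ** 30
    print(f"n = {n:6d}: {gib:8.3f} GiB")
```

At 500 scans the dense bound is a few megabytes; at 24,000 scans it is over 4 GiB per matrix, well beyond 1 GB of physical memory.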
The only solution I can see is to split your analysis into
single-subject analyses and implement a mixed-effects analysis. Another
'solution' is to use different filter values for K to reduce the size
of KVi and V, but that would mean choosing a different temporal filter
purely because of technical hardware constraints.
Stefan
--
Stefan Kiebel
Functional Imaging Laboratory
Wellcome Dept. of Cognitive Neurology
12 Queen Square
WC1N 3BG London, UK
Tel.: +44-(0)20-7833-7478
FAX : -7813-1420
email: [log in to unmask]