Dear SPM list, Dr. Gitelman,
	I implemented the hack described below, but unfortunately it still didn't
solve my memory problems.  I am trying to run a large fixed-effects analysis
with 32 subjects.  That comes to 24,000 images, and at 130,560 bytes per image
that's over 3 GB of data.  I received the following error message at the start
of the analysis:


--------------------------------------------------------------------------
  Saving SPMstats configuration           :            ...SPMcfg.mat
saved

   Design reporting                        :
  ...done

   SPM99: spm_spm (v2.37)                             03:25:34 -
  26/06/2001


  =======================================================================

   Initialising design space               :
   ...computing??? Error using ==> *
   Out of memory. Type HELP MEMORY for your options.

   Error in ==> /pkg/spm99/spm_filter.m
   On line 140  ==>     y = K{s}.KL*y;

   Error in ==> /pkg/spm99/spm_spm.m
   On line 440  ==> V             = spm_filter('apply',xX.K,KVi');

   Error in ==> /pkg/spm99/spm_fmri_spm_ui.m
   On line 619  ==>  spm_spm(VY,xX,xM,F_iX0,Sess,xsDes);

   ??? Error while evaluating uicontrol Callback.

-------------------------------------------------------------


This isn't surprising since I am using so many images.  I am trying to figure
out whether I have configured something wrong or have truly hit a memory
limitation.  I am running MATLAB Version 5.3.0.10183 (R11) on Solaris 8 on a
SunBlade 1000.  It has 1 GB of physical memory, and I've given it 8 GB of
virtual (swap) memory.  Aside from SPM, I tried to see how much memory MATLAB
could use, and it could only access 2 GB even though I have plenty of virtual
memory left.  It's a 64-bit processor, so I should be able to address more
than 2 GB, and as far as I can tell MATLAB can access as much memory as the
system it runs on.  If anyone has any insight into this problem, it would be
greatly appreciated.
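
The numbers above work out as follows (a rough sketch only; the 2 GB ceiling
would be consistent with MATLAB running as a 32-bit process, which caps its
addressable data space regardless of the 64-bit hardware):

```matlab
% Rough arithmetic for the analysis described above (illustrative only):
nImages  = 24000;                    % images in the fixed-effects analysis
imgBytes = 130560;                   % bytes per image, as reported
totalGB  = nImages * imgBytes / 1e9  % about 3.13 GB of data in total
limitGB  = 2^31 / 1e9                % about 2.15 GB: roughly the most a
                                     % 32-bit process can address for data
```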

Christian DeVita
University of Pennsylvania
Dept. Neurology

>Date: Sat, 26 May 2001 12:20:25 -0500
>From: "Darren R. Gitelman" <[log in to unmask]>
>Subject: Re: problems with memory
>Comments: To: Marta Maieron <[log in to unmask]>, devita <[log in to unmask]>
>To: [log in to unmask]
>
>Dear Marta, Devita and SPM
>
>There are several memory variables to be adjusted and one hack to be 
>implemented to help run large numbers of files in SPM.
>
>1) Adjust the MaxMem variable.
>JOHN:  THIS IS ALSO A PLEA TO MOVE MAXMEM FROM SPM_SPM TO SPM_DEFAULTS SO 
>THAT USERS CAN EASILY  ADJUST THIS VALUE TO DEAL WITH DIFFERENT SIZE 
>ANALYSES. PLEASE  8-)
>
>Anyway, MaxMem is one of the first lines after the help text in
>spm_spm.m. Your current setting tells SPM to load voxels in 512,000 KB chunks 
>(536 MB!). Because many other variables may also be large when there are 
>many images, this is probably too big. I suggest trying 2^24 or even 2^20. 
>Understand that with smaller chunks the analysis will take longer 
>to run. To do this without affecting your site's installation of SPM, do the 
>following:
>         copy spm_spm.m to your home directory (/home/~user or whatever it is)
>         start MATLAB and spm99
>         type: rmpath /home/~user
>                 (This removes your home directory from the MATLAB path.
>                 If your home directory is not on the path you'll get an
>                 error, but don't worry about it.)
>         type: addpath /home/~user
>                 (This adds your home directory back as the first folder on
>                 the MATLAB path. MATLAB will now look in your home directory
>                 first for any function, so it will use your modified version
>                 of spm_spm.m.)
>         type: path
>                 (Check that the path has been modified appropriately.
>                 You should see something like:
>                         /home/~user
>                         /matlabr11/toolbox/spm99
>                         /matlabr11/OtherMatlabDirectories)
>         type: clear functions   (makes sure MATLAB rechecks all paths)
>         edit spm_spm.m and change the MaxMem variable
>         save and close the file
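>
>Collected into one MATLAB session, the steps above look like this (with
>/home/~user standing in for your actual home directory):

```matlab
% Shadow the site-wide spm_spm.m with a locally edited copy.
% (First copy spm_spm.m into your home directory at the shell.)
rmpath /home/~user    % remove home dir from the path (an error here is harmless)
addpath /home/~user   % prepend it, so your copy is found before the site copy
path                  % confirm /home/~user now appears first
clear functions       % force MATLAB to re-resolve all function files
edit spm_spm.m        % then lower the MaxMem value, e.g. MaxMem = 2^24;
```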
>
>2) Don't set up all F-contrasts, particularly if your experiment has many 
>conditions. This will just take a long time to set up and matlab never 
>entirely clears all the memory it uses for this type of operation.
>
>3) Hack: preallocate the XYZ matrix during the analysis. During 
>analysis, spm_spm appends to a matrix called XYZ, a growing list of voxel 
>locations to be saved. As written, spm_spm grows this matrix incrementally, 
>which is much less efficient and much slower than preallocating a matrix of 
>zeros, substituting in the voxel values, and trimming the zeros at the end. I 
>have attached a modified version of spm_spm (version 2.37.1) that I have 
>hacked to do just this. See lines:
>         maxMem               341
>         preallocate XYZ      638
>         sub in voxels       1011
>         trim zeros          1173
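>
>In outline, the change swaps incremental growth for the standard
>preallocate / fill / trim pattern (variable names below are illustrative,
>not the actual spm_spm code):

```matlab
% Slow: the matrix is reallocated and copied on every append
XYZ = [];
for i = 1:nVox
    XYZ = [XYZ, xyz(:,i)];
end

% Fast: allocate once, fill in place, trim unused columns at the end
XYZ = zeros(3, nVoxMax);    % preallocate an upper bound of voxel slots
n   = 0;
for i = 1:nVox
    n = n + 1;
    XYZ(:,n) = xyz(:,i);    % substitute the voxel coordinates
end
XYZ = XYZ(:, 1:n);          % trim the unused zero columns
```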
>
>The modification does not change any of the analysis (as far as I can tell) 
>so it can either be substituted for the current version of spm_spm.m or 
>placed in your home folder just as noted above. We have been able to run 
>analyses of > 8000 images (64x64x32) using these strategies.
>
>Of course all the usual disclaimers apply to this modified version of 
>spm_spm: that there is no warranty or support, that it might give you 
>incorrect results, or that it might destroy your hard drive, your computer 
>or your life.
>
>Regards,
>Darren
>
>
>
>At 03:40 PM 5/25/2001 +0200, you wrote:
>>Dear SPMer's,
>>  I wrote some time ago but I haven't solved my problem.
>>I think that my problem is related to the fact that I have a big single
>>session with 3900 volumes.
>>Is it possible that my "out of memory" problems derive from this?
>>
>>My MaxMem variable is currently set to 2^29.
>>Can I change the memory for a single session?
>>
>>
>>  I have a single session of 3600 images (8192 bytes each).
>>  The subject did simple finger tapping with both hands.
>>  I've built a design matrix and estimated a model with it.
>>
>>  SPM reads all the images, 3600/3600, but it is not able to complete the
>>  estimation. I get the following error:
>>
>>
>>  Saving SPMstats configuration           :            ...SPMcfg.mat
>>saved
>>
>>   Design reporting                        :
>>  ...done
>>
>>   SPM99: spm_spm (v2.37)                             03:25:34 -
>>  24/03/2001
>>
>>
>>  =======================================================================
>>
>>   Initialising design space               :
>>   ...computing??? Error using ==> *
>>   Out of memory. Type HELP MEMORY for your options.
>>
>>   Error in ==> C:\SPM99\spm99_updates\spm_filter.m
>>   On line 140  ==>     y = K{s}.KL*y;
>>
>>   Error in ==> C:\SPM99\spm_spm.m
>>   On line 440  ==> V             = spm_filter('apply',xX.K,KVi');
>>
>>   Error in ==> C:\SPM99\spm_fmri_spm_ui.m
>>   On line 619  ==>  spm_spm(VY,xX,xM,F_iX0,Sess,xsDes);
>>
>>   ??? Error while evaluating uicontrol Callback.
>>
>>
>>  Is it possible that this error is related to the variable that sets the
>>  memory per session?
>>  Can anybody tell me where I can set this variable in order to extend my
>>  session memory?
>>
>>  Thanks
>>
>>  Marta
>>
>>
>>======================================
>>
>>Dr. Marta Maieron
>>Dip. Scienze e Tecnologie Biomediche
>>Università di Udine
>>Piazzale Kolbe, 4
>>33100 Udine
>>Italy
>>
>>Phone:   +39-0432-494360
>>Fax:     +39-0432-494301