Hi Lucas,
the error occurs while writing the bias-corrected normalized image. Usually the most
memory-demanding function is the MRF correction. I assume that your spatial resolution
is < 1mm and that you have insufficient memory for these large images. You can try
deselecting the writing option for the bias-corrected image and writing the segmentations
only. If you still need the bias-corrected images, you can write them in a second run
without segmentations.
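
As a minimal sketch of the two-run workaround (hypothetical file names; this assumes
you have saved one batch job per subject from the VBM5 GUI, and that memory is freed
between single-subject runs):

    % Run each subject's saved segmentation job separately so that only
    % one image is held in memory at a time.
    job_files = {'vbm_sub01_job.mat', 'vbm_sub02_job.mat'};  % one saved job per subject
    for i = 1:numel(job_files)
        spm_jobman('run', job_files{i});   % execute a single-subject job
        clear classes                      % release memory before the next subject
    end

The same loop can be used for the second pass that writes only the bias-corrected
images, once the segmentations are done.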
Best regards,
Christian
--
____________________________________________________________________________
Christian Gaser, Ph.D.
Assistant Professor of Computational Neuroscience
Department of Psychiatry
Friedrich-Schiller-University of Jena
Jahnstrasse 3, D-07743 Jena, Germany
Tel: +49-3641-934752 Fax: +49-3641-934755
e-mail: [log in to unmask]
http://dbm.neuro.uni-jena.de
On Mon, 18 May 2009 08:42:10 +0200, Lucas Eggert <[log in to unmask]> wrote:
>Dear Experts,
>
>I am trying to run a VBM analysis on a sample of 70 subjects.
>
>I am using the default parameters of the VBM toolbox (version 1.18) and
>regularly get the following error message:
>
>Error running job: Error using ==> round
>Out of memory. Type HELP MEMORY for your options.
>In file "/usr/local/matlab/toolbox/spm5/toolbox/vbm5/cg_vbm_write.m"
>(v716), function "vbm_apply" at line 322.
>In file "/usr/local/matlab/toolbox/spm5/toolbox/vbm5/cg_vbm_write.m"
>(v716), function "cg_vbm_write" at line 35.
>In file "/usr/local/matlab/toolbox/spm5/toolbox/vbm5/cg_config_vbm.m"
>(v424), function "execute_estwrite" at line 729.
>--------------------------
>Done.
>
>Sometimes it helped to run the analysis on only two or three data sets at
>a time, instead of trying to process all in one run. But sometimes
>particular data sets also produce this error on their own.
>
>I am using spm5 on Matlab R2009a on Xubuntu 9.04.
>
>Any help would be very much appreciated.
>
>All the best,
>-Lucas