Dear Clare:
From what you say, does this mean ~8000 images? That's a lot, but we have
run with as many as 10,000, I believe. I don't know your platform, but the
bottom line is that you should maximize your swap space (at least 1GB; 2GB
would be better) in addition to your RAM.
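For a rough sense of scale, here is a back-of-envelope calculation in
MATLAB (this assumes 64x64 in-plane voxels stored as doubles; your actual
image dimensions may well differ):

    % one plane of every scan, in double precision (illustrative figures)
    nScans     = 40 * 203;      % ~8120 images, per your description
    planeBytes = 64 * 64 * 8;   % bytes for one plane of one scan
    nScans * planeBytes / 2^20  % => ~254 MB for one plane of all scans

So even a single plane of all your scans approaches your 256MB of RAM,
which is why generous swap matters here.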
Increasing maxmem in spm_spm definitely will not help you. This variable
defines how many "planks" of the images can be calculated in memory at
once. Increasing it tends to speed up calculations (assuming that the
information is held in RAM and not continuously swapping out to disk). You
may want to consider decreasing it, though I would try adjusting swap first.
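As a purely illustrative sketch of that idea (this is not SPM's actual
code; the 256MB budget and image dimensions are assumed), the number of
planes processed per pass would be bounded something like:

    % illustrative only -- not SPM's actual code
    maxmem        = 2^28;              % hypothetical 256MB memory budget
    bytesPerPlane = 8120 * 64*64 * 8;  % one plane of every scan, doubles
    planesPerPass = max(1, floor(maxmem / bytesPerPlane))  % => 1 here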
However, what generally fails because of memory in SPM is the application
of the large convolution matrices, which occurs after the images are
processed, and this is not affected by maxmem.
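For a sense of why that step can exhaust memory (a hypothetical worst
case, not SPM's exact storage scheme):

    % a dense matrix over all scans, stored as doubles
    nScans = 8120;
    nScans^2 * 8 / 2^20  % => ~503 MB, well beyond 256MB of RAM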
regards,
Darren
At 01:56 PM 5/25/00 +0100, you wrote:
>I'm trying to specify and estimate a model of an event related paradigm
>with 10 subjects and 4 runs of three conditions per subject, all 203 TRs
>long (therefore 40 sessions each with 2 trials). The program crashes out
>at the initialising design space stage saying that it is out of memory
>(I have 256MB of RAM). I'm wondering whether it's a limitation of the
>program or just of my computer? Will increasing maxmem help?
>
>Thanks,
>Clare.
>
>________________________________________________________________________
>Clare Mackay
>Magnetic Resonance and Image Analysis Research Centre (MARIARC)
>University of Liverpool, PO Box 147, Liverpool, L69 3BX. UK.
>_________________________________________________________________________
>
Darren Gitelman
Northwestern University, Evanston, IL. USA
[log in to unmask]
%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%%