Dear Kris,
On that line an outer product of size scans x scans is being computed.
At 8 bytes per double-precision value, that amounts to ~2.9GB for your
number of scans. Matlab needs a contiguous piece of memory of that
size, and on top of that a 32-bit machine has an OS-dependent limit of
roughly 2-3.5GB of address space per process, some of which is already
taken up by other variables in the spm computation. So you are out of
luck unless you have a 64-bit processor running the new matlab at
hand.
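As a quick back-of-envelope check of that figure (a sketch in Python rather than Matlab; the scan count comes from Kris's message below):

```python
# Memory needed to hold the scans-by-scans outer product Y*Y'
# accumulated in spm_spm (CY = CY + Y*Y').
n_scans = 24 * 6 * 136           # 24 subjects x 6 runs x 136 images = 19584
bytes_per_double = 8             # Matlab stores values as 8-byte doubles
total_bytes = n_scans ** 2 * bytes_per_double
gib = total_bytes / 2 ** 30
print(f"{n_scans} scans -> {gib:.2f} GiB for the outer product")
```

That works out to roughly 2.9GB in one contiguous allocation, which is already at or beyond the per-process limit of a 32-bit system before any other SPM variables are counted.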
However, with that many subjects, a random-effects analysis is perhaps
the better way to go, both statistically and computationally.
Satra
--
Satrajit Ghosh
Postdoctoral Associate
Speech Communications Group
Research Lab of Electronics, MIT
On Fri, 11 Feb 2005 17:10:09 -0500, Kris Knutson <[log in to unmask]> wrote:
> I'm doing a fixed effects analysis for 24 subjects, with 6 runs per
> subject and 136 images per run, for a total of 19584 scans.
> After SPM runs for a while, it stops midway through spm_spm (line 675)
> at Plane 1 of 35 and block 56 of 282 with an error message of "Out of
> Memory", CY = CY + Y*Y'.
>
> I've had our system administrator increase the swap size three times,
> but I still get the same error at the same point. I've also tried
> "clear" and "pack" commands. Swap size usage does increase rapidly at
> this point, but it doesn't seem to be running out of swap space.
>
> Can matlab handle an analysis this big? Any ideas on what the problem
> is?
>
> Thanks,
>
> Kris Knutson,
> Psychologist
> Cognitive Neuroscience Section
> NINDS, NIH
> 301-402-6920
>