Hello all,
I have two runs of normalized and smoothed data for 36 subjects, with about 2400 scans per subject in total (TR = 0.46 s). The SPM.mat file is 1.06 GB (I'm only looking at four short-duration conditions at the moment). The system is a new 64-bit desktop with 64 GB of RAM (3000 MHz).

I've been hitting a wall at the model estimation stage with an "out of memory" error. I've played with the default maxmem setting, from 2^30 (1 GB) up to 2^35 (32 GB), but to no avail (the chunk size does change). I even removed about eight additional subjects, but I get the same error.

Watching the resources in real time, I see memory usage climb to around 50 or 51 GB and plateau for a while. I see no spike above that point, though maybe the resource monitor's sample rate is too slow to catch a transient one. At some point the usage drops off and the error appears.

I'd love to be able to run this on my lab computer, if at all possible. Do I just need more memory, or is there some other setting I can tweak? Does it sound like there's a problem with my data? Any advice would be greatly appreciated.
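In case it matters, this is roughly how I've been changing the maxmem setting before launching the estimation (something along these lines, via spm_get_defaults in SPM12; the values are just the ones I mentioned above):

    % Illustrative only -- raise the per-chunk memory limit before estimating
    spm('defaults', 'FMRI');                  % load the standard fMRI defaults
    spm_get_defaults('stats.maxmem', 2^35);   % upper bound on data loaded per chunk (~32 GB here)
    % ...then run the model estimation batch as usual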
Take care,
Carl