I have had SPM99 memory problems, and wondered if anyone with similar problems had
managed to resolve them.
Basically, I am applying a fixed effects analysis to a large data set (1350 images
per subject, 8 subjects). The design matrix is also rather large (for each subject,
7 basis functions per event type, 6 event types and 6 confounds, in an event-related fMRI study).
SPM99 is running on a PC Linux box with 1GB RAM and plenty of hard disk space. I think
the swap space is set to about 500MB.
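(I say "I think" because I have not actually confirmed the swap figure; for anyone in the same position, the standard Linux tools report it directly — this is just the usual diagnostic, nothing SPM-specific:)

```shell
# Report total/used RAM and swap, in MB:
free -m
# List the active swap devices/files (read from /proc/swaps):
swapon -s
```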
Initially, I ran the analysis using the default maxMem setting of 2^20 and the analysis
proceeded very slowly (about 14 days before reaching 100%). On reaching 100%, it crashed
at the very last stage, well before it could save SPM.mat. It gave an out of memory error.
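In case it helps anyone advise me, the change I was considering was simply to raise the limit in the SPM99 MATLAB code — though I am not certain this is the right place, so treat the file and variable name below as my assumptions rather than gospel:

```matlab
% Assumption: maxMem (in spm_spm.m or spm_defaults.m) caps how much image
% data SPM99 loads per pass. Raising it from the 2^20 (1MB) default should
% let the analysis use more of the 1GB of RAM per block:
maxMem = 2^28;   % allow ~256MB blocks instead of 1MB
```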
i) I would welcome any suggestions for an appropriate maxMem setting, and also any
thoughts on whether the problem is more likely to be related to the swap space settings.
ii) If the memory problem is insurmountable, I would need to conduct group analyses
in another way, ideally making use of first-level results from single subject analyses
in a second level analysis. How is this best achieved, given that I need to apply F
contrasts (I am using several basis functions per event type), and so cannot generate
a single contrast image per subject to carry forward in the usual way?
Best wishes for the New Year!