In its current implementation, spm_mfx is very memory-hungry, so I am not too surprised. I computed the results displayed on slide 18 (right) a few years ago:
https://www.fil.ion.ucl.ac.uk/spm/course/slides16-oct/04_Group_Analysis.pptx
I don't remember the exact details, but it was perhaps half the number of subjects and half the number of scans compared to your dataset, and it ran in about a week on a computer with 128GB of RAM, giving nearly identical results to the summary-statistics approach, which runs in seconds. So the main question, before trying to get access to a computer with more RAM, is whether you actually need to run a full mixed-effects model at all. The summary-statistics approach is surprisingly robust to common violations of its underlying assumptions.
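For scale: a full mixed-effects model has to handle all 31 × ~700 ≈ 21,700 first-level scans at once, whereas the summary-statistics approach reduces each subject to a single contrast image and then runs a voxel-wise one-sample t-test across the 31 images. A minimal sketch of that second-level test in Python, assuming per-subject contrast images called con_0001.nii (the paths and layout are illustrative, not from this thread):

import numpy as np
import nibabel as nib
from scipy import stats

subjects = [f"sub-{i:02d}" for i in range(1, 32)]    # 31 subjects, as in the question
con_paths = [f"{s}/con_0001.nii" for s in subjects]  # hypothetical file layout

# Stack the first-level contrast images: one value per subject per voxel.
data = np.stack([nib.load(p).get_fdata() for p in con_paths], axis=0)

# One-sample t-test against zero at every voxel (the usual second-level test).
t_map, p_map = stats.ttest_1samp(data, popmean=0.0, axis=0)

# Save the t-map using the geometry of the first image.
ref = nib.load(con_paths[0])
nib.save(nib.Nifti1Image(t_map.astype(np.float32), ref.affine), "group_tmap.nii")

This is why it runs in seconds: the group model only ever sees 31 numbers per voxel.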
Or the assumption that the first-level residuals are ~N(0, σ²I) is simply not violated in the first place (the newish AR(1)+white-noise model does a good job), in which case checking the residuals should be computationally much cheaper than fitting the full mixed-effects model, as sketched below.
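As a rough sketch of such a check (assuming the first-level residual time series have been written out, e.g. as a Res_0001.nii ... series; the file names and the voxel-sampling scheme below are illustrative assumptions), one could look at the lag-1 autocorrelation and a normality test at a few hundred in-mask voxels:

import glob
import numpy as np
import nibabel as nib
from scipy import stats

res_files = sorted(glob.glob("Res_*.nii"))
res = np.stack([nib.load(f).get_fdata() for f in res_files], axis=-1)  # x, y, z, t

# Sample a few hundred in-mask voxels rather than testing all of them.
mask = np.std(res, axis=-1) > 0
idx = np.argwhere(mask)
rng = np.random.default_rng(0)
sample = idx[rng.choice(len(idx), size=min(500, len(idx)), replace=False)]

lag1, shapiro_p = [], []
for x, y, z in sample:
    ts = res[x, y, z, :]
    ts = (ts - ts.mean()) / ts.std()
    # Lag-1 autocorrelation: should be near zero if the AR(1)+white-noise
    # pre-whitening did its job.
    lag1.append(np.corrcoef(ts[:-1], ts[1:])[0, 1])
    # Shapiro-Wilk tests for departure from normality.
    shapiro_p.append(stats.shapiro(ts).pvalue)

print(f"median lag-1 autocorrelation: {np.median(lag1):.3f}")
print(f"fraction of voxels with Shapiro-Wilk p < 0.05: {np.mean(np.array(shapiro_p) < 0.05):.2f}")

Loading ~700 residual volumes takes a few GB at most, which is far cheaper than the full mixed-effects estimation.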
On 05/03/2020 19:05, Mark Orloff wrote:
> Hi,
>
> I am trying to run a mixed-effects model with 31 subjects, ~700 volumes each, 6 conditions (×3, including time/dispersion derivatives), and 6 motion regressors, on a cluster with 196GB of available RAM. I've confirmed that all 196GB of memory is used up by the model estimation process. Is it possible I've specified something incorrectly, or is this level of memory usage to be expected for the way the model is set up? If everything is specified correctly I can move over to a high-memory cluster, but the usage seemed higher than I would have expected, so I wanted to check first.
>
> Thanks,
> Mark
>
--
Guillaume Flandin, PhD
Wellcome Centre for Human Neuroimaging
UCL Queen Square Institute of Neurology
London WC1N 3BG