Hello,
I am running Melodic Group ICA
>> melodic -i MelodicInput2013-04-17.txt -a concat -o Melodic2013-04-20.ica --report --tr=2 -d 30 --verbose
on a Linux (CentOS) machine with 32GB RAM and 39GB swap, 2TB hard drive.
The data files are large: each resting-state functional image (resampled at 2mm) is roughly 1GB, and I am analyzing 90 subjects, so the full dataset is about 90GB.
The last lines of output before the segmentation fault are:
Step no. 34 change : 8.51335e-05
Step no. 35 change : 4.81112e-05
Convergence after 35 steps
Sorting IC maps
Writing results to :
Melodic2013-04-20.ica/melodic_IC
Melodic2013-04-20.ica/melodic_Tmodes
Melodic2013-04-20.ica/melodic_mix
Melodic2013-04-20.ica/melodic_FTmix
Melodic2013-04-20.ica/melodic_ICstats
Melodic2013-04-20.ica/mask
...done
Creating report index page ...done
Running Mixture Modelling on Z-transformed IC maps ...
IC map 1 ...
calculating mixture-model fit
re-scaling spatial maps ...
thresholding ...
alternative hypothesis test at p > 0.5
creating report page ... Segmentation fault
When watching the memory usage (via top), the RAM gets maxed out, but the swap appears to be barely used (no more than 7 MB at any time while melodic is running).
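For what it's worth, here is roughly how I have been watching the memory (a sketch of standard Linux tools, not anything melodic-specific; free reports in MB with -m, and vmstat's si/so columns show swap-in/swap-out activity):

```shell
# Snapshot of total/used/free RAM and swap, in MB
free -m

# Sample memory and swap activity once per second, 5 samples;
# nonzero si/so columns would indicate swap is actually being used
vmstat 1 5
```

The si/so columns staying at zero is consistent with what I see in top: the process dies before the kernel pushes much of it out to swap.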
I am planning to upgrade to a 64GB RAM system to attempt to fix the issue (assuming it is due to memory limitations). However, I am concerned that 64GB of RAM may still not be enough to model the 90GB of data.
Is there any way that I can make the melodic program use more of the swap / virtual memory?
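One thing I did check (in case it is relevant to others with the same problem): whether a shell resource limit was capping the process's address space before swap could be used. This is just generic shell diagnostics, not an FSL setting:

```shell
# Show the per-process virtual memory limit for this shell;
# "unlimited" means no cap is imposed by ulimit
ulimit -v

# Show all resource limits for completeness
ulimit -a
```

In my case everything relevant reports "unlimited", so I don't think a ulimit is the cause.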
Any other suggestions?
Thanks,
Katie