Hi guys,
I’m trying to use ‘fslmerge’ and ‘fslmaths -Tmean’ to generate an average volume across nearly 1100 HCP subjects. This obviously requires a lot of RAM, but I ran into something puzzling that I hope you can shed some light on.
As a test case, to gauge the total memory that I’ll need, I ran the following simple script:
# Merge 101 volumes
fslmerge -t tmpmerge 1[01]*/MNINonLinear/T1w_restore.nii.gz
# Take the mean along the 4th dimension
fslmaths tmpmerge -Tmean tmpmean -odt float
Running that script used a maximum of ~16 GB of memory, even though the tmpmerge.nii.gz file is only ~7 GB. The gunzipped tmpmerge.nii file is ~8 GB, so the RAM required was roughly double the size of the (uncompressed) input to ‘fslmaths’.
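(For reference, something like the following can be used to check the peak memory and the uncompressed size; it assumes GNU time is installed as /usr/bin/time, since the shell built-in ‘time’ doesn’t report memory:)
# Peak resident set size of the fslmaths step
/usr/bin/time -v fslmaths tmpmerge -Tmean tmpmean -odt float 2>&1 | grep "Maximum resident set size"
# Uncompressed size of the merged file, without writing tmpmerge.nii to disk
zcat tmpmerge.nii.gz | wc -c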
Running the two commands separately showed that it is specifically the ‘fslmaths’ command that is the “memory hog” here. Is ‘fslmaths’ pre-allocating memory for an output that is the same size as the input, regardless of the specific operations requested? In this case, the only operation requested is a “-Tmean”, so once the input is loaded, all that should be needed memory-wise is a single additional volume (which would be tiny in the scheme of things).
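(As an aside, for the full ~1100-subject run I could presumably sidestep the big merged file entirely by accumulating a running sum and dividing at the end, roughly along these lines; this is just a sketch, and the glob is a placeholder for the real subject list:)
# Accumulate a running per-voxel sum one subject at a time, then divide by the count
n=0
for f in 1*/MNINonLinear/T1w_restore.nii.gz; do
    if [ $n -eq 0 ]; then
        # Initialise the running sum with the first volume, stored as float
        fslmaths $f runningsum -odt float
    else
        fslmaths runningsum -add $f runningsum -odt float
    fi
    n=$((n+1))
done
# Divide the running sum by the number of subjects to get the mean
fslmaths runningsum -div $n groupmean -odt float
Still, I’d like to understand the memory behaviour described above.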
thanks,
-MH
-----------------------------------------------------------
Conte Center for the Neuroscience of Mental Disorders
Washington University School of Medicine
Department of Psychiatry, Box 8134
660 South Euclid Ave.
Tel: 314-747-6173