Hello,
We have been processing large datasets, comparing 8 runs in a second-level
analysis. The analysis would not complete on either a Windows XP machine
with 1.5 GB RAM or a Gentoo Linux 2.6.x machine with 3.2 GB RAM and 512 MB
swap. We increased the swap space to 244 GB, and the run completed without
problem. We also noticed that RAM and swap never appeared to be fully
utilized under Linux: usage never exceeded 50% of RAM, and the process
never seemed to touch swap at all. It is as if FEAT checks for some amount
of free memory, perhaps 200% of what it actually needs, and simply refuses
to proceed if it is not there. One hypothesis we came up with is that the
memory is needed for a period shorter than the granularity of our
monitoring (once every two seconds), so we never see the usage spike.
Without increasing the sampling rate or looking into the code, we cannot
assess this possibility.
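One way to test that hypothesis without touching the FEAT code is to poll
/proc at a much higher rate than our two-second monitor. A minimal sketch
(the 0.1-second interval and the fields sampled are assumptions; in
practice you would point it at the PID of the running FEAT or contrast_mgr
process rather than the shell's own):

```shell
#!/bin/sh
# Hypothetical high-rate memory sampler: read VmPeak and VmRSS from
# /proc/<pid>/status every 0.1 s so a short-lived spike is not missed.
# Here we watch this shell itself ($$) purely for demonstration; replace
# pid with the PID of the FEAT process under test.
pid=$$
i=0
while [ $i -lt 5 ]; do
    grep -E 'VmPeak|VmRSS' /proc/$pid/status
    sleep 0.1
    i=$((i + 1))
done
```

Since VmPeak is the high-water mark rather than an instantaneous reading,
even a single sample taken after the run would reveal a transient spike.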
The bottom line is that a pure Linux system is superior to a Cygwin *nix
emulation layer, and that throwing more swap at the problem is cheaper
than adding RAM. Note also that most commodity motherboards cannot handle
more than 4 GB of RAM, so swap is the only alternative if you have lots of
data (basically any average study).
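For anyone wanting to go the same route, swap can be grown with an
ordinary swap file; a rough sketch, assuming root access and that the path
/swapfile and a 32 GB size suit your system (scale the count to your
data):

```shell
# Create and enable a swap file (run as root). The path /swapfile and
# the 32 GB size are illustrative, not a recommendation.
dd if=/dev/zero of=/swapfile bs=1M count=32768
chmod 600 /swapfile       # swap files should not be world-readable
mkswap /swapfile          # write the swap signature
swapon /swapfile          # enable it immediately
swapon -s                 # list active swap areas to confirm
```

Add an entry to /etc/fstab if the swap file should survive a reboot.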
Chuck
At 07:25 AM 9/9/2005, Kaufman, Galen D. wrote:
>We have failed to process fMRI data using FSL FEAT in blocks of 300
>volumes on a Windows XP system with 2 GB RAM and 3 GHz processor. Each
>attempt results in errors such as the following:
>-------------------------------------------------------------------------
>
>/usr/local/fsl/bin/contrast_mgr stats design.con
>
>** ERROR: nifti_image_read(stats/sigmasquareds): can't open header file
>
>** ERROR: nifti_image_open(stats/sigmasquareds): bad header info
>
>Error: failed to open file stats/sigmasquareds
>
>Error:: FslGetDim: Null pointer passed for FSLIO
>
>Rendering using zmin=2.3 zmax=8
>
>mkdir tsplot
>
>/usr/local/fsl/bin/tsplot . -f filtered_func_data -o tsplot
>
>** ERROR: nifti_image_read(./stats/pe1): can't open header file
>
>** ERROR: nifti_image_open(./stats/pe1): bad header info
>
>Error: failed to open file ./stats/pe1
>
>ERROR: Could not open image ./stats/pe1
>
>Image Exception : #22 :: Failed to read volume ./stats/pe1
>------------------------------------------------------------------------
>We were able to improve the number of volumes slightly with more memory
>(up to about 120), but when the memory performance is monitored the memory
>does not appear to be fully utilized. We've tried this on multiple
>computers, including one that had a fresh installation of Windows XP
>professional. In order to perform this analysis, we would like to know
>whether anyone has processed this many volumes (or more) with FSL on a
>similar system running Linux. Is this issue exclusive to Windows (FSL
>does not appear to integrate with the Windows file structure very well),
>or is there some other issue?
>
>Thanks very much,
>
>Galen Kaufman and Michael Shinder
>UTMB in Galveston
Chuck Theobald
System Administrator
The Robert and Beverly Lewis Center for Neuroimaging
University of Oregon
P: 541-346-0343
F: 541-346-0345