Hi,
I recorded some high-resolution fMRI data from a 30-minute experiment. The
image is 160x160 in-plane with 25 slices and 902 volumes. The raw data
is about 600 MB.
I ran a first-level FEAT analysis on it with no filtering and no
corrections, and it went fine.
But when I enabled slice-timing correction, FEAT exits with:
<snip>
/usr/share/fsl/bin/betfunc prefiltered_func_data_st prefiltered_func_data_bet
Aaargh not enough memory!
** ERROR: nifti_image_read(prefiltered_func_data_bet_tmp_brain_mask): can't open header file
** ERROR: nifti_image_open(prefiltered_func_data_bet_tmp_brain_mask): bad header info
Error: failed to open file prefiltered_func_data_bet_tmp_brain_mask
Error:: FslGetIntensityScaling: Null pointer passed for FSLIO
Aaargh not enough memory!
The funny thing is that prefiltered_func_data_st.nii.gz is 1.1 GB (2.2
GB uncompressed), due to its datatype being FLOAT32 instead of INT16. The
machine I'm using is a four-core Opteron workstation running 64-bit Debian
etch with 12 GB of RAM -- so this should not be a problem. I'm running the
Debian package of FSL 3.3.11.
Unfortunately, I also cannot send a header dump of this file, because avwhd
says:
** ERROR: nifti_image_read(prefiltered_func_data_st): bad data file
** ERROR: nifti_image_open(prefiltered_func_data_st): bad header info
Error: failed to open file prefiltered_func_data_st
ERROR: Could not open file
The same is true for most other avwtools (including avwcheck).
I thought that 'ip' might have corrupted the data file, but I can easily
open it with PyNIfTI. The fully loaded image consumes about 20% of that
machine's memory (without touching any swap space).
Here is the PyNIfTI header dump (not as complete as the avwhd one, but
all I can get):
In [2]: nim=nifti.NiftiImage('prefiltered_func_data_st.nii')
In [3]: nim.header
Out[3]:
{'aux_file': '',
'bitpix': 32,
'cal_max': 0.0,
'cal_min': 0.0,
'datatype': 16,
'db_name': '',
'descrip': 'FSL3.3',
'dim': [4, 160, 160, 25, 902, 1, 1, 1],
'dim_info': '\x00',
'extents': 0,
'glmax': 0,
'glmin': 0,
'intent_code': 0,
'intent_name': '',
'intent_p1': 0.0,
'intent_p2': 0.0,
'intent_p3': 0.0,
'magic': 'n+1',
'pixdim': [1.0, 1.375, 1.375, 1.540001392364502, 2000.0, 1.0, 1.0, 1.0],
'qform_code': 1,
'qoffset': [118.66303253173828, 95.945091247558594, 35.716033935546875],
'quatern': [5.5377973911865871e-19,
-0.23259672522544861,
0.97257328033447266],
'regular': 'r',
'scl_inter': 0.0,
'scl_slope': 0.0,
'session_error': 0,
'sform': array([[ -1.37500000e+00, -3.54220219e-19, 1.65886295e-18,
1.18663033e+02],
[ -3.54220219e-19, -1.22622156e+00, -6.96750104e-01,
9.59450912e+01],
[ 1.48112624e-18, -6.22097731e-01, 1.37336946e+00,
3.57160339e+01],
[ 0.00000000e+00, 0.00000000e+00, 0.00000000e+00,
1.00000000e+00]]),
'sform_code': 1,
'sizeof_hdr': 348,
'slice_code': '\x00',
'slice_duration': 0.0,
'slice_end': 0,
'slice_start': 0,
'toffset': 0.0,
'vox_offset': 352.0,
'xyzt_units': '\n'}
What could be the cause of the problem? Is there a general limitation on
the size of a NIfTI file?
My wild guess is that this is somehow related to the fslio library, as
most avwtools are affected, while libniftiio (which is used by PyNIfTI)
has no problems with files of that size.
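Incidentally, a quick back-of-the-envelope check (using only the dim[] and bitpix values from the header dump above; the 2 GiB threshold is my assumption about where a signed 32-bit size field could overflow) shows the uncompressed data just crosses that boundary:

```python
# Quick sanity check: does the uncompressed image data exceed the
# signed 32-bit (2 GiB) limit?  Dimensions and bitpix are taken from
# the PyNIfTI header dump above.
nx, ny, nz, nt = 160, 160, 25, 902   # dim[1..4]
bytes_per_voxel = 32 // 8            # bitpix = 32 (FLOAT32)

data_bytes = nx * ny * nz * nt * bytes_per_voxel
print(data_bytes)           # 2309120000 bytes (~2.15 GB)
print(data_bytes > 2**31)   # True -> would overflow a signed 32-bit size
```

So if any tool in the chain stores the data size in a signed 32-bit integer, this file would be just large enough to trip it, which would fit the symptoms.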
Any help is very much appreciated.
Thanks,
Michael
--
GPG key: 1024D/3144BE0F Michael Hanke
http://apsy.gse.uni-magdeburg.de/hanke
ICQ: 48230050