Hi,
I don't know why you would get different behavior with the spm8 vs niftilib
code. I wonder if there is a change in the memory mapping -- are both
MEX files compiled 64-bit?
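One quick way to check the bitness question (a sketch only; the niftimatlib path below is an assumption, adjust it to wherever you unpacked the distribution):

```matlab
% A 64-bit MATLAB reports a 64-bit platform string and expects
% MEX files with a 64-bit extension:
disp(computer)   % e.g. GLNXA64 / PCWIN64 / MACI64 on 64-bit installs
disp(mexext)     % the MEX extension this MATLAB loads, e.g. mexa64

% List the compiled file2mat binaries to confirm their extension
% matches mexext (path is an assumption -- adjust to your install):
dir('niftimatlib-1.0/matlab/@file_array/private/file2mat.mex*')
```

If the extension of the compiled file2mat does not match what `mexext` reports, MATLAB may be falling back to an older binary on the path.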
Back in 2005 or so, when we released niftimatlib, John Ashburner and the FIL
very generously released a copy of the SPM NIfTI I/O code into the public
domain so that we could have an unrestricted distribution for the niftilib
project. I see there have been updates to the @nifti and @file_array files
since then; perhaps these are causing what you see. I will contact John
about getting an updated version of the files posted to niftilib.
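In the meantime, a possible workaround (a sketch, using the same @nifti interface shown in your session; the dimensions are illustrative) is to read the 4-D data one volume at a time rather than in a single subscript, so the largest single request passed to file2mat is one 3-D volume:

```matlab
N = nifti('fmri4d.nii');
dim = size(N.dat);            % e.g. [64 64 36 200] for a 4-D fMRI series
n = zeros(dim, 'single');     % preallocate; single precision halves memory
for t = 1:dim(4)
    % each read maps only one volume through file2mat
    n(:,:,:,t) = N.dat(:,:,:,t);
end
```

This does not explain the difference between the two @nifti versions, but it may let you work with the data until the cause is found.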
best
Kate
> Dear all,
>
> When I was playing around with the standalone version of niftimatlib
> (whose matlab/@nifti I took to be the same as the @nifti directory in SPM)
> I got this strange result:
>
>>> rmpath niftimatlib-1.0/matlab
>>> rmpath(genpath('/usr/local/bin/spm8'))
>>> clear classes
>>> clear all
>
>>> addpath(genpath('/usr/local/bin/spm8'))
>>> N=nifti('fmri4d.nii');
>>> n=N.dat(:,:,:,:);
>
>>> rmpath(genpath('/usr/local/bin/spm8'))
>>> clear all
>>> clear classes
>
>>> addpath niftimatlib-1.0/matlab/
>>> N=nifti('fmri4d.nii');
>>> n=N.dat(:,:,:,:);
> ??? Error using ==> file2mat
> Out of memory. Type HELP MEMORY for your options.
>
> Error in ==> file_array.subsref>subfun at 80
> t = file2mat(sobj,varargin{:});
>
> Error in ==> file_array.subsref at 60
> t = subfun(sobj,args{:});
>
> Error in ==> nifti.subsref>rec at 219
> t = subsref(t,subs(2:end));
>
> Error in ==> nifti.subsref at 45
> varargout = rec(opt,subs);
>
> And yes, this was after compiling the MEX files in @file_array/private ...
>
> So: with the SPM version of @nifti, loading a 4D fMRI data set is OK, but
> loading the same data set with the niftimatlib version produces an
> out-of-memory error.
> Strikes me as odd. Has anyone else come across this? Are there more recent
> versions of niftimatlib that have solved this (the one on the site says 5
> April 2006)?
>
> Many thanks,
> Alle Meije Wink
>