Hi,
I have posted an updated version of niftimatlib to the SourceForge site:
http://niftilib.sourceforge.net/
This update is based on the SPM8 files; the previous version was based
on SPM5. Thanks to John and the FIL for permitting a copy of the SPM8
I/O files to be placed in niftimatlib!
Thanks for testing, Alle Meije; I believe the updates have resolved
your issue.
Best,
Kate
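
For anyone wanting to try the update, a minimal sketch of loading a 4D file
with the standalone library (assuming niftimatlib-1.0/matlab is on your path;
'fmri4d.nii' is just the example file name from the report below). Reading one
volume at a time via the file_array also keeps memory use low on large data sets:

```matlab
% Minimal sketch, assuming niftimatlib-1.0/matlab is on the MATLAB path.
addpath niftimatlib-1.0/matlab
N  = nifti('fmri4d.nii');   % reads the header; voxel data stay on disk
sz = size(N.dat);           % e.g. [x y z t] for a 4D data set
% Read one 3D volume at a time instead of the whole 4D array,
% so only a single volume is held in memory at once.
for t = 1:sz(4)
    vol = N.dat(:,:,:,t);
    fprintf('volume %d: mean intensity %g\n', t, mean(vol(:)));
end
```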
> Dear all,
>
> When I was playing around with the standalone version of niftimatlib
> (whose matlab/@nifti I took to be the same as the @nifti directory in SPM)
> I got this strange result:
>
>>> rmpath niftimatlib-1.0/matlab
>>> rmpath(genpath('/usr/local/bin/spm8'))
>>> clear classes
>>> clear all
>
>>> addpath(genpath('/usr/local/bin/spm8'))
>>> N=nifti('fmri4d.nii');
>>> n=N.dat(:,:,:,:);
>
>>> rmpath(genpath('/usr/local/bin/spm8'))
>>> clear all
>>> clear classes
>
>>> addpath niftimatlib-1.0/matlab/
>>> N=nifti('fmri4d.nii');
>>> n=N.dat(:,:,:,:);
> ??? Error using ==> file2mat
> Out of memory. Type HELP MEMORY for your options.
>
> Error in ==> file_array.subsref>subfun at 80
> t = file2mat(sobj,varargin{:});
>
> Error in ==> file_array.subsref at 60
> t = subfun(sobj,args{:});
>
> Error in ==> nifti.subsref>rec at 219
> t = subsref(t,subs(2:end));
>
> Error in ==> nifti.subsref at 45
> varargout = rec(opt,subs);
>
> And yes, this was after compiling the MEX files in @file_array/private ...
>
> So: with the SPM version of @nifti, loading a 4D fMRI data set is OK, but
> loading the same data set with the niftimatlib libraries produces an
> out-of-memory error.
> Strikes me as odd. Has anyone else come across this? Are there more recent
> versions of niftimatlib that have solved this (the one on the site says 5
> April 2006)?
>
> Many thanks,
> Alle Meije Wink
>