hi everybody,
I run regressions of individual parameters (subject's age, etc.) against
structural images (like MT ratio or anisotropy index) on groups of 20-60 subjects.
Normally the regressions run fine (I am using Bas Neggers' scripts with some
modifications), and I can include normalised and smoothed MTr images (or
anisotropy-index images) in the analysis. The script has always worked.
Then I decided to do an amplitude normalization on the images, to get rid of
individual differences in mean grey value. I wrote a script that uses
spm_imcalc to normalize each image by its global mean and standard deviation
(spm_imcalc itself works plane by plane):
%-------
Y  = spm_read_vols(Vin(n));
MW = nanmean(Y(:));    % global mean over all voxels, ignoring NaNs
SD = nanstd(Y(:));     % global std. dev. over all voxels, ignoring NaNs
Vout = spm_imcalc(Vin(n), Vout, ...
    '((i1-MW)./SD).*ImageStdDev + ImageMean', '', ...
    MW, SD, ImageMean, ImageStdDev);
% ImageMean, ImageStdDev are the target mean/std, normally 0 and 1.
%----------
(You can pass additional variables to spm_imcalc; they are then available
inside the expression.)
I add a capital 'A' in front of the image name to mark the amplitude
normalization (i.e. swn_subject_MTratio.img -> Aswn_subject_MTratio.img).
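For illustration, a rough sketch of how the output volume struct could be set
up (the explicit float datatype and unit scalefactor here are assumptions
about what should be written, not necessarily what my script actually sets):
%-------
[pth, nam, ext] = fileparts(Vin(n).fname);
Vout        = Vin(n);                        % copy the input header
Vout.fname  = fullfile(pth, ['A' nam ext]);  % swn_... -> Aswn_...
Vout.dim(4) = spm_type('float');             % assumed: force float output
Vout.pinfo  = [1; 0; 0];                     % assumed: scalefactor 1, offset 0
%----------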
These images look fine, and there is no (obvious) problem with them in MRIcro.
And now comes the weird stuff:
My regression scripts crash with an out-of-memory error if I use the
normalized 'A' images, but NOT with the original images:
%-----------
...
Plane 171/171, block 1/1 : ...closing files
Spatial non-sphericity (over scans) : ...estimating derivatives
??? Out of memory. Type HELP MEMORY for your options.
Error in ==> spm_est_smoothness at 104
[d, dx, dy, dz] = spm_sample_vol(V(i), Ix, Iy, Iz, 1);
...
%----------------
It usually crashes at exactly that point, but I have also had crashes at other
points IN spm_est_smoothness. It really seems to be a memory problem: I have
2 GB installed, and the largest memory block available to MATLAB is usually
around 1.1 GB. I tried all the memory tricks, none worked.
Even when I use only 4-5 input images I get that out-of-memory error, because
MATLAB climbs to more than a gigabyte of memory right after entering
spm_est_smoothness. This does NOT happen with the original images (it stays
at around 500 MB).
OK: I know there might be a problem with my image normalization, maybe some
parameter is set wrongly in the spm_imcalc routine (I also tried it with
spm_write_vol), but spm_est_smoothness does NOT use the input images, it uses
the residual images, right? So how can it depend on possibly faulty input
images when the parameter estimation runs through nicely?
The residual images are 67 MB each (in both situations), the input images
(original and 'A') are 16 MB each, and the beta images are the same size in
both cases at 33 MB.
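For comparison, the size a file should occupy follows from its header; a
quick sketch, assuming SPM2-style vol structs where dim(4) holds the datatype
code (the filename is only an example):
%-------
V = spm_vol('Aswn_subject_MTratio.img');
bytes = prod(V.dim(1:3)) * spm_type(V.dim(4), 'bits') / 8;
fprintf('%s: datatype %d, %.1f MB\n', V.fname, V.dim(4), bytes/2^20);
%----------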
That leaves me with the bad feeling that there is either some awfully
complicated problem, or something so stupid that I overlooked it... :-)
Can anybody give me some hints? Might this have something to do with data
types or scaling factors?
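For instance, something like this minimal check should show whether the two
image sets differ there (again assuming SPM2-style vol structs, with the
scalefactor and offset in V.pinfo(1:2)):
%-------
V0 = spm_vol('swn_subject_MTratio.img');    % original
VA = spm_vol('Aswn_subject_MTratio.img');   % amplitude-normalised
disp([V0.dim(4)     VA.dim(4)]);            % datatype codes
disp([V0.pinfo(1:2) VA.pinfo(1:2)]);        % scalefactor / offset
%----------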
thanks in advance,
hartmut