Thanks to all those who replied to my recent posting.
Since none of those replies was posted to the list as a whole, however, I'll
reply to my own question here, for the benefit of anyone else who had the
same problem:
For 1 GB RAM, changing maxMem in spm_spm.m to 2^29 (thanks to Kent Kiehl for
this suggestion) dramatically sped up the analysis: our statistical analysis
of 36 sessions x 128 volumes/session, which would have taken 40+ hours to
complete at 2^20, finished in ~3 hours after setting maxMem to 2^29.
2^30 would have exceeded our 1 GB of RAM, but there appeared to be no
problems with 2^29.
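
For anyone wanting to make the same change, the relevant line in spm_spm.m
looks roughly like the sketch below (the exact wording and position vary
between SPM99 releases, so search the file for maxMem rather than relying
on a line number):

    %-Memory threshold: max. bytes of image data held in memory at once
    %maxMem = 2^20;     %-the value we started with (~1 MB)
    maxMem  = 2^29;     %-~512 MB; worked well on our 1 GB RAM machine

The right value obviously depends on your RAM: 2^30 (1 GB) would have left
nothing for the OS and Matlab itself on our machine.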
Out of curiosity: is there a reason why maxMem is defined as a power of two?
Being able to set it to an arbitrary integer would seem to allow finer
control over the amount of memory used.
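
(Since Matlab evaluates 2^29 to an ordinary number anyway, I'd assume an
arbitrary value would work just as well, e.g.:

    maxMem = 7e8;   %-~700 MB, between 2^29 and 2^30

but I haven't tested whether anything in spm_spm.m actually requires a
power of two.)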
Joe
------------------------------
Joseph B. Hopfinger, Ph.D.
Department of Psychology
University of California, Davis
Davis, CA 95616
------------------------------
Joseph Hopfinger wrote:
> Dear list,
>
> could anyone give us advice about windows NT memory?
>
> We're running spm99 on a winNT workstation with 1 GB RAM and 2GB swap.
> The analysis of a 6-subject dataset is proceeding somewhat slowly
> (projected finish time of well over 36 hours), and we were hoping that
> there might be some memory settings that we could change to improve our
> performance (the memory usage by Matlab, as measured by the NT Task
> Manager, appears never to exceed ~68,000 K). We've never modified any
> memory settings on the machine (aside from increasing the swap to 2 GB),
> so any advice on general NT system enhancements would be helpful.
>
> A more specific question: given our amount of RAM, could anyone advise
> us on an appropriate setting for the maxMem variable in spm_spm.m?
>
> Thanks very much,
>
> Joe
>
> ------------------------------
> Joseph B. Hopfinger, Ph.D.
> Department of Psychology
> University of California, Davis
> Davis, CA 95616
> ------------------------------