Dear SPMers,
I have recently migrated to Linux and I'm having difficulty
running large analyses (~2 GB of images). I originally ran all of my studies
on Solaris Sun workstations; now, running the exact same analyses on Linux,
SPM runs out of memory while initializing the design space. When I check how
much memory has been used, I find that only 300 MB of swap have been touched
and there is well over 1.5 GB of space left. The machine has a Pentium III
750 MHz chip, 1 GB of RAM, and 2 GB of swap space; the Linux kernel is version
2.2.14 and MATLAB is version 5.3. I have done all of the usual tricks like
unlimit. I even got MATLAB to use most of my memory and swap with the command
a = rand(1e8,2);
which creates a 1e8-by-2 matrix of random doubles, roughly 1.6 GB.
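A related probe, sketched below purely for illustration (the step sizes are
made up, not something from my original test), is to see how large a single
contiguous block MATLAB will actually grant, since one oversized contiguous
allocation is usually what triggers the error:

% Illustrative probe (sizes are hypothetical): find the largest single
% contiguous double array MATLAB will allocate.
mb = 0;
for step = 100:100:2000                  % try 100 MB, 200 MB, ... 2000 MB
    try
        a = zeros(step*1024*1024/8, 1);  % one contiguous block of 'step' MB
        mb = step;
        clear a
    catch
        break                            % stop at the first failed allocation
    end
end
fprintf('Largest contiguous block obtained: %d MB\n', mb);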
So MATLAB can clearly access the memory. MATLAB is known to report "out of
memory" for reasons that are not strictly about the total amount of memory
available, but I can't figure out what is causing this here, since the same
analysis ran just fine on a Sun machine with less RAM. When I run only one
subject I have no problems; it's just the large studies. Here is the error
message:
SPM99: spm_spm (v2.37) 17:25:20 - 27/09/2000
========================================================================
Initialising design space : ...computing
??? Error while evaluating uicontrol Callback.
>> ??? Error using ==> *
Out of memory. Type HELP MEMORY for your options.
Error in ==> /pkg/spm99/spm_filter.m
On line 140 ==> y = y - K{s}.KH*(K{s}.KH'*y);
Error in ==> /pkg/spm99/spm_spm.m
On line 439 ==> KVi = spm_filter('apply',xX.K, xX.xVi.Vi);
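For what it's worth, the failing line applies the filter matrix to the data
twice, so MATLAB has to hold the input plus two full-size temporaries at once
(and in this particular call the "y" being filtered appears to be xX.xVi.Vi,
so roughly nScan-by-nScan). A rough sketch of the scaling, with completely
made-up sizes:

% Rough illustration of the temporaries behind
%   y = y - K{s}.KH*(K{s}.KH'*y);
% nScan, nCol and nK below are hypothetical, not my actual design sizes.
nScan = 1000;                 % rows of y (scans in session s)
nCol  = 1000;                 % columns of y in this call
nK    = 50;                   % columns of the filter matrix KH
KH    = rand(nScan, nK);      % stand-in for K{s}.KH
y     = rand(nScan, nCol);    % stand-in for the matrix being filtered
t1    = KH' * y;              % first temporary:  nK-by-nCol
t2    = KH  * t1;             % second temporary: nScan-by-nCol
y     = y - t2;               % the subtraction makes yet another nScan-by-nCol
% Peak demand is therefore about three nScan-by-nCol double arrays at once,
% and each of them has to be a single contiguous allocation.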
Has anyone had this problem on Linux before?
Christian DeVita
Neurology
University of Pennsylvania