It sounds like the problem may be related to memory mapping. In
versions before SPM99, there were problems because all the files
were mapped at once. When a file is mapped, it is added to the
Matlab process's virtual address space; when it is unmapped, that
address space is freed again. Most current systems impose a 2 or
4 Gbyte limit on the virtual memory any one process can use,
because that is the largest value a 32-bit address can represent.
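To make the arithmetic concrete, here is a rough sketch (the image
count and size below are made up for illustration, not taken from
any particular analysis):

```matlab
% Each mapped file consumes virtual address space even when little
% physical RAM is actually touched.
n_images = 500;                    % hypothetical number of images mapped at once
image_mb = 4;                      % hypothetical size of each image in Mbyte
limit_mb = 2 * 1024;               % typical 2 Gbyte per-process limit
used_mb  = n_images * image_mb;    % address space consumed by the mappings
fprintf('%d of %d Mbyte of address space used by mappings\n', used_mb, limit_mb);
```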
I thought I'd fixed this problem by only mapping images when they
were needed. It may be the case that Linux works differently to
SunOS, and doesn't properly free up its memory-mapped files.
Feedback from Linux users would be helpful at this point.
Alternatively (and I hope that this is the case), you may have
crashed out of some process that had lots of images mapped at
the same time (e.g., creating a mean of some kind). If this
happens, the mex files aren't sophisticated enough to unmap any
images that may have been mapped. Try quitting and restarting
Matlab before you run the analysis to see if this helps.
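If a full restart is inconvenient, clearing the mex files from
memory should also release any images they left mapped (a
suggestion rather than something I have verified on Linux):

```matlab
clear mex    % unloads mex files, releasing any images they left mapped
```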
All the best,
-John
| I have recently migrated to Linux and I'm having difficulty
| running large analyses (~2 gigs of images). I originally ran all of my studies
| on Solaris Sun stations, and now I run the exact same analyses on Linux and it
| runs out of memory while initializing the design space. When I check how much
| memory has been used, I find that only 300 megs of swap have been touched and
| there is well over 1.5 gigs of space left. The machine has a Pentium III
| 750 MHz chip, 1 gig of RAM and 2 gigs of swap space, and the Linux kernel is
| version 2.2.14 with Matlab version 5.3. I have done all of the usual tricks,
| like unlimit. I even got Matlab to use most of my memory and swap with the
| command
|
| a = rand(1e8,2);
|
| which creates a very large matrix of random numbers (2e8 doubles, roughly
| 1.6 gigs).
|
| So Matlab can access the memory. Matlab is known to spit out the "out of
| memory" error for reasons not exactly related to the amount of memory
| available. I can't seem to figure out what is causing this, since the same
| analysis ran on a Sun machine with less RAM just fine. When I only run one
| subject I have no problems. It's just the large studies. Here is the error
| message:
|
|
| SPM99: spm_spm (v2.37) 17:25:20 - 27/09/2000
| ========================================================================
| Initialising design space : ...computing
| ??? Error while evaluating uicontrol Callback.
|
|
| >> ??? Error using ==> *
| Out of memory. Type HELP MEMORY for your options.
|
| Error in ==> /pkg/spm99/spm_filter.m
| On line 140 ==> y = y - K{s}.KH*(K{s}.KH'*y);
|
| Error in ==> /pkg/spm99/spm_spm.m
| On line 439 ==> KVi = spm_filter('apply',xX.K, xX.xVi.Vi);
|
|
|
| Has anyone had this problem on linux before?