On 1/25/11 12:30 PM, Michael Hanke wrote:
> On Tue, Jan 25, 2011 at 10:58:22AM -0800, Chuck Theobald wrote:
>
>> Another error claims an out-of-memory condition in the Stats section:
>>
>> Spatially smoothing auto corr estimates
>> Error: FILM did not complete - it probably ran out of memory
>>
>> Running on my single non-virtualized machine, both first- and
>> second-level analysis completed without problem. Should I be less
>> ambitious in configuring the VMs? Are virtualized nodes something
>> that one just should not try?
>>
> Did you test whether it is really a memory issue? What if you run just
> one VM with 20GB assigned to it -- does it still fail? Or alternatively,
> what is your 'single' machine equipped with? Does it also use the DEB
> package?
>
> Using VMs as compute nodes is a common use case. For example, Condor [0]
> can use VM nodes running on machines with other OS or in the cloud to do
> grid-computing.
>
>
> [0] http://www.cs.wisc.edu/condor/
>
>
> Michael
>
>
My rule of thumb is to allocate 3-4 GB of RAM per core. The version of
VirtualBox I have permits a maximum of 2 CPUs per VM, hence the 8 GB
allocation. Running 3 VMs left nothing for the base O/S, so something
had to be shortchanged; I think my VMs were contending with the base O/S
for memory.
I did run some analyses using just two VMs per hardware node and did not
run into memory issues. All analyses finished and produced readable
results. I still had the 'number missing' condition, though, so I don't
know if I can trust these results. Here is part of the Post-stats output:
++ WARNING: nifti_read_buffer(reg/standard.nii.gz):
data bytes needed = 14442064
data bytes input = 9176201
number missing = 5265863 (set to 0)
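The "missing" bytes are exactly the shortfall (9176201 + 5265863 =
14442064), which looks like a short read. A quick way to rule out a
simply truncated file on disk is a sketch like the following ('gzip -t'
validates the entire compressed stream; the path is the one from the
warning, so adjust for your FEAT directory):

```shell
# Check whether the flagged image is truncated or corrupt on disk.
# gzip -t decompresses the whole stream without writing output, so it
# catches an interrupted copy that a quick 'ls' would not.
for f in reg/standard.nii.gz; do
    if [ ! -e "$f" ]; then
        echo "$f: not found"
    elif gzip -t "$f" 2>/dev/null; then
        echo "$f: gzip stream intact"
    else
        echo "$f: truncated or corrupt"
    fi
done
```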
The single machine that works quite nicely has an installation identical
to the virtual machines'; it just has 8 threads and 24 GB of RAM.
Thank you,
--
Chuck Theobald
System Administrator
The Robert and Beverly Lewis Center for Neuroimaging
University of Oregon
P: 541-346-0343
F: 541-346-0345