Ad 2 - at least on my machine, MELODIC uses swap space (and wouldn't
complete without it, or with too little of it, for some analyses). My system:
Ubuntu 10.04 on a Core2Duo Q6600 w/ 8GB of RAM and a generous amount of swap
(32GB); I am running FSL 4.1.8 obtained from the Neurodebian
repositories.
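
For what it's worth, here is a rough back-of-envelope sketch of the raw
data size for the temporal-concatenation case; the `copies` multiplier
for working copies during PCA is my own assumed fudge factor, not
anything documented for MELODIC:

```python
# Rough memory estimate for temporal-concatenation ICA.
# NOTE: the `copies` multiplier is an assumption (working copies made
# during PCA), not a documented MELODIC figure.

def estimate_memory_gib(voxels_per_volume, volumes_per_session, sessions,
                        bytes_per_voxel=4, copies=1):
    """GiB needed to hold the concatenated single-precision data
    matrix `copies` times over."""
    total_voxels = voxels_per_volume * volumes_per_session * sessions
    return total_voxels * bytes_per_voxel * copies / 2**30

# Benjamin's dataset below: 109350 voxels * 400 volumes * 76 sessions
print(round(estimate_memory_gib(109350, 400, 76), 1))  # 12.4
```

With copies=1 this reproduces the 12.4 GB he reports for the raw data;
his observation that ~11 GB of RAM sufficed suggests the true multiplier
is close to 1 for this dataset, but I wouldn't rely on that in general.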
Cheers,
Cornelius
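
P.S. On the bad_alloc-despite-free-swap question further down: a
per-process virtual memory limit can produce exactly that symptom,
independently of MELODIC. Two standard Linux checks (nothing
FSL-specific):

```shell
# Physical RAM and swap the kernel can actually hand out
free -g

# Per-process virtual memory cap; anything other than "unlimited"
# can make large allocations fail with std::bad_alloc even when
# plenty of swap is free
ulimit -v
```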
On Fri, Sep 2, 2011 at 1:49 PM, Benjamin Kay <[log in to unmask]> wrote:
> On Friday, September 02, 2011 02:32:18 you wrote:
>> On 1 Sep 2011, at 14:35, Benjamin Kay wrote:
>> > Is there a set formula for estimating the memory requirements for MELODIC
>> > on a normal (64-bit Linux) system? The FAQ
>> > http://www.fmrib.ox.ac.uk/fslfaq/ suggests 1 GB of RAM, more if using
>> > MELODIC with "a large image matrix or large number of timepoints or
>> > subjects". But how much more? There have been a number of posts in the
>> > past about how so-and-so's dataset failed with 4 GB of RAM or
>> > so-and-so's dataset worked with 8 GB of RAM, but that doesn't tell me if
>> > processing *my* dataset with MELODIC is feasible.
>> >
>> > For instance, suppose I am using the temporal concatenation approach with
>> > a total of x voxels (x = voxels per volume * volumes per session * # of
>> > sessions). Is the expected virtual memory requirement x? Is it 2*x? Is
>> > it x^2? For large datasets, is there a way to trade speed or disk space
>> > for lower memory consumption?
>> >
>> > FYI, my specific case is x = 109350 voxels per volume * 400 volumes per
>> > session * 76 sessions = 3.3 billion voxels. Stored as single precision
>> > floating point data, that's 12.4 GB uncompressed. I am running:
>> >
>> > melodic --approach=concat --in=files --outdir=result --nomask --nobet
>> > --report --bgimage=MNI152_T1_4mm_brain.nii --tr=3 --Sdes=music.mat
>> > --Scon=music.con --mmthresh=0.5 --Oall --verbose
>> >
>> > And I get an std::bad_alloc during the PCA step (Data size : 5928 x
>> > 53516).
>>
>> Sorry, I don't think we have a simple formula - it's particularly difficult
>> to estimate for a data-driven method... my very rough guess would be that
>> for your data you might need between 5 and 20 GB of RAM/swap.
>>
>> Cheers.
>>
>> ---------------------------------------------------------------------------
>> Stephen M. Smith, Professor of Biomedical Engineering
>> Associate Director, Oxford University FMRIB Centre
>>
>> FMRIB, JR Hospital, Headington, Oxford OX3 9DU, UK
>> +44 (0) 1865 222726 (fax 222717)
>> [log in to unmask] http://www.fmrib.ox.ac.uk/~steve
>> ---------------------------------------------------------------------------
>
> Thank you for taking a stab at this, Stephen. Your estimate is actually very
> helpful. I have access to machines with ~20 GB of RAM. If you had told me I
> might need 100 GB of RAM, I would have switched to a different ICA suite.
>
> For the 12.4 GB dataset I mentioned, it turned out 11 GB of RAM was necessary.
> Curiously, melodic refused to touch swap. That is, melodic threw
> std::bad_alloc on a computer with more than 11 GB of virtual memory, and top
> showed it did not even try to use swap. Is this a bug?
>
> As for a formula, I realize that guessing exactly how much memory a
> data-driven process needs is nigh impossible. Would you at least know whether
> the gross memory requirement is linear? Logarithmic? If my dataset had been
> 300 sessions (50 GB on disk), for example, would I need to get my hands on
> 50 GB of RAM? I'm wildly guessing it boils down to a question of how much
> memory the PCA algorithm needs...
>
--
Dr. med. Cornelius J. Werner
Department of Neurology
RWTH Aachen University
Pauwelsstr. 30
52074 Aachen
Germany