Matlab on 32-bit machines can only use about 2 GB of memory. To use more
memory you will need a 64-bit version of Matlab running on a 64-bit
machine.
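As a rough back-of-envelope check of why the 32-bit limit bites here (the matrix size below is an assumption for a normalised 3 mm bounding box; substitute your own image dimensions):

```matlab
% Rough estimate of the raw data size in a 30-subject FFX design.
% 53 x 63 x 46 is an assumed matrix size for normalised 3 mm volumes.
voxels_per_vol  = 53 * 63 * 46;      % ~153k voxels
bytes_per_voxel = 4;                 % single-precision storage
vols_per_subject = 444;
subjects = 30;
per_subject_gb = voxels_per_vol * bytes_per_voxel * vols_per_subject / 2^30;
total_gb = per_subject_gb * subjects;
fprintf('per subject: %.2f GB, all 30 subjects: %.2f GB\n', ...
        per_subject_gb, total_gb);  % roughly 0.25 GB and 7.6 GB
```

Even before SPM builds its design and covariance matrices, the data alone approach the 2 GB address space of a 32-bit process.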
On Tue, Sep 9, 2008 at 1:58 AM, Christophe Phillips
<[log in to unmask]> wrote:
> Hi Arshad,
>
> with 30 subjects (as stated in your original message), the easiest option
> would be to treat them individually (FFX) and then pass contrast images to a
> 2nd-level analysis (RFX). This would prevent any memory problems...
> Or do you have to stick with a massive 30-subject FFX analysis? That is
> bound to create memory problems, as the SPM implementation needs to handle
> (really) big covariance matrices and you then reach the limits of your
> 32-bit hardware...
>
> HTH,
> Chris
>
> Darren Gitelman wrote:
>
> Arshad
>
> On Mon, Sep 8, 2008 at 1:50 PM, Dr A. Zaman <[log in to unmask]> wrote:
>>
>> Hi Darren,
>>
>> Many thanks for the quick response.
>>
>> I am at the model estimating stage using spm2,
>> even with 15 subjects (-albeit 444vols/subject),
>
>
> Well, Windows does not win the prize for memory management.
>>
>> I get the notorious memory error. Would a 64bit/8GB work for sure?
>
>
> Possibly. I didn't ask how big your images are, but no matter. If you have
> access to a Linux machine you might try the analysis there. That would give
> you a better idea of the likelihood of it working.
>
> You can check the maxmem setting in spm_defaults. If it is more than 2^20,
> lowering the value would make SPM load less data at a time, and that might
> help. Using less than 2^20 is usually not helpful and will increase the run
> time enormously. (I've never found this makes much of a difference, but you
> can try it. Memory errors usually occur in relation to manipulating the
> large covariance matrices.) You can also try turning off Java by starting
> Matlab with the -nojvm option (i.e., matlab -nojvm from a shortcut or
> shell). Finally, check what other processes you have running and close all
> non-essential programs.
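For reference, a sketch of the setting mentioned above as it appears in spm_defaults.m (the field name below follows SPM2's defaults structure; check your own copy, as it may differ between SPM versions):

```matlab
% In spm_defaults.m -- maxmem caps how many bytes of image data SPM
% loads per pass during estimation. 2^20 is the usual default; if your
% copy is set higher, lowering it back toward 2^20 loads less data at
% a time, which may avoid out-of-memory errors (at the cost of speed).
defaults.stats.maxmem = 2^20;
```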
>
>>
>> Also, if I
>> re-run the model estimation in SPM5, is it going
>> to be any easier (assuming I pre-process my data in SPM5)?
>
>
> No, it won't make a difference regarding memory issues.
>
> Darren
>
>>
>> best wishes,
>>
>> Arshad
>>
>>
>> On Mon, 8 Sep 2008, Darren Gitelman wrote:
>>
>>> Hi Arshad
>>>
>>> I don't think you will be able to do this with less than a 64-bit machine
>>> and 8 GB of RAM if you want to analyze them all together (i.e., a big
>>> fixed-effects analysis). However, with 30 subjects I would think you would
>>> have sufficient numbers to perform a random-effects analysis, in which
>>> case each subject could be analyzed separately, and that should certainly
>>> work.
>>>
>>> regards,
>>> darren
>>>
>>> On Mon, Sep 8, 2008 at 11:43 AM, Dr A. Zaman <[log in to unmask]>
>>> wrote:
>>>
>>>> Dear SPM experts,
>>>>
>>>> I was wondering if anyone can suggest approximately how much
>>>> RAM is needed in order to model fMRI data consisting
>>>> of 30 subjects, 8 conditions, and 444 (3 mm isotropic) volumes?
>>>> I have used a PC with 4 GB of RAM with no joy; I could not even run
>>>> half the number of subjects. Prior to getting hold of an 8 GB
>>>> machine, can I be sure that it will work? Any comments will be greatly
>>>> appreciated.
>>>>
>>>> Many thanks,
>>>>
>>>> Arshad
>>>>
>>>>
>>>> *********************************************************
>>>>
>>>> Dr. A Zaman,
>>>> Clinical Sciences Centre,
>>>> University Hospital Aintree,
>>>> Lower Lane,
>>>> Liverpool, UK.
>>>> L9 7AL.
>>>>
>>>> *********************************************************
>>>>
>>>
>>>
>>>
>>> --
>>> -----
>>> Darren Gitelman
>>>
>
>
>
> --
> -----
> Darren Gitelman
>
>