Hello Will,
thanks for the reply.
The monster model I am trying to estimate has (perhaps people will jump
out of their chairs) one factor with 256 levels. It is meant to increase
the power of some non-standard analyses I designed.
So, reducing the number of levels is not an option right now. Assuming
independence might be an option: thanks for the tip, I will use it as a
very last resort.
I added some debugging statements to spm_get_vc to get feedback on where
the computations are: the model runs fine, it just takes a very long time.
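For what it is worth, Will's formula makes the scale of the problem
concrete, and the feedback I added is nothing fancy - roughly the
following (a sketch only; the loop and variable names are illustrative,
not the actual spm_get_vc internals):

    % number of covariance components for one factor with correlated
    % errors, using the L*(L+1)/2 formula from Will's reply
    L    = 256;         % levels in my factor
    n_vc = L*(L+1)/2;   % = 32896 basis matrices for spm_get_vc to set up

    % progress printout of the kind I dropped into the component-building
    % loop (the loop body stands in for the real spm_get_vc code)
    for i = 1:n_vc
        if mod(i, 1000) == 0
            fprintf('spm_get_vc: component %d of %d\n', i, n_vc);
        end
        % ... actual component construction happens here ...
    end

At almost 33,000 components, even a cheap per-component operation adds
up, which I suspect is where most of the time is going.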
I guess the best way to see how long it takes is to wait for the
analysis to end.
All the best,
Bruno
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
Bruno L. Giordano, PhD
Postdoctoral Research Fellow
CIRMMT - Schulich School of Music
555 Sherbrooke Street West
Montréal, QC H3A1E3
Canada
+1 514 398 4535, Ext. 00900 (voice)
+1 514 398 2962 (fax)
http://www.music.mcgill.ca/~bruno
Will Penny wrote:
> Dear Bruno,
>
> Perhaps it would be possible to work with a different 2nd level design.
>
> If you have e.g. one factor with L levels, and tell SPM to allow for
> correlated errors, then SPM has L(L+1)/2 different variance components
> to estimate (it has to set up an individual covariance basis function
> for each, which is what spm_get_vc is doing).
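>
> For intuition, each basis function is essentially an indicator matrix
> that picks out one variance or covariance term. A rough sketch of the
> idea (illustrative only - the real spm_get_vc components are defined
> over all scans, not just the L levels):
>
>     L  = 5;                          % small example
>     Vc = {};
>     for i = 1:L
>         for j = i:L
>             C      = sparse(L, L);
>             C(i,j) = 1;  C(j,i) = 1; % one basis per (co)variance term
>             Vc{end+1} = C;
>         end
>     end
>     numel(Vc)                        % = L*(L+1)/2 = 15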
>
> This part of SPM has not been optimised for large models as we did not
> envisage it being widely used.
>
> Your quick workaround options are:
>
> 1. Do lots of simple two-level models by taking differential contrasts
> at the first level and one-sample t-tests at the second.
>
> Or
>
> 2. Tell SPM that there is no correlation between levels - i.e.
> independence.
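>
> In the design specification this is just the 'Independence' option for
> the factor. For reference, a sketch of how it looks in batch syntax
> (SPM8-style field names, which may differ in your SPM version; the
> factor name and values are placeholders):
>
>     fact.name     = 'myfactor';  % hypothetical factor name
>     fact.levels   = 8;           % replace with your number of levels
>     fact.dept     = 0;           % 0 = independence between levels
>     fact.variance = 1;           % unequal variances can still be modelled
>     matlabbatch{1}.spm.stats.factorial_design.des.fd.fact(1) = fact;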
>
> Out of interest - how many levels are there in your factor(s)?
>
> There may be a third option - reduce the number of levels and still
> allow for dependence.
>
> Best,
>
> Will.
>
> Bruno L. Giordano wrote:
>> Hello,
>>
>> I am trying to estimate a rather large (and perhaps crazy) 2nd level
>> model, with around 5000 contrast images.
>>
>> I should have managed to free up enough RAM on my 32-bit XP system (RAM
>> defrag + the 3 GB Windows boot switch + starting MATLAB in -nojvm mode):
>> I do not get the memory error when the factorial design specification
>> starts.
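>>
>> (As a sanity check on those steps, MATLAB's Windows-only built-in
>>
>>     memory     % reports how much memory is available for arrays
>>
>> is a quick way to confirm how much address space was actually freed,
>> if your MATLAB version has it.)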
>>
>> However, the computation appears to hang: after 10 hours of full-load
>> CPU, the MATLAB process does not appear to have read or written anything
>> beyond what it did at the beginning of the factorial design
>> specification phase. Is it possible to estimate roughly how much time
>> this phase would take, i.e., at which point should I start suspecting a
>> problem with the computations, if any? I am finding, for instance, that
>> the sort rows/columns loop in spm_get_vc alone takes around 190
>> minutes.
>>
>> Thanks,
>>
>> Bruno
>>
>> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
>> Bruno L. Giordano, PhD
>> Postdoctoral Research Fellow
>> CIRMMT - Schulich School of Music
>> 555 Sherbrooke Street West
>> Montréal, QC H3A1E3
>> Canada
>> +1 514 398 4535, Ext. 00900 (voice)
>> +1 514 398 2962 (fax)
>> http://www.music.mcgill.ca/~bruno
>>
>>
>