Hello again,
Yes - I jumped on my chair :-) This is huge!
I'm wondering if this is a sensible analysis.
Presumably you are not going to test for the main effect of the factor?
(i.e. look for any differences among the 256 levels. This would require an
F-test with 255 rows.) And if you're not, there's little point fitting
this model.
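For intuition, the size of that F-test can be sketched outside SPM (plain Python with NumPy, an illustration rather than SPM's actual contrast machinery): the main effect of a factor with L levels needs L-1 independent contrast rows, i.e. 255 rows for 256 levels.

```python
# Illustrative sketch (not SPM code): the main-effect F-contrast for a
# factor with L levels has L-1 rows, each comparing consecutive levels.
import numpy as np

def main_effect_contrast(L):
    """(L-1) x L contrast matrix: row k tests level k vs level k+1."""
    C = np.zeros((L - 1, L))
    for k in range(L - 1):
        C[k, k] = 1.0
        C[k, k + 1] = -1.0
    return C

C = main_effect_contrast(256)
print(C.shape)                   # (255, 256)
print(np.linalg.matrix_rank(C))  # 255 - the F-test has 255 rows
```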
What are the contrasts you are interested in? (i.e. what was your
experiment designed to test?)
If, for example, you want to see whether there are any differences between
levels 1-32, 33-64, 65-96, ..., 225-256 - i.e. 8 groups of 32 levels - then
you only need to set up a design with 8 levels.
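As a small illustration of that grouping (assuming equal, consecutive blocks of 32 levels, which is an assumption of this sketch rather than something the thread specifies):

```python
# Hypothetical sketch: collapse 256 consecutive levels into 8 equal
# groups of 32, so the second-level design only needs 8 levels.
def level_to_group(level, levels_per_group=32):
    """Map a 1-based level (1..256) to its 1-based group (1..8)."""
    return (level - 1) // levels_per_group + 1

print(level_to_group(1))    # 1  (levels 1-32    -> group 1)
print(level_to_group(33))   # 2  (levels 33-64   -> group 2)
print(level_to_group(256))  # 8  (levels 225-256 -> group 8)
```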
Best,
Will.
Bruno L. Giordano wrote:
> Hello Will,
>
> thanks for the reply.
>
> The monster model I am trying to estimate has (perhaps people will jump
> on the chair) one factor with 256 levels. It is meant to increase the
> power of some non-standard analyses I designed.
>
> So, reducing the number of levels is not an option right now. Assuming
> independence might be: thanks for the tip, I will use it as very last
> resort.
>
> I put some debugging instructions in spm_get_vc to get feedback on where
> the computations are: the model runs fine, it just takes a lot of time.
>
> I guess the best way to see how long it takes is to wait for the
> analysis to end.
>
> All the best,
>
> Bruno
>
> ~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
> Bruno L. Giordano, PhD
> Postdoctoral Research Fellow
> CIRMMT - Schulich School of Music
> 555 Sherbrooke Street West
> Montréal, QC H3A1E3
> Canada
> +1 514 398 4535, Ext. 00900 (voice)
> +1 514 398 2962 (fax)
> http://www.music.mcgill.ca/~bruno
>
> Will Penny wrote:
>> Dear Bruno,
>>
>> Perhaps it would be possible to work with a different 2nd level design.
>>
>> If you have e.g. one factor with L levels, and tell SPM to allow for
>> correlated errors, then SPM has L(L+1)/2 different variance components
>> to estimate (it has to set up an individual covariance basis function
>> for each, which is what spm_get_vc is doing).
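The L(L+1)/2 count above can be sanity-checked with a small sketch (plain Python, not SPM's actual spm_get_vc implementation): one L x L basis matrix per variance term (i = j) plus one per covariance term (i < j).

```python
# Back-of-the-envelope sketch (not SPM code) of why the number of
# covariance basis matrices explodes with the number of levels L.
import numpy as np

def count_variance_components(L):
    """Variance/covariance components for L correlated levels: L(L+1)/2."""
    return L * (L + 1) // 2

def make_basis(L, i, j):
    """One covariance basis: 1s where level i co-varies with level j."""
    Q = np.zeros((L, L))
    Q[i, j] = Q[j, i] = 1.0
    return Q

print(count_variance_components(8))    # 36 components for 8 levels
print(count_variance_components(256))  # 32896 components for 256 levels
```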
>>
>> This part of SPM has not been optimised for large models as we did not
>> envisage it being widely used.
>>
>> Your quick workaround options are:
>>
>> 1. Do lots of simple two-level models by taking differential contrasts
>> at the first level and one-sample t-tests at the second.
>>
>> Or
>>
>> 2. Tell SPM that there is no correlation between levels - i.e.
>> independence.
>>
>> Out of interest - how many levels are there in your factor(s) ?
>>
>> There may be a third option - reduce the number of levels and still
>> allow for dependence.
>>
>> Best,
>>
>> Will.
>>
>> Bruno L. Giordano wrote:
>>> Hello,
>>>
>>> I am trying to estimate a rather large (and perhaps crazy) 2nd level
>>> model, with around 5000 contrast images.
>>>
>>> I should have managed to free up enough RAM on my XP 32-bit system
>>> (RAM defrag + the 3 GB Windows boot switch + startup in -nojvm mode):
>>> I do not get the memory error when the factorial design specification
>>> starts.
>>>
>>> However, the computation appears to hang: after 10 hours of full-load
>>> CPU the Matlab process appears to have read or written nothing more
>>> than what it did at the beginning of the factorial design
>>> specification phase. Is it possible to estimate roughly how much time
>>> this phase would take, i.e., at which point should I start suspecting
>>> a problem with the computations, if any? I am finding, for instance,
>>> that the sort rows/columns loop in spm_get_vc alone takes around 190
>>> minutes.
>>>
>>> Thanks,
>>>
>>> Bruno
>>>
>>>
>>>
>>
>
>
--
William D. Penny
Wellcome Trust Centre for Neuroimaging
University College London
12 Queen Square
London WC1N 3BG
Tel: 020 7833 7475
FAX: 020 7813 1420
Email: [log in to unmask]
URL: http://www.fil.ion.ucl.ac.uk/~wpenny/