Hello Will,
I am wondering whether the spm_get_vc instructions I have commented out at
the end of this email are strictly necessary for the statistics, or whether
they are just a trick meant to speed up the computations (they are the
heaviest instructions in spm_get_vc, especially the cVi1 loop).
These lines are dedicated to "sort out rows/columns & remove all-zero
variance components" (and remove duplicates).
Interestingly, across several analyses of the same dataset of mine these
instructions keep all of the elements of SPM.xVi.Vi, i.e., they find
neither all-zero nor duplicate elements.
If they are indeed necessary, I am really wondering how they could be
optimized.
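For scale, a factor with L correlated levels implies L(L+1)/2 covariance
components (as Will explains below); with L = 256 that is 32896 basis
matrices. The following is only an illustrative sketch of that enumeration
in Python/NumPy, not SPM's actual MATLAB code, and it assumes for
simplicity one observation per level:

```python
import numpy as np

def covariance_components(L):
    """Enumerate the L*(L+1)/2 covariance basis matrices for one
    factor with L correlated levels (one observation per level).

    Component (i, i) models the variance of level i; component
    (i, j) with i < j models the covariance between levels i and j.
    Illustrative sketch only - not SPM's spm_get_vc."""
    comps = []
    for i in range(L):
        for j in range(i, L):
            C = np.zeros((L, L))
            C[i, j] = 1.0
            C[j, i] = 1.0   # keep each basis matrix symmetric
            comps.append(C)
    return comps

print(len(covariance_components(4)))  # 4*5/2 -> 10
print(256 * 257 // 2)                 # -> 32896 components for L = 256
```

Even storing 32896 sparse basis matrices is cheap; it is the pairwise
comparisons over them that dominate the run time.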
Thank you for any feedback,
Bruno
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
is the %commented code strictly necessary?
[unused, ind] = ismember(Iin, Igen, 'rows');
az = false(size(Vi));
for cVi = 1:numel(Vi)
    Vi{cVi} = Vi{cVi}(ind, ind);
    %az(cVi) = full(all(Vi{cVi}(:) == 0));
end;
%Vi = Vi(~az);
%dupl = false(size(Vi));
%for cVi = 1:numel(Vi)
%    if ~dupl(cVi)
%        for cVi1 = (cVi+1):numel(Vi)
%            dupl(cVi1) = dupl(cVi1) || full(all(Vi{cVi}(:) == Vi{cVi1}(:)));
%        end
%    end;
%end;
%Vi = Vi(~dupl);
~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~~
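If the duplicate removal does turn out to be necessary, one possibility is
to replace the quadratic cVi1 loop with a single hashing pass over the
components. The sketch below is in Python/NumPy for illustration only (not
SPM code); a MATLAB analogue could, for example, key a containers.Map on a
serialization of each matrix:

```python
import numpy as np

def prune_components(Vi):
    """Remove all-zero and exact-duplicate variance components.

    Replaces the O(n^2) pairwise comparisons of the cVi1 loop with
    one pass that hashes each matrix's raw bytes. Illustrative
    sketch only - not SPM's spm_get_vc."""
    seen = set()
    kept = []
    for V in Vi:
        if not V.any():          # drop all-zero components
            continue
        key = V.tobytes()        # hash the full matrix contents
        if key in seen:          # drop exact duplicates
            continue
        seen.add(key)
        kept.append(V)
    return kept

# toy example: three components - one all-zero, one duplicate
A = np.eye(3)
Z = np.zeros((3, 3))
kept = prune_components([A, Z, A.copy()])
print(len(kept))  # -> 1
```

Hashing tests the same exact equality as the original loop
(Vi{cVi}(:) == Vi{cVi1}(:)), but needs only one look at each component
instead of roughly n^2/2 full-matrix comparisons.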
Bruno L. Giordano, PhD
Postdoctoral Research Fellow
CIRMMT - Schulich School of Music
555 Sherbrooke Street West
Montréal, QC H3A1E3
Canada
+1 514 398 4535, Ext. 00900 (voice)
+1 514 398 2962 (fax)
http://www.music.mcgill.ca/~bruno
Will Penny wrote:
> Hello again,
>
> Yes - I jumped on my chair :-) This is huge!
>
> I'm wondering if this is a sensible analysis.
>
> Presumably you are not going to test for the main effect of the factor?
> (i.e., look for any differences among the 256 levels - this would require
> an F-test with 255 rows.) And if you're not, there's little point in
> fitting this model.
>
> What are the contrasts you are interested in? (i.e., what was your
> experiment designed to test?)
>
> If, for example, you want to see whether there are any differences between
> levels 1-32, 33-64, 65-96, ..., 225-256 - i.e., 8 groups of levels - then
> you only need to set up a design with 8 levels.
>
> Best,
>
> Will.
>
> Bruno L. Giordano wrote:
>> Hello Will,
>>
>> thanks for the reply.
>>
>> The monster model I am trying to estimate has (perhaps people will
>> jump on the chair) one factor with 256 levels. It is meant to increase
>> the power of some non-standard analyses I designed.
>>
>> So, reducing the number of levels is not an option right now. Assuming
>> independence might be: thanks for the tip, I will use it as very last
>> resort.
>>
>> I put some debugging instructions in spm_get_vc to get feedback on
>> where the computations are: the model runs fine, just takes a looot of
>> time.
>>
>> I guess the best way to see how long it takes is to wait for the
>> analysis to end.
>>
>> All the best,
>>
>> Bruno
>>
>>
>> Will Penny wrote:
>>> Dear Bruno,
>>>
>>> Perhaps it would be possible to work with a different 2nd level design.
>>>
>>> If you have eg one factor with L levels, and tell SPM to allow for
>>> correlated errors, then SPM has L(L+1)/2 different variance
>>> components to estimate (it has to set up an individual covariance
>>> basis function for each, which is what spm_get_vc is doing).
>>>
>>> This part of SPM has not been optimised for large models as we did
>>> not envisage it being widely used.
>>>
>>> Your quick work-around options are:
>>>
>>> 1. Do lots of simple two-level models by taking differential
>>> contrasts at the first level and one-sample t-tests at the second.
>>>
>>> Or
>>>
>>> 2. Tell SPM that there is no correlation between levels - i.e.,
>>> independence.
>>>
>>> Out of interest - how many levels are there in your factor(s) ?
>>>
>>> There may be a third option - reduce the number of levels and still
>>> allow for dependence.
>>>
>>> Best,
>>>
>>> Will.
>>>
>>> Bruno L. Giordano wrote:
>>>> Hello,
>>>>
>>>> I am trying to estimate a rather large (and perhaps crazy) 2nd level
>>>> model, with around 5000 contrast images.
>>>>
>>>> I should have managed to free up enough RAM on my XP 32-bit system
>>>> (RAM defrag + 3 GB Windows boot switch + startup in -nojvm mode): I
>>>> do not get the memory error when the factorial design specification
>>>> starts.
>>>>
>>>> However, the computation appears to hang: after 10 hours of
>>>> full-load CPU, the Matlab process appears to have read or written
>>>> nothing more than what it did at the beginning of the factorial
>>>> design specification phase. Is it possible to estimate roughly how
>>>> much time this phase should take, i.e., at which point should I
>>>> start suspecting a problem with the computations, if any? I am
>>>> finding, for instance, that the sort rows/columns loop in
>>>> spm_get_vc alone takes around 190 minutes.
>>>>
>>>> Thanks,
>>>>
>>>> Bruno
>>>>
>>>>
>>>>
>>>
>>
>>
>