Thanks Matthew,
I found an error in the design matrix.
Cheers
Inge
On Mon, August 9, 2010 15:50, Matthew Webster wrote:
> Hello,
> The TFCE algorithm involves an integral with a step size based
> on the peak-statistic value in the first permutation. If a
> subsequent permutation has much larger statistic values than
> those in the first permutation, this integral can take an
> extremely long time to complete. It might be worth checking the
> input design for problems, as an error in the design can often
> cause this behaviour.
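
[Editor's note: the behaviour Matthew describes can be sketched roughly as follows. This is an illustrative model only, not FSL's actual implementation — the function name, the fixed step count of 100, and the step-size rule are all assumptions based purely on his description. The two peak values come from the debug log further down in this thread.]

```python
# Illustrative sketch (NOT FSL's code): if the TFCE integration step dh is
# fixed from the first permutation's peak statistic, then a later permutation
# with a much larger peak needs proportionally more integration steps.

def tfce_step_count(first_perm_peak, current_peak, n_steps_first=100):
    """Steps needed for the current permutation when dh was chosen so that
    the FIRST permutation's range was covered in n_steps_first steps."""
    dh = first_perm_peak / n_steps_first   # step size set once, from permutation 1
    return current_peak / dh               # steps to cover the current peak

# From the debug log below: peak ~2.16e-15 on permutation 1 (essentially zero,
# symptomatic of a design error), peak ~4.0 on permutation 2.
steps = tfce_step_count(2.16111e-15, 4.00092)
print(f"{steps:.3g}")   # astronomically many steps, so randomise appears to hang
```

A near-zero unpermuted statistic (as in the log) makes dh vanishingly small, which is why the very next permutation triggers the "large number of integral steps" warning.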
>
> Many Regards
>
> Matthew
>
>> Hi list,
>> randomise is giving me a warning and then stalling at permutation 2:
>> "Warning: tfce has detected a large number of integral steps. This
>> operation may require a great deal of time to complete."
>> It seems tfce is somehow at fault, as everything runs fine unless I use
>> the --T2 or -T options. Same behaviour on FSL 4.1.6, 4.1.5 and 4.1.1.
>>
>> Tips greatly appreciated!
>>
>> Cheers
>> Inge
>>
>> _______________________________
>> randomise options: -i all_FA_diff_skeletonised.nii.gz -m
>> mean_FA_skeleton_mask -d fsgd/diff/design.mat -t fsgd/diff/design.con -n
>> 10000 -x --T2 -o res_diff/FA -n 40 -o res_diff/FA_SEED250 --seed=250
>> Seeding with 250
>> Loading Data:
>> Data loaded
>> 1.14956e+11 permutations required for exhaustive test of t-test 1
>> Doing 40 random permutations
>> Starting permutation 1 (Unpermuted data)
>> Starting permutation 2
>> Warning: tfce has detected a large number of integral steps. This
>> operation may require a great deal of time to complete.
>>
>> ___________________________________________
>> With --debug I get this output:
>>
>> randomise -i all_FA_diff_skeletonised.nii.gz -o res_diff/FA -m
>> mean_FA_skeleton_mask -d fsgd/diff/design.mat -t fsgd/diff/design.con -n
>> 10000 --T2 -x -V --debug
>> randomise options: -i all_FA_diff_skeletonised.nii.gz -o res_diff/FA -m
>> mean_FA_skeleton_mask -d fsgd/diff/design.mat -t fsgd/diff/design.con -n
>> 10000 --T2 -x -V --debug
>> Loading Data: ********************************************
>> Data loaded
>> Confounds detected.
>> Subject | Design | group | label
>> 1.000000 1.363636 1.000000 1.000000
>> 2.000000 1.363636 1.000000 1.000000
>> 3.000000 1.363636 1.000000 1.000000
>> 4.000000 1.363636 1.000000 1.000000
>> 5.000000 1.363636 1.000000 1.000000
>> 6.000000 1.363636 1.000000 1.000000
>> 7.000000 1.363636 1.000000 1.000000
>> 8.000000 1.363636 1.000000 1.000000
>> 9.000000 1.363636 1.000000 1.000000
>> 10.000000 1.363636 1.000000 1.000000
>> 11.000000 1.363636 1.000000 1.000000
>> 12.000000 1.363636 1.000000 1.000000
>> 13.000000 1.363636 1.000000 1.000000
>> 14.000000 1.363636 1.000000 1.000000
>> 15.000000 -0.636364 1.000000 2.000000
>> 16.000000 -0.636364 1.000000 2.000000
>> 17.000000 -0.636364 1.000000 2.000000
>> 18.000000 -0.636364 1.000000 2.000000
>> 19.000000 -0.636364 1.000000 2.000000
>> 20.000000 -0.636364 1.000000 2.000000
>> 21.000000 -0.636364 1.000000 2.000000
>> 22.000000 -0.636364 1.000000 2.000000
>> 23.000000 -0.636364 1.000000 2.000000
>> 24.000000 -0.636364 1.000000 2.000000
>> 25.000000 -0.636364 1.000000 2.000000
>> 26.000000 -0.636364 1.000000 2.000000
>> 27.000000 -0.636364 1.000000 2.000000
>> 28.000000 -0.636364 1.000000 2.000000
>> 29.000000 -0.636364 1.000000 2.000000
>> 30.000000 -0.636364 1.000000 2.000000
>> 31.000000 -0.636364 1.000000 2.000000
>> 32.000000 -0.636364 1.000000 2.000000
>> 33.000000 -0.636364 1.000000 2.000000
>> 34.000000 -0.636364 1.000000 2.000000
>> 35.000000 -0.636364 1.000000 2.000000
>> 36.000000 -0.636364 1.000000 2.000000
>> 37.000000 -0.636364 1.000000 2.000000
>> 38.000000 -0.636364 1.000000 2.000000
>> 39.000000 -0.636364 1.000000 2.000000
>> 40.000000 -0.636364 1.000000 2.000000
>> 41.000000 -0.636364 1.000000 2.000000
>> 42.000000 -0.636364 1.000000 2.000000
>> 43.000000 -0.636364 1.000000 2.000000
>> 44.000000 -0.636364 1.000000 2.000000
>>
>> 1.14956e+11 permutations required for exhaustive test of t-test 1
>> Input Design:
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> 0.500000 -0.707107 0.136364
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>> -0.500000 -0.707107 -0.863636
>>
>> Input Contrast:
>> 1.000000 0.000000 0.000000
>>
>> Contrast rank: 1
>> Dof: 42 original dof: 42
>> Doing 10000 random permutations
>> Starting permutation 1 (Unpermuted data)
>> statistic Maximum: 2.16111e-15
>> Starting permutation 2
>> statistic Maximum: 4.00092
>>
>> and then it hangs.
>>
>