Thanks Jesper.
I will lower --ol_nstd as far as I can without triggering the error. Regarding myslspec.txt, it was my mistake to post it "here" like that.
I also have two more questions:
Regarding --ol_nstd, wouldn't it in general be better to also implement an outlier mechanism based on the interquartile range?
And what is the best/most common criterion for excluding subjects based on motion, and which post-eddy output file should be used for this purpose?
Best,
Hamed
Dear Hamed,
> Dear experts
>
> I have a Philips DWI dataset with the following acquisition parameters: single band, 50 slices, ascending, 102 directions, multi-shell. I receive the following error with eddy (eddy_cuda; FSL 5.0.11) for a "few" subjects.
>
> eddy_cuda --imain=dwi.nii --mask=mask.nii --acqp=eddy_config.txt --index=eddy_indices.txt --bvecs=bvecs --bvals=bvals --topup=field --repol --niter=6 --fwhm=10,0,0,0,0,0 --data_is_shelled --cnr_maps --residuals --mporder=18 --slspec=myslspec.txt --s2v_niter=5 --s2v_lambda=5 --s2v_interp=trilinear --out=dwi_post_eddy
>
>
> ...................Allocated GPU # 0...................
> eddy: msg=ECScanManager::set_slice_to_vol_reference: ref index out of bounds
> terminate called after throwing an instance of 'EDDY::EddyException'
> what(): eddy: msg=ECScanManager::set_slice_to_vol_reference: ref index out of bounds
>
> When I run the command without the slice to vol correction part, i.e.,
>
> eddy_cuda --imain=dwi.nii --mask=mask.nii --acqp=eddy_config.txt --index=eddy_indices.txt --bvecs=bvecs --bvals=bvals --topup=field --repol --niter=6 --fwhm=10,0,0,0,0,0 --data_is_shelled --cnr_maps --residuals --out=dwi_post_eddy
>
> it finishes without error, but I do see in the dwi_post_eddy.eddy_outlier_report text file that slice no. 1 is an outlier in a lot of volumes. So I guess that this is the problem, and for the moment I can get around the error above by increasing --ol_nstd to 8. I was wondering whether this is the correct approach or whether I should take another one.
> If this is the correct approach, should I repeat this new setting for "all" my subjects in the dataset?
Yes, I have seen this exact error in some other data. In that case it was caused by a failure to find any volume suitable as a “shape reference”, which in turn was caused by the presence of outliers in every volume (one of the criteria for qualifying as a “shape reference” is having no outliers). In that data we could identify one MB-group of slices with consistently lower than expected intensity, which we attributed to timing mismatches/errors in the sequence.
The short-term solution in that case was the same as you suggest here: to increase the threshold for what is considered an outlier. The longer-term solution was a slight change to the acquisition that “fixed” the too-low intensity in that MB-group.
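To check whether one slice (or MB-group) really does have consistently low intensity, you can compare mean intensities per slice across all volumes. A minimal numpy sketch (the toy array stands in for your 4D DWI data, which in practice you would load from dwi.nii with a tool such as nibabel):

```python
import numpy as np

def slice_intensity_profile(data):
    """Mean intensity of each slice, averaged over all volumes.

    data : 4D array with axes (x, y, z, volume).
    Returns a 1D array of length n_slices. A slice (or MB-group of
    slices) with a consistently lower value than its neighbours is a
    candidate explanation for being flagged as an outlier everywhere.
    """
    return data.mean(axis=(0, 1, 3))

# Toy example: 50-slice data where slice 0 is systematically darker.
rng = np.random.default_rng(0)
data = rng.normal(loc=100.0, scale=1.0, size=(4, 4, 50, 10))
data[:, :, 0, :] *= 0.5                # suppress the bottom slice
profile = slice_intensity_profile(data)
print(int(np.argmin(profile)))         # -> 0
```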
In your case, since it is the first slice, I can see you either increasing the outlier threshold or discarding the first slice before topup/eddy or any other analysis. A threshold of 8 is very high and means that you might miss some actual outliers. It is conceivable that losing the basal-most slice is the better option, but you be the judge of that.
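If you do decide to discard the basal-most slice, it must be removed from the raw data before topup/eddy, and every file that encodes the slice count (e.g. myslspec.txt) must be updated to match. A minimal numpy sketch of the trimming step; for a 50-slice image the FSL command-line equivalent would be along the lines of fslroi dwi dwi_trim 0 -1 0 -1 1 49 (check the slice count against your own data):

```python
import numpy as np

def drop_bottom_slice(data):
    """Remove slice index 0 along the z axis of a 4D (x, y, z, vol) array.

    After this, myslspec.txt and any slice-count-dependent settings
    should refer to 49 slices, not 50.
    """
    return data[:, :, 1:, :]

data = np.zeros((4, 4, 50, 10))        # placeholder for the loaded DWI data
trimmed = drop_bottom_slice(data)
print(trimmed.shape)                   # -> (4, 4, 49, 10)
```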
> Another issue is the myslspec.txt file. Here is its content: 0,1,2,3,.....49. I wonder whether it should start from 0 (as it is now) or from 1?
Yes, it should start with 0. Also, it should have no commas and should be a “column vector”, i.e. cat myslspec.txt should give
0
1
2
3
etc.
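For a single-band, ascending acquisition the file can be generated with a few lines of Python (a sketch; the slice count of 50 and the filename are taken from your description):

```python
n_slices = 50

# Single-band ascending: slice k is acquired k-th, so the temporal
# ordering is simply 0, 1, ..., 49, one slice index per line.
with open("myslspec.txt", "w") as f:
    for k in range(n_slices):
        f.write(f"{k}\n")
```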
Jesper
>
> Best,
> Hamed