Hi Matteo
You need to provide a qshells file only when the shells have different numbers of data points (which does not seem to be the case in your example). The file should have exactly 3 entries corresponding to the number of volumes acquired in each shell. The "number of volumes in each shell" includes the diffusion-weighted volumes, but also any b=0 volumes acquired with the same TE (for ODF estimation the normalised signal attenuation is used). If TE is constant across shells (as in the HCP data), you can distribute the b0s equally across the shells.
In your case, it seems that if you arrange each shell to have 96 volumes (90 DW plus 6 b0s), you will not need a qshells file at all. Let me know how that works.
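As a quick sanity check, the bookkeeping above can be sketched in a few lines of Python. This is not part of qboot itself; it is just a hypothetical helper that reads shell assignments from a list of b-values (shells at b=1000/2000/3000 and an equal split of b0s are assumptions based on the HCP acquisition described here):

```python
# Sketch: count volumes per shell from an FSL-style list of b-values,
# distributing the b=0 volumes equally across the shells (valid here
# because TE is constant across shells, as in the HCP data).
from collections import Counter

def shell_counts(bvals, shells=(1000, 2000, 3000), b0_thresh=100):
    """Return the per-shell volume counts to put in a qshells file."""
    counts = Counter()
    n_b0 = 0
    for b in bvals:
        if b < b0_thresh:
            n_b0 += 1  # treat low b-values as b=0 volumes
        else:
            # assign each DW volume to the nearest nominal shell
            counts[min(shells, key=lambda s: abs(s - b))] += 1
    # spread the b0s evenly across the shells
    for s in shells:
        counts[s] += n_b0 // len(shells)
    return [counts[s] for s in shells]

# HCP-like example: 18 b0s plus 90 DW volumes per shell
print(shell_counts([0] * 18 + [1000] * 90 + [2000] * 90 + [3000] * 90))
# -> [96, 96, 96]: all shells equal, so no qshells file is needed
```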
Note that a couple of changes will be incorporated in the next minor FSL release (within the next month or so), among them a bug fix for the multi-shell CSA ODFs.
Cheers
Stam
On 12 Aug 2013, at 12:12, Matteo Diano wrote:
> Dear FSL experts,
> I am trying to perform qboot estimation on HCP data. I have already sorted the data into subdivided b-value sections (b0, b1000, b2000, b3000), but I get an error while the qshell.txt file is being read:
>
> Reading qshells file...
> At least one b=0 image is required! Exiting now!
>
> My qshell.txt contains these values:
>
> 18 90 90 90
>
> Could you please suggest the right way to specify the values in qshell.txt? Thank you very much in advance.
> Regards
>
> Matteo