> If you are really determined to perform only one resampling per EPI scan it gets a bit fiddly, because FSL usually performs the first-level modelling in subject space, followed by a resampling of the parameter images (copes and varcopes) into standard space. I imagine I would go about it something like this:
>
> 1. Run mcflirt, save the transformation matrices, and generate a mean motion-corrected volume.
> 2. Distortion correct the mean volume.
> 3. Register the mean volume to the highres.
> 4. Register highres->standard (using flirt+fnirt).
> 5. Split your uncorrected 4D->3D.
> 6. For each 3D EPI volume
> {
> Combine the individual mcflirt matrix and the mean->highres matrix using convert_xfm.
> Create a single transform using convertwarp (see the online documentation).
> This step needs to incorporate all transforms generated above (including the "shiftmap" from the fieldmap and the combined matrix generated above).
> Run applywarp using the resulting transform.
> }
>
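For concreteness, the quoted recipe could be sketched in shell roughly as follows. All file names (func, fmap_shift, func2highres.mat, highres2standard_warp, mc/MAT_*) are hypothetical, the --shiftdir value depends on your phase-encode direction, and the commands are only echoed so the logic can be inspected without running FSL:

```shell
# Hypothetical names throughout; echoes each FSL command instead of
# running it (drop the echo in run() to execute for real).
cmds=""
run() { cmds="$cmds$*"$'\n'; echo "$@"; }

run fslsplit func vol_ -t                # split uncorrected 4D -> 3D
for i in 0000 0001 0002; do              # one mcflirt matrix per volume
  # combine the per-volume mcflirt matrix with the mean->highres affine
  # (convert_xfm applies the right-hand matrix first)
  run convert_xfm -omat vol2highres_$i.mat \
      -concat func2highres.mat mc/MAT_$i
  # fold the fieldmap shiftmap, the combined affine and the nonlinear
  # highres->standard warp into a single displacement field; the
  # shiftmap is applied first, i.e. the fieldmap correction comes first
  run convertwarp --ref=standard --shiftmap=fmap_shift --shiftdir=y- \
      --premat=vol2highres_$i.mat --warp1=highres2standard_warp \
      --out=vol2standard_warp_$i
  # one resampling per volume
  run applywarp --ref=standard --in=vol_$i \
      --warp=vol2standard_warp_$i --out=volstd_$i --interp=spline
done
run fslmerge -t func2standard volstd_0000 volstd_0001 volstd_0002
```

Since convertwarp composes the shiftmap ahead of the other transforms, this matches the "fieldmap first" ordering described above.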
Thanks, Jesper. That matches up with Mark's suggestion. I just wasn't sure of the exact implementation, especially with fieldmaps.
>> Assuming this is indeed what he was suggesting, my concern was incorporating the other aspects of pre-stats in the "best" order (i.e., grand mean scaling, temporal filtering, brain extraction, and potentially, fieldmaps).
>
> The fieldmap should be applied first (which convertwarp will do). The other steps you describe should all be applied after all the steps above have been performed.
>
> In conclusion, what you want to do is doable. My personal opinion though is that you are causing yourself lots of extra headache for a very marginal (possible) gain. I would just go with the regular pipeline if I were you.
Thanks for the advice. Do you think the (potential) gain would depend on data resolution? I.e., if I had high-res data with voxels ~8 times smaller (by volume) than conventional fMRI, would these extra steps help more? I need everyone in a common reference frame (i.e., MNI/ICBM space), but I would like to keep interpolation (and smoothing) to a minimum. For this particular analysis, I am only using FSL/FEAT for pre-stats and registration.
Cheers,
David
>
> Good luck,
> Jesper
>
>
>>
>>
>> On Jun 15, 2011, at 2:39 AM, Jesper Andersson wrote:
>>
>>> Dear David,
>>>
>>>> Thanks for the feedback. I think this all makes sense, but I do have a couple of follow-up questions.
>>>>
>>>> 1) Could I re-run prestats, turn off the mcflirt option, and use the transforms estimated previously (e.g., prestats_with_mc.feat/mc/prefiltered_func_data_mcf.mat/MAT_0000)? I still want to temporally filter and brain extract the data. However, I suspect brain extraction might be suboptimal if done before motion correction, but it seems like the registration might also be suboptimal if non-brain material is left in the image.
>>>
>>> I am not sure I follow you here. Non-brain stuff is an issue mainly for intersubject or intermodality registration. For motion correction it is not an issue, and you can safely motion correct non-betted data. Or am I misunderstanding you?
>>>
>>>> 2) We've created a study-specific EPI template (using ANTs; see http://www.duke.edu/~dvs3/MNIdiffeo.nii.gz). We were hopeful this would serve as a good reference image for registration -- but it is a brain-only image, which isn't ideal for FNIRT according to the documentation. Alternatively, we have a similar study-specific template created from our T1 scans (brain extracted and whole head). Which template would be better, especially if one of our goals is to interpolate the functional data as little as possible?
>>>
>>> This would make no difference from the interpolation perspective. In both cases you can perform a single resampling from functional space to standard space and if the standard space (as defined by the sampling grid) is identical for your EPI and your T1 template the interpolation will be identical.
>>>
>>> As for accuracy of the transform to standard space your two options are func->EPI_template or func->struct->T1_template. It is very hard to say which of these is better. I have no doubt that the struct->T1_template transform is more accurate than the func->EPI_template transform. But then there is also the func->struct transform that adds uncertainty to that stream. In general I would say that if you have very small EPI distortions and/or if you are correcting the distortions using a fieldmap then func->struct->T1_template is better. If not, I think the jury is still out.
>>>
>>> I might take this opportunity to reiterate for the benefit of everyone: always acquire and use fieldmaps!
>>>
>>> Good luck,
>>> Jesper
>>>
>>>
>>>>
>>>>
>>>> On Jun 14, 2011, at 4:19 PM, Mark Jenkinson wrote:
>>>>
>>>>> Hi David,
>>>>>
>>>>> If you want your data in standard space then FNIRT is
>>>>> definitely the way to go. So you could follow the
>>>>> same procedure but split the original (pre-motion-correction)
>>>>> data into individual volumes and then use the final applywarp
>>>>> call, but you'll need to use a different --premat matrix.
>>>>> It needs to be a combination of the appropriate
>>>>> mcflirt transformation matrix with the
>>>>> example_func2highres.mat using convert_xfm (with
>>>>> the -concat option). I hope that makes sense.
>>>>> Oh, and you can also try the spline interpolation
>>>>> option with applywarp.
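Spelled out for a single volume, the --premat combination might look like this. The matrix and image names (vol0000, MAT_0000, example_func2highres.mat) are hypothetical, and the commands are built as strings and echoed rather than executed:

```shell
# convert_xfm -omat OUT -concat B A composes "A first, then B", so the
# per-volume mcflirt matrix (MAT_0000, hypothetical name) goes last.
c1="convert_xfm -omat vol0000_2highres.mat -concat example_func2highres.mat MAT_0000"
# The combined affine then feeds applywarp as --premat, ahead of the
# nonlinear highres->standard warp, with spline interpolation.
c2="applywarp --ref=standard --in=vol0000 --out=vol0000_standard --warp=highres2standard_warp --premat=vol0000_2highres.mat --interp=spline"
echo "$c1"
echo "$c2"
```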
>>>>>
>>>>> All the best,
>>>>> Mark
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On 14 Jun 2011, at 19:56, David V. Smith wrote:
>>>>>
>>>>>> Hi Mark,
>>>>>>
>>>>>> Sorry -- I forgot to specify that "data" was my filtered_func_data file from Pre-Stats.
>>>>>>
>>>>>> OK, I understand what you're saying. We certainly want to reduce interpolation as much as possible -- e.g., no smoothing and I made a standard template that has approximately the same resolution as our functional data (1.8 x 1.8 x 1.8 mm). However, we want the best possible registration, so I imagine FNIRT is the way to go.
>>>>>>
>>>>>> Thanks,
>>>>>> David
>>>>>>
>>>>>>
>>>>>> On Jun 14, 2011, at 2:38 PM, Mark Jenkinson wrote:
>>>>>>
>>>>>>> Dear David,
>>>>>>>
>>>>>>> I'm not sure what "data" is in the last call, but what you have is
>>>>>>> correct. What is different from the previous question is that you
>>>>>>> do not include any motion correction here, as motion correction
>>>>>>> has a separate matrix for every timepoint. However, if you
>>>>>>> wanted to combine motion correction and warping to standard
>>>>>>> space with one transformation (and hence reduce interpolation
>>>>>>> effects) then you'd need to do what I suggested in the previous
>>>>>>> email.
>>>>>>>
>>>>>>> If you do not have 4D data or you are doing smoothing on the
>>>>>>> data after it is resampled like this, then you don't really need to
>>>>>>> worry about anything else and the commands you are running
>>>>>>> here will be absolutely fine.
>>>>>>>
>>>>>>> All the best,
>>>>>>> Mark
>>>>>>>
>>>>>>>
>>>>>>> On 14 Jun 2011, at 18:49, David V. Smith wrote:
>>>>>>>
>>>>>>>> Hi Mark,
>>>>>>>>
>>>>>>>> I have a follow-up question. I used FNIRT to normalize my preprocessed 4D data prior to analysis (code below). My output looks OK, but I didn't use applyxfm4D. Does this matter, given that I used FNIRT on data that had already been preprocessed (including motion correction)? Please let me know if you think I did anything wrong.
>>>>>>>>
>>>>>>>> ${FSLDIR}/bin/fnirt --in=highres --aff=highres2standard.mat --cout=highres2standard_warp --iout=highres2standard --jout=highres2standard_jac --config=T1_2_MNI152_2mm --ref=standard --refmask=standard_mask --warpres=9,9,9 --applyrefmask=0,1,1,1,1,1
>>>>>>>> ${FSLDIR}/bin/convert_xfm -inverse -omat standard2highres.mat highres2standard.mat
>>>>>>>> ${FSLDIR}/bin/convert_xfm -omat example_func2standard.mat -concat highres2standard.mat example_func2highres.mat
>>>>>>>> ${FSLDIR}/bin/applywarp --ref=standard --in=example_func --out=example_func2standard --warp=highres2standard_warp --premat=example_func2highres.mat --interp=sinc
>>>>>>>> ${FSLDIR}/bin/convert_xfm -inverse -omat standard2example_func.mat example_func2standard.mat
>>>>>>>> ${FSLDIR}/bin/applywarp --ref=standard --in=data --out=data2standard --warp=highres2standard_warp --premat=example_func2highres.mat
>>>>>>>>
>>>>>>>> (Note that I'm not using FEAT for subsequent analyses, so the conventional approach of normalizing the cope images is not what I need here.)
>>>>>>>>
>>>>>>>> Thanks,
>>>>>>>> David
>>>>>>>>
>>>>>>>>
>>>>>>>> On Jun 14, 2011, at 12:47 PM, Mark Jenkinson wrote:
>>>>>>>>
>>>>>>>>> Hi,
>>>>>>>>>
>>>>>>>>> I'm afraid applywarp will only take one matrix at a time.
>>>>>>>>> The tool applyxfm4D would take a set of matrix files, but
>>>>>>>>> unfortunately does not yet support spline interpolation.
>>>>>>>>> We will be putting spline interpolation into these tools in
>>>>>>>>> the future though, but at the moment you need to do
>>>>>>>>> fslsplit, loop applywarp, then fslmerge.
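As a sketch, the fslsplit / applywarp / fslmerge workaround might look like this. File names are hypothetical, and the commands are only echoed here so the loop can be checked without FSL installed:

```shell
# applywarp with only a --premat applies just the affine, which is the
# usual way to get spline interpolation with per-volume mcflirt matrices.
cmds=""
run() { cmds="$cmds$*"$'\n'; echo "$@"; }

run fslsplit func vol_ -t                 # 4D -> one 3D file per volume
for i in 0000 0001 0002; do               # one mcflirt matrix per timepoint
  run applywarp --ref=example_func --in=vol_$i --out=volmc_$i \
      --premat=mc/MAT_$i --interp=spline
done
run fslmerge -t func_mc volmc_0000 volmc_0001 volmc_0002
```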
>>>>>>>>>
>>>>>>>>> All the best,
>>>>>>>>> Mark
>>>>>>>>>
>>>>>>>>>
>>>>>>>>> On 14 Jun 2011, at 17:39, Satrajit Ghosh wrote:
>>>>>>>>>
>>>>>>>>>> hi
>>>>>>>>>>
>>>>>>>>>> if i run mcflirt on a 4d time series and generate the list of mat files for the transforms, can i then send the single 4d file and the list of transforms to applywarp (in order to use spline interpolation)? or will i need to split the 4d file into 3d files, apply the transforms individually and then merge them back?
>>>>>>>>>>
>>>>>>>>>> cheers,
>>>>>>>>>>
>>>>>>>>>> satra
>>>>>>>>>>
>>>>>>>>
>>>>>>
>>>>
>>