Thanks for the quick answer, Mark,
but I've still got some questions.
On Fri, 13 Feb 2004, Mark Jenkinson wrote:
> Hi,
>
> Sounds like you are almost there with your monkey data.
> If you can do your own motion correction and get a reasonable
> bet result with 0.18 for -f then there isn't much more to fix.
>
> Firstly to registration. You will need to define a "standard space"
> to do your higher level analysis in. Obviously our human brain
> template is not useful for this. Instead I suggest you pick a single
> brain image from your monkey data (presumably an MPRAGE
> that looks nice) and, for the sake of argument, call this "standard
> space" for your experiment. Having picked this image you just
> need to use it in the Standard Space tab of the Feat registration.
> You also want to put in the individual monkey MPRAGE images
> in the "Main structural image" tab. If these images look OK then
> flirt should be able to align them OK.
Is FLIRT still necessary if the images are already realigned and thus
motion corrected? (I am using the Feat GUI.) Apropos templates: that is no
problem, since we do have EPI and MPRAGE templates of our monkeys, and
I've created "pure-brain" images from them with bet.
> One difficulty is that it will still try to register the functional images
> to the structural ones. You don't want this, but unfortunately, it
> can't be turned off. However, once registration is run you can
> overwrite the results by replacing the example_func2highres.mat
> file with the following contents:
> 1 0 0 0
> 0 1 0 0
> 0 0 1 0
> 0 0 0 1
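For reference, the overwrite can be scripted from the shell. A minimal sketch, assuming a hypothetical Feat output directory called myrun.feat (adjust the path to your own output directory):

```shell
# The EPIs are already coregistered with the MPRAGE, so the
# func->structural transform can be reset to the identity matrix.
# ("myrun.feat" is a hypothetical Feat output directory name.)
mkdir -p myrun.feat/reg
cat > myrun.feat/reg/example_func2highres.mat <<'EOF'
1 0 0 0
0 1 0 0
0 0 1 0
0 0 0 1
EOF
```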
It is not clear to me when to do the overwrite (I'm using the GUI): is it
just before the second-level analysis, which takes all runs into
consideration, or even earlier? Because, if I'm right, the registration
part comes at the end of the first-level analysis stage, doesn't it?
> This effectively resets the transformation of the example_func
> to MPRAGE, as you say they are already aligned. Once you've
> done this you'll need to regenerate example_func2standard.mat
> which, in this case is easy, just copy the contents of highres2standard.mat
> into example_func2standard.mat - that is:
> cp highres2standard.mat example_func2standard.mat
>
> This will then set things up correctly for higher-level feat analyses.
> It won't, however, fix the images on the report.html page - to do this
> you need to regenerate the gif images made by slicer - see the
> report.com file and extract the appropriate commands from there
> if you need this.
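One way to pick those commands out is to grep report.com for the image-generating calls. A sketch, using a hypothetical excerpt of report.com (in a real .feat directory the file already exists, with different paths and options):

```shell
# Hypothetical excerpt: in a real .feat directory, report.com records
# the commands that built the report images.
cat > report.com <<'EOF'
/usr/local/fsl/bin/slicer example_func2highres highres -x 0.5 sla.png
/usr/local/fsl/bin/pngappend sla.png + slb.png example_func2highres1.png
EOF

# Pull out just the image-generating commands so they can be re-run
# by hand after the .mat files have been edited.
grep -E 'slicer|pngappend' report.com > regen_pics.sh
```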
I guess fslview will then show the right images?
> Now, the BET issue. You shouldn't need to use pre-threshold
> masking or change the brain/background percentage (although
> this probably won't be a big deal either way). What you do need
> to do is to apply your bet result to your functional series. This can
> be done with the following commands:
>
> ${FSLDIR}/bin/bet funcimage funcimage_brain -m -n -f 0.18
> ${FSLDIR}/bin/avwmaths funcimage_brain_mask -dil funcimage_mask
> ${FSLDIR}/bin/avwmaths functionalseries -mas funcimage_mask maskedseries
Small error, I guess (otherwise I'm completely lost), but shouldn't line 2
read: ${FSLDIR}/bin/avwmaths funcimage_brain -dil funcimage_mask ?
Moreover, what exactly does this command do? I did not see a real
difference between funcimage_brain and funcimage_mask (btw, our images
have a 1mm^3 voxel size). Is it some kind of smoothing?
>
> where I've used the filenames funcimage, functionalseries and maskedseries
> as your input single functional image, input 4D series and output 4D series
> respectively. Obviously use whatever appropriate names you like for these.
> This should sort out all of your bet problems.
>
> As for HRF/CARF I would suggest that you either put in sensible
> parameters for the Gamma or Gaussian HRF model, or better still,
> choose no convolution and do the convolution with a good CARF
> model yourself, outside of FEAT, then feed the resulting timecourses
> in directly (with the custom options).
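Doing the convolution outside FEAT can stay in the shell. A minimal sketch using awk: convolve a boxcar stimulus timecourse with a short negative-going kernel and write a one-column text file that could be fed to FEAT as a custom waveform. The filenames and kernel values here are purely illustrative, not a validated CARF model:

```shell
# Illustrative stimulus timecourse (boxcar, one value per volume)
# and a made-up negative-going kernel standing in for a CARF.
printf '%s\n' 0 0 1 1 1 1 0 0 0 0 > stimulus.txt
printf '%s\n' -0.2 -0.6 -1.0 -0.7 -0.4 -0.2 > carf_kernel.txt

# Causal discrete convolution: custom_ev.txt gets one value per
# input volume, suitable for FEAT's custom (one-entry-per-volume) EV.
awk 'NR==FNR {k[FNR]=$1; n=FNR; next}
     {s[FNR]=$1; m=FNR}
     END {for (t=1; t<=m; t++) {
            v=0
            for (j=1; j<=n && j<=t; j++) v += s[t-j+1]*k[j]
            printf "%.4f\n", v
          }}' carf_kernel.txt stimulus.txt > custom_ev.txt
```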
That indeed did the job, thanks very much,
Patrick
>
> I think this should be all.
> Let me know if you still have problems - and ideally mail it to the FSL
> email list ( http://www.jiscmail.ac.uk/lists/fsl.html ) so that others can
> see the issues and chip in with help.
>
> All the best,
> Mark
>
> Patrick De Maziere wrote:
>
> >Hello,
> >
> >I got a couple of small questions. I try to use FSL for analysing
> >monkey-data. As I've seen on the list, there's only little discussion
> >about it.
> >
> >As far as I can tell, FSL is not very well suited to analysing monkey
> >data, mainly because of 'problems' with motion correction and BET.
> >However, we have our own realignment routines, which work pretty well.
> >So there is no need for motion correction and thus no MCFLIRT problem.
> >
> >But, what about the brain extraction thing ?
> >Is it better to not extract anything at all and use the raw images in
> >the registration part of Feat ? Using BET with a fractional intensity
> >threshold of 0.18, I did manage to get good bet-images for both MPRAGE and
> >EPI images. However, combining this with a brain/background threshold of
> >1% and a BET-created binary as pre-threshold mask in Feat does not deliver
> >good results. Moreover, the report shows rather bad coregistrations. In
> >fact coregistration is not an issue for us, since using our method for
> >realignment also guarantees that the obtained EPIs are realigned and
> >coregistered with the MPRAGE. However, I did not manage to turn off
> >registration while still keeping the possibility to perform multisession
> >analyses (higher-level analysis).
> >
> >Any suggestions ?
> >
> >Last small question, in our monkey studies we do not use the BOLD
> >protocol, but we use a contrast agent (MION), which increases the CNR
> >considerably. However, the contrast agent response function (CARF) is not
> >at all similar to the double-gamma HRF. First, it evokes a negative
> >response, and second its decay is somewhat longer. I could circumvent this
> >problem by adopting the more general Gaussian-convolution and multiplying
> >the desired contrast by -1. But I'm certain that I lose power this way.
> >However, this is not my main problem; the registration stage is much
> >worse, because I really obtain damned bad images...
> >
> >Many thanks in advance,
> >Patrick
> >
> >
> >Ir. Patrick De Mazière
> >Research Engineer
> >Computational Neuroscience
> >Laboratory of Neurophysiology
> >K.U.Leuven Faculty of Medicine
> >Campus Gasthuisberg O & N
> >Herestraat, 49
> >B-3000 Leuven
> >Belgium
> >Tel : +32 16 34.59.61
> >Fax : +32 16 34.59.93
> >[log in to unmask]
> >http://simone.neuro.kuleuven.ac.be
> >
> >
> >