Dear Michael,

Milliseconds & delays: to pick up the issue of slice timing, shouldn't we account for the time occupied by the slice selection gradient (say TG) within the TA, i.e. at the beginning of each run? Sure, that's just a short period, and the error will decrease with longer TRs and TAs and with many slices. However, physically the readout of the slices occurs right between the slice selection gradient and the delay period, so there seems to be no reason to neglect it.
Acquisition-repetition time: TA = TG + TRslices (= TR x Nslices) + TD (delay)
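To make the arithmetic concrete, here is a minimal sketch of what including TG would mean for per-slice acquisition onsets. All numbers and names (tg, t_slice, n_slices, td) are illustrative assumptions, not parameters of any particular scanner or sequence:

```python
# Sketch: per-slice readout onsets within one acquisition-repetition period,
# assuming TA = TG + n_slices * t_slice + TD and that slice i is read out
# after the initial gradient period plus i preceding slices.
def slice_onsets(tg, t_slice, n_slices):
    return [tg + i * t_slice for i in range(n_slices)]

# Hypothetical example: 10 ms gradient, 60 ms per slice, 30 slices, 100 ms delay
tg, t_slice, n_slices, td = 0.010, 0.060, 30, 0.100
onsets = slice_onsets(tg, t_slice, n_slices)
ta = tg + n_slices * t_slice + td  # total acquisition-repetition time

# Neglecting TG shifts every onset earlier by tg (here 10 ms) -- small
# relative to TA, but a systematic offset rather than random noise.
```

In this toy case the offset from ignoring TG is a constant 10 ms across all slices, which is why a flexible temporal model can absorb it, but a rigid one might not.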
At least theoretically, this could reduce the variance of results across different scanners and sequences. That said, if you use a rather flexible model, it should not matter much. Just a thought. What do the experts think?
Warm regards- andreas
Andreas Bartsch MD
Neuroradiology & Psychiatry, BJMU Wuerzburg