dear SPMers,
a few weeks ago there was a question about a paradigm with scan pauses. i
could not find any reply to it (nor could i find anything in the
archives). i have to ask almost the same question, because our paradigm
requires scan pauses (due to the noise of the scanner).
we explored the following paradigm:
60 scans baseline - 5 min pause with sensory testing - 108 scans
stimulation - 15 min pause with sensory testing - 60 scans - 15 min pause
with sensory testing - 60 scans.
we want to assess the BOLD-signal changes after the stimulation over a time
period of 50 minutes.
now we face the problem that the time series is interrupted.
here are my questions:
1. is it absolutely impossible to run a paradigm with scanning pauses
(and if so, why)?
2. is there a time limit for a pause that would still be tolerable?
3. is there any method to test whether the pause is really a problem
(and not just a theoretical one)?
4. can we do something to solve this problem (e.g. correction for global
changes)?
5. if we have to treat the 4 blocks as individual sessions, is it possible
to use the first session (baseline) as the baseline condition and contrast
all the other sessions against it?
i would appreciate any comment on this very much (and even more so if it
is not too technical).
thanks a lot
markus