Dear Experts,
First, thanks to Tom for the answer to my last question.
If I understand correctly, the high-pass filter cut-off should be (a) relatively low
to preserve experimental variance, and (b) relatively high to remove low-frequency noise.
While (b) seems to depend on scanner properties, (a) depends on the design matrix.
In the resources I have read (and talks I have heard), there are differing
suggestions concerning the optimal cut-off value depending on the design, e.g.:
a) twice the maximum (over conditions) of the minimum interval between two
instances of any particular trial type
b) twice the minimum of the maximum interval between corresponding conditions
c) twice the maximum SOA between instances of the most frequently occurring condition
d) depending on the applied contrast:
e.g., the interval between an experimental block and its corresponding
baseline (as determined by the applied contrast vector)
Which suggestion is correct?
I have a blocked design (2x2 factorial) with a block duration of 30 seconds, in
pseudorandomized order. Following the rules above I would go with either
a) 2 * 60 s
b) 2 * 240 s
c) 2 * 240 s
d) (do I have to take the maximum/minimum/mean of these intervals?)
I feel that the cutoff should depend on the applied contrast,
because when I contrast two conditions (i.e. add/subtract regressors) I am looking
for effects in a higher frequency range and could therefore safely increase
the cutoff value. But that's just a guess.
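For what it's worth, SPM implements the high-pass filter by regressing out a discrete cosine basis set: every DCT component with a period longer than the chosen cutoff is removed from the data. A minimal numpy sketch of that idea (the scan parameters, drift period, and signal period below are illustrative, not from my design):

```python
import numpy as np

def dct_highpass(y, tr, cutoff):
    """Remove fluctuations slower than `cutoff` seconds by regressing out
    a discrete-cosine basis (the approach SPM uses for high-pass filtering)."""
    n = len(y)
    # Number of DCT components with period >= cutoff; component j has
    # period 2*n*tr/j seconds, so keep j = 1 .. floor(2*n*tr/cutoff).
    k = int(np.floor(2 * n * tr / cutoff)) + 1
    t = np.arange(n)
    X = np.array([np.cos(np.pi * j * (2 * t + 1) / (2 * n))
                  for j in range(1, k)]).T
    if X.size == 0:
        return y  # cutoff longer than the run: nothing to remove
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Illustration: a slow drift (400 s period) is removed with a 128 s cutoff,
# while a faster blocked signal (60 s period) largely survives.
tr, n = 2.0, 200
t = np.arange(n) * tr
drift = np.sin(2 * np.pi * t / 400)
signal = np.sin(2 * np.pi * t / 60)
filtered = dct_highpass(drift + signal, tr, cutoff=128.0)
```

This is why the cutoff is usually stated as a period in seconds: lengthening it shrinks the DCT basis and removes less low-frequency variance, shortening it removes more.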
Thank you for any advice and comments,
andi