If I understand correctly, high-pass filtering is useful for removing scanner drift while preserving task-related frequencies (which are expected to be relatively high in event-related and fast event-related designs). However, this filter can also help clean the data of low-frequency physiological influences, am I correct?
Thus, ideally, the cutoff should be chosen mainly on the basis of the task-related frequencies...
My starting issue was to understand whether the optimal cutoff for my data could be lower than 128 s. While comparing results across different cutoffs (as suggested here [http://mindhive.mit.edu/node/116] for very short trials), I obtained a consistent pattern between 128 s and 32 s, but with the 32 s cutoff some significant clusters seem better defined in terms of their spatial distribution and bilaterality... However, at the moment I have no principled reason to prefer 32 s (or any other cutoff) over the default 128 s, and that is what I am trying to work out.
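For what it's worth, one way I've been thinking about sanity-checking a candidate cutoff is to see how much of an idealized task regressor would survive the filter. Here is a minimal numpy sketch; the TR, the 40 s on/off block design, and the ideal FFT-based high-pass are all illustrative assumptions (not my actual design, and not SPM's DCT filter):

```python
import numpy as np

TR = 2.0   # assumed repetition time (s)
n = 300    # assumed number of scans
t = np.arange(n) * TR

# Hypothetical blocked design: 40 s on / 40 s off,
# i.e. period 80 s, fundamental frequency 1/80 = 0.0125 Hz.
task = (np.floor(t / 40) % 2).astype(float)
task -= task.mean()

def highpass(signal, cutoff_s, tr):
    """Ideal FFT high-pass: zero all frequencies below 1/cutoff_s Hz."""
    freqs = np.fft.rfftfreq(len(signal), d=tr)
    spec = np.fft.rfft(signal)
    spec[freqs < 1.0 / cutoff_s] = 0.0
    return np.fft.irfft(spec, n=len(signal))

retained = {}
for cutoff in (128.0, 32.0):
    filtered = highpass(task, cutoff, TR)
    retained[cutoff] = filtered.var() / task.var()
    print(f"cutoff {cutoff:>5.0f} s: "
          f"{100 * retained[cutoff]:.1f}% of task variance retained")
```

With these assumed numbers, the 128 s cutoff (1/128 ≈ 0.008 Hz) passes the 0.0125 Hz fundamental almost untouched, whereas the 32 s cutoff (1/32 ≈ 0.031 Hz) removes the fundamental entirely and keeps only the higher harmonics, which for a boxcar carry only a minority of the variance. Of course, for fast event-related designs the task energy sits much higher and a 32 s cutoff may be harmless; that is exactly the question.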
If your advice is to keep 128 s (or to run the phantom test) with the sole aim of removing scanner drift, then I won't pursue this further.
Thanks for your responses.
David