Something was said to me recently that has made me think, and I want to get my head around it a bit more...
At what point does high-pass filtering become ineffective? It hasn't been a concern for me in the past, but recently it has been suggested to me that a rule of thumb is to set the high-pass filter at a minimum of 1.5 x your longest SOA. That seems reasonable if this comes out equal to or less than 128 secs. I say that because I believe 128 secs is the optimum filter setting for removing low-frequency noise, including aliased cardiac/respiratory components. But what if you use a filter at a lower frequency (e.g., 260 secs)? It has been said to me that this would be OK. But aren't you running a risk of confusing signal of interest at frequencies lower than 0.01 Hz with cardiac/respiratory noise etc.? If this is an acceptable risk with a filter of, say, 260 secs, at what point does the risk become unacceptable and the filter ineffective? E.g., say you had an enormous longest SOA of 800 secs; you might want to use a filter set at 1000 secs (0.001 Hz) just to be sure. Would this filter be effectively redundant?
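Just to make the arithmetic of the rule of thumb concrete, here is a minimal sketch (the helper function and the example SOAs are hypothetical; the 1.5x factor and the 128 sec reference come from the suggestion above):

```python
# Sketch of the "1.5 x longest SOA" rule of thumb for picking a
# high-pass filter cutoff. hp_cutoff_seconds is a hypothetical
# helper, not a function from any analysis package.

def hp_cutoff_seconds(longest_soa_s, factor=1.5):
    """Suggested high-pass filter period (secs) from the longest SOA."""
    return factor * longest_soa_s

for soa in (40, 85, 173, 800):
    period = hp_cutoff_seconds(soa)
    freq = 1.0 / period
    note = ("<= 128 s, within the usual drift-removal setting"
            if period <= 128
            else "> 128 s, the regime the question is about")
    print(f"longest SOA {soa:>4} s -> cutoff {period:6.1f} s "
          f"({freq:.4f} Hz), {note}")
```

A longest SOA of 173 secs, for instance, is roughly where the rule lands you at the 260 sec filter mentioned above, and 800 secs would push the cutoff out to 1200 secs (0.0008 Hz).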
What other factors might speak to this? Jittered SOAs, for example? Using a range of SOAs would of course spread the signal of interest across frequencies, increasing sensitivity. Does this reduce the risk of confusing it with low-frequency noise, or are you still losing some of your signal at frequencies below 0.01 Hz (either to noise or to a 128 sec filter)? In what manner should this inform your high-pass filter? Should you be concerned with the longest SOA or the mean SOA (i.e., the fundamental frequency of your design)?
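One way to look at the jitter question is to compare the power spectrum of a fixed-SOA stick-function design with a jittered one and ask how much design power falls below a 1/128 Hz cutoff. This is only a toy sketch: the 32 sec mean SOA, the uniform 16-48 sec jitter range, and the sampling parameters are all assumptions for illustration, and no HRF convolution is included.

```python
import numpy as np

rng = np.random.default_rng(0)
dt = 1.0      # sampling interval in secs (TR-like)
n = 4096      # run length in samples

def design_power(onsets_s, n=n, dt=dt):
    """Power spectrum of a stick-function design with the given onsets (secs)."""
    x = np.zeros(n)
    idx = (np.asarray(onsets_s) / dt).astype(int)
    x[idx[idx < n]] = 1.0
    freqs = np.fft.rfftfreq(n, dt)
    power = np.abs(np.fft.rfft(x - x.mean())) ** 2  # remove DC before FFT
    return freqs, power

# Fixed SOA of 32 secs vs jittered SOAs uniform on [16, 48] secs (mean 32 secs)
fixed_onsets = np.arange(0, n * dt, 32.0)
jittered_soas = rng.uniform(16, 48, size=200)
jittered_onsets = np.cumsum(jittered_soas)
jittered_onsets = jittered_onsets[jittered_onsets < n * dt]

for name, onsets in (("fixed", fixed_onsets), ("jittered", jittered_onsets)):
    freqs, power = design_power(onsets)
    # Fraction of design power a 128 sec high-pass filter would remove
    lost = power[freqs < 1.0 / 128].sum() / power.sum()
    print(f"{name}: fraction of design power below 1/128 Hz = {lost:.3f}")
```

The fixed design concentrates its power at the fundamental 1/32 Hz and its harmonics, all well above the cutoff, whereas the jittered design smears power across the spectrum, so some of it does leak below the filter. In that sense jitter helps sensitivity overall but does not make the sub-cutoff band recoverable.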
Thanks in advance for your help,