
Thanks Donald, I will try with the average ITI, although I think the average in a randomly intermixed design with five conditions may not be very informative about the relevant frequencies.
Best,
David

2015-07-06 18:09 GMT+02:00 MCLAREN, Donald <[log in to unmask]>:
You want your filter cutoff to be at least twice the period of the slowest task-related fluctuation. Generally this is computed from the interval between trials of the same type: if the average ITI between trials of the same type is 32 seconds, you'd want a cutoff of at least 64 seconds. Using 128 seconds is a safe choice for ER designs, as it's well beyond any frequencies you'd find in your data.
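That rule of thumb (cutoff ≥ twice the average same-condition ITI) can be sketched numerically. This is only an illustrative Python snippet, not SPM code; the condition names and onset times below are made up:

```python
import numpy as np

def min_cutoff_seconds(onsets_by_condition):
    """Rule of thumb from the discussion above: the high-pass cutoff
    should be at least twice the average interval between trials of
    the SAME type. Onsets are in seconds, one list per condition."""
    intervals = []
    for onsets in onsets_by_condition.values():
        onsets = np.sort(np.asarray(onsets, dtype=float))
        intervals.extend(np.diff(onsets))  # same-condition ITIs
    return 2.0 * float(np.mean(intervals))

# Hypothetical onsets (seconds) for two conditions
onsets = {"A": [0, 30, 64, 95], "B": [12, 47, 80, 110]}
print(min_cutoff_seconds(onsets))  # recommended minimum cutoff in seconds
```

With a randomly intermixed design you would pool the per-condition ITIs as above; as David notes, the average may still hide a wide spread, so checking the longest same-condition gaps is also worthwhile.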

Best Regards, Donald McLaren
=================
D.G. McLaren, Ph.D.
Research Fellow, Department of Neurology, Massachusetts General Hospital and
Harvard Medical School
Postdoctoral Research Fellow, GRECC, Bedford VA
Website: http://www.martinos.org/~mclaren
Office: (773) 406-2464
=====================
This e-mail contains CONFIDENTIAL INFORMATION which may contain PROTECTED
HEALTHCARE INFORMATION and may also be LEGALLY PRIVILEGED and which is
intended only for the use of the individual or entity named above. If the
reader of the e-mail is not the intended recipient or the employee or agent
responsible for delivering it to the intended recipient, you are hereby
notified that you are in possession of confidential and privileged
information. Any unauthorized use, disclosure, copying or the taking of any
action in reliance on the contents of this information is strictly
prohibited and may be unlawful. If you have received this e-mail
unintentionally, please immediately notify the sender via telephone at (773)
406-2464 or email.

On Mon, Jul 6, 2015 at 10:24 AM, David Pascucci <[log in to unmask]> wrote:
If I understand correctly, high-pass filtering can be useful to remove scanner drift while keeping task-related frequencies (assumed to be high in event-related and fast event-related designs). However, this filter can also help clean the data of physiological influences, am I correct?
Thus, the ideal would be to base the cutoff mainly on knowledge of the task-related frequencies...
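For what the cutoff actually does: SPM implements the high-pass filter by regressing a set of low-frequency discrete cosine (DCT) functions out of each time series, keeping only those cosines whose period exceeds the cutoff. A minimal sketch of that idea (not SPM code; the scan count, TR, and drift below are made up for illustration):

```python
import numpy as np

def dct_highpass(y, tr, cutoff):
    """High-pass filter by regressing out a low-frequency DCT basis,
    mirroring the approach SPM uses. tr and cutoff are in seconds."""
    y = np.asarray(y, dtype=float)
    n = len(y)
    # number of cosines with period longer than the cutoff
    k = int(np.floor(2.0 * n * tr / cutoff))
    if k < 1:
        return y  # cutoff longer than the run: nothing to remove
    t = np.arange(n)
    # DCT-II basis, constant term excluded (columns are zero-mean)
    X = np.array([np.cos(np.pi * (t + 0.5) * j / n)
                  for j in range(1, k + 1)]).T
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return y - X @ beta

# Example: remove a slow linear drift from a synthetic run
y = np.linspace(0.0, 10.0, 200)            # 200 scans of pure drift
clean = dct_highpass(y, tr=2.0, cutoff=128.0)
```

Lowering the cutoff from 128 s to 32 s simply adds more cosine regressors, so more low-frequency variance (drift, slow physiological noise, but potentially also slow task-related signal) gets removed.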

My starting question was whether the optimal cutoff for my data could be lower than 128 s. While comparing the results with different cutoffs (as suggested here [http://mindhive.mit.edu/node/116] for very short trials), I obtained a consistent pattern between 128 s and 32 s, but in the latter case some significant clusters seem better defined in terms of their spatial distribution and bilaterality. However, at the moment I have no reason to prefer 32 s, or some other cutoff, over the default 128 s, and that is what I am trying to work out.

If your advice is to keep 128 s (or do the phantom test) with the sole aim of removing scanner drift, then I won't pursue this further.

Thanks for your responses.
David

2015-07-06 16:26 GMT+02:00 MCLAREN, Donald <[log in to unmask]>:
Helmut,

There is physiological noise at low frequencies that is spatially correlated. Smoothing wouldn't eliminate these confounds. See http://journal.frontiersin.org/article/10.3389/fnhum.2015.00285/abstract for more details on physiological fluctuations and their impact on the BOLD signal even in smoothed data.

Best Regards, Donald McLaren

On Mon, Jul 6, 2015 at 5:51 AM, H. Nebl <[log in to unmask]> wrote:
Dear everyone,

This is a little off-topic, but when looking at the literature on high-pass filtering / detrending issues, I've noticed that several papers rely on unsmoothed data. I'm a little curious about what the findings really mean for studies based on smoothed data. For example, if there is just low-frequency noise that is not in phase across voxels, then smoothing should reduce the issue; if "the noise is in phase" (i.e. a drift), then smoothing won't help.
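Helmut's in-phase vs. out-of-phase distinction can be illustrated with a toy simulation, treating smoothing as a simple average across voxels (the voxel count, drift period, and random phases below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n_vox, n_t = 100, 200
t = np.arange(n_t)

# Case 1: low-frequency noise with a random phase in every voxel
phases = rng.uniform(0, 2 * np.pi, n_vox)
out_of_phase = np.cos(2 * np.pi * t / 100 + phases[:, None])

# Case 2: the same drift, in phase across all voxels
in_phase = np.tile(np.cos(2 * np.pi * t / 100), (n_vox, 1))

# "Smoothing" as an average over voxels
print(np.std(out_of_phase.mean(axis=0)))  # small: random phases cancel
print(np.std(in_phase.mean(axis=0)))      # ~0.707: a common drift survives
```

Averaging shrinks the out-of-phase fluctuation roughly by 1/sqrt(n_vox), while the in-phase drift passes through untouched, which is exactly why spatially correlated low-frequency noise still needs the high-pass filter.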

I just thought this was an interesting aspect, as figures sometimes illustrate that a certain set of regressors explains a lot of variance, but possibly this is variance we would get rid of through smoothing anyway.

Best

Helmut




--
Pascucci David
__________________________________________
Department of Neurological and Movement Sciences
Section of Physiology and Psychology, University of Verona
Strada Le Grazie 8, I-37134 Verona, Italy



