
Hi Jeanette,

Yes, thank you, that clarifies a lot! I was trying to figure this out while manually calculating percent signal change (on which you wrote this nice tutorial), using the featquery source code as a reference. When mean_func is not available, featquery computes it by averaging filtered_func. That seems like an odd step in a higher-level analysis, since, as you just explained, mean_func is the average of filtered_func only at the first level.

Thanks again!

Johannes


On Thu, Jeanette Mumford wrote:

Hi,

In the first level analysis the mean_func image is the mean of the functional data time series. At the second level the lower level mean_func images are averaged (after transforming them to standard space), so it would be the mean of the lower level mean_func images across the 3 runs. Due to grand mean scaling the mean_func image is typically close to 10000 (100^2) in each voxel.
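For intuition, here is a minimal numpy sketch (simulated data, not FSL code) of what grand mean scaling does at the first level: the 4D series is rescaled so its overall mean is 10000, and mean_func, the mean over time, then sits near 10000 in a typical voxel.

```python
import numpy as np

rng = np.random.default_rng(0)
# Toy 4D time series (x, y, z, t) in arbitrary scanner units
data = rng.normal(500.0, 50.0, size=(4, 4, 4, 20))

# Grand mean scaling: rescale so the overall 4D mean is 10000 (100^2)
scaled = data * (10000.0 / data.mean())

# First-level mean_func: the mean over the time axis
mean_func = scaled.mean(axis=-1)
print(round(mean_func.mean()))  # 10000
```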

The filtered_func, on the other hand, is the lower level cope that you fed into the analysis (i.e. the dependent variable).  It wouldn't be expected that the mean of the filtered_func would match the mean_func.
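A toy numpy example (simulated numbers, not FSL output) may make the mismatch concrete: time-averaging a higher-level filtered_func, which is a stack of lower-level copes of order 1000, just gives the mean cope, nowhere near the grand-mean-scaled mean_func of order 10000.

```python
import numpy as np

rng = np.random.default_rng(1)
# Higher-level filtered_func: 3 lower-level COPE images stacked as 'timepoints'
copes = rng.normal(1000.0, 100.0, size=(4, 4, 4, 3))
# mean_func: average of grand-mean-scaled lower-level mean images
mean_func = rng.normal(10000.0, 200.0, size=(4, 4, 4))

# What `fslmaths filtered_func_data -Tmean` computes at the higher level
tmean = copes.mean(axis=-1)

# ~1000 vs ~10000: roughly an order of magnitude apart, as observed
print(tmean.mean(), mean_func.mean())
```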

Hope that helps,

Jeanette

On Wed, Nov 30, 2011 at 5:37 PM, Johannes <[log in to unmask]> wrote:

    Dear FSL users,

    It is my understanding that time-averaging filtered_func_data should give you mean_func. However, when I average filtered_func_data manually using:

        fslmaths filtered_func_data -Tmean temp_mean_func

    the values in temp_mean_func are about ten times lower than the values in the mean_func that was created while running the analysis. Any idea what could cause the discrepancy?

    Both the filtered_func_data and the mean_func were created while running a higher-level analysis in which 3 runs from one subject are combined (fixed effects). So my filtered_func_data contains 3 'timepoints', all with values of at most about 1000, while the mean_func values hover around 10000 (an order of magnitude larger). From what I've read I understand that mean_func is a grand-mean-scaled intensity image that should center around 10000, so that part seems right, but why then are the filtered_func_data values so much lower? Sorry if this seems like a basic question; I've been trying to wrap my head around this, but I can't figure it out.

    Many thanks,

    Johannes