Dear FSL users,
It is my understanding that time-averaging filtered_func_data should give you mean_func. However, when I average filtered_func_data manually using

fslmaths filtered_func_data -Tmean temp_mean_func
the values in temp_mean_func are about ten times lower than the values in the mean_func that was created while running the analysis. Any idea what could cause the discrepancy?
Both the filtered_func_data and the mean_func were created while running a higher-level analysis in which 3 runs from one subject are combined (fixed effects). So my filtered_func_data contains 3 'timepoints', all with values of at most about 1000, while the mean_func data hover around 10000 (an order of magnitude larger). From what I've read, I understand that mean_func is a grand-mean-scaled intensity image that should center around 10000, so that part seems right, but why then are the filtered_func_data values so much lower? Sorry if this is a basic question; I've been trying to wrap my head around it but can't figure it out.
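(In case it's easier to discuss with concrete numbers: I assume comparing the whole-image means with fslstats -m (mean over all voxels) would show the same rough factor of ten,

fslstats temp_mean_func -m
fslstats mean_func -m

but please correct me if that's not the right way to summarise the two images.)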
Many thanks,
Johannes