Hi FSL community,
I'm trying to use randomise to run a one-sample t-test on beta-series correlation maps and am getting strange output. I've tried on two computers and with different thresholding/output options, but nothing has worked. Any advice would be appreciated!
The following commands all produced output images with min 0 and max 0, and a critical value of 0:
randomise -i <in> -o <out> -1 -T
randomise -i <in> -o <out> -1 -v 5 -T
randomise -i <in> -o <out> -m <binary mask> -1 -T
randomise -i <in> -o <out> -m <binary mask> -n 100 -1 -C .001
randomise -i <in> -o <out> -n 100 -1 -C .001
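(If it helps diagnose, I can also sanity-check the inputs with something like the following, where the placeholders stand for my actual files:

fslstats <4D input> -R
fslstats <binary mask> -V

to confirm the merged 4D file isn't all zeros and that the mask contains a reasonable number of nonzero voxels.)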
This command produced an output file with intensity 0.99 at every voxel and a huge critical value (>150,000):
randomise -i <in> -o <out> -m <binary mask> -n 100 -1 -C .001
With these commands, no output files were created and the critical value was NaN:
randomise -i <in> -o <out> -n 100 -1 -x
randomise -i <in> -o <out> -m <binary mask> -n 100 -1 -x
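For reference, the invocation I'd ultimately like to run (using the commonly recommended 5000 permutations with TFCE; filenames are placeholders) is:

randomise -i <4D input> -o <out> -m <binary mask> -1 -T -n 5000

I've been using -n 100 above only to keep the test runs fast.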
In case this is helpful:
- I made the (z-scored) correlation maps in Matlab, one per subject, and exported them as .nii files.
- I've checked the headers -- all fine.
- I aligned each image to MNI space using flirt.
- There are no NaN values in any of the images.
- I used fslmerge to combine the normalized images across subjects.
- If I just use fslmaths to average these maps, there are areas of activity even when I threshold at 0.9.
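Concretely, the preprocessing steps were roughly the following (exact filenames and paths elided):

flirt -in <subject z-map> -ref <MNI template> -applyxfm -init <subject-to-MNI .mat> -out <normalized z-map>
fslmerge -t <4D input> <normalized z-map 1> ... <normalized z-map N>
fslmaths <4D input> -Tmean <mean map>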
Thanks again,
Elizabeth