Hi Christian and CAT12 group,
I am using CAT12 (r2137) for cortical thickness analyses. Following the manual, I ran a between-group comparison (two-sample t-test) in CAT12 and applied cluster-level FWE correction in SPM12 (v7771).
1. Cluster Labeling
I tried to get all atlas labels within the significant clusters. Based on the previous discussion in
https://www.jiscmail.ac.uk/cgi-bin/wa-jisc.exe?A2=SPM;e1f98ab1.1901
I saved the thresholded SPM map via 'save -> thresholded SPM' in the SPM12 Results GUI, and then used 'Surface Overlay' in CAT12 to load the significant clusters. Clicking 'Atlas Labeling' printed some atlas labels in the MATLAB command window. May I ask whether this is the correct procedure for atlas labeling?
2. Value Extraction
Then I tried to extract each subject's cortical thickness values within the significant clusters, for use in correlation analyses with psychometric measures. The previous thread suggested that 'Plot mean data inside cluster' could extract these values. However, when I tried this option, the line "The values are available at the MATLAB command line as variable 'y'" was printed in the MATLAB command window, but the variable 'y' was actually empty. I am also curious whether 'Plot mean data inside cluster' extracts cortical thickness values from each subject, or whether it just computes the mean T-value from the loaded SPM map of the between-group comparison.
I also tried another way: first I saved a binary mask of the significant clusters in the SPM12 Results GUI, and then used 'Surface Calculator' to average each subject's cortical thickness values within the mask. The expression was 'mean(s1.*s2)', where s1 is a surface image from one subject (e.g. s15.mesh.thickness.resampled_32k.*.gii) and s2 is the SPM12-generated mask image. However, this failed with the error: "Surface 's2' (the mask gii-file) does not match previous texture (non-resampled input)!".
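In case it helps to illustrate what I am trying to achieve, here is a minimal sketch of the masking step done directly at the MATLAB command line with SPM's gifti reader (file names are hypothetical placeholders for my data, and it assumes the cluster mask has been saved/resampled onto the same 32k template mesh as the thickness files — which may be exactly the step I am missing):

```matlab
% Load one subject's resampled thickness surface and the binary
% cluster mask (both must live on the same 32k template mesh,
% otherwise the vertex counts will not match).
thick = gifti('s15.mesh.thickness.resampled_32k.subj01.gii');
mask  = gifti('cluster_mask_32k.gii');

% Mean thickness over the vertices inside the significant cluster.
idx = mask.cdata > 0;
mean_thickness = mean(thick.cdata(idx));
```

Looping this over all subjects would give one mean thickness value per subject for the correlation analysis — assuming the vertex correspondence holds.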
May I have your suggestions on extracting the cortical thickness values?
Thank you very much!
Best regards,
Haoye