Hello Peter,

Thank you very much for your reply.
I used the batch, and I have attached the jobman script and job files. Hopefully this will help in finding the location of the save function.

Regarding the DCMs: each subject has one full model with three nodes. I included one condition (word reading) and one of its parametric modulators (word length). The connections between the three nodes are set to be bilinear, with stochastic noise modeled. There are 43 subjects in total. The six days also included two rounds of estimation: the first with the preset connectivity priors, and the second with the group mean from the first estimation as priors (as in the Full + BMR PEB option).
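Roughly, the key settings and the two-pass call look like the following (just a sketch with placeholder values; the actual specification is in the attached jobman script and job files):

% Sketch of a bilinear, stochastic DCM for fMRI (placeholder values)
DCM.options.nonlinear  = 0;   % bilinear connectivity between the 3 nodes
DCM.options.two_state  = 0;
DCM.options.stochastic = 1;   % stochastic noise modeled
% Full + BMR PEB: invert each full model, then re-invert with the group
% mean from the first pass as empirical priors, roughly:
GCM = spm_dcm_peb_fit(GCM);   % GCM = cell array of the subjects' full DCMs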

In this study, subjects read texts while we recorded word fixations with an eye tracker, and I used the word fixations as stimulus onsets. During acquisition we used an acceleration factor of 6 and a TR of 400 ms. There are five runs of texts in total, so the total reading time is about 10 to 15 minutes, resulting in 1500 to 2200 volumes of data per subject.

I think the number of subjects, the amount of data, and the fact that the Full + BMR PEB option requires two rounds of estimation made the estimation quite long in the end.

Best,
Chun-Ting


On 8 February 2017 at 06:17, Zeidman, Peter <[log in to unmask]> wrote:
Dear Chun-Ting
I am sorry to hear you had this problem - especially after estimating DCMs for so many days. Did you use the batch or a script? If we can find the save function which failed, it can then be updated to:

save(filename,variables,'-v7.3');

Yes, you can make the DCM estimation work in parallel, assuming you have the Matlab parallel computing toolbox. In future, we will provide this option with a simple setting. For now, I attach a modified version of spm_dcm_fit.m. Please place this in your SPM directory (or in a directory higher up on your Matlab path than the SPM directory). By default, Matlab will parallelize over the CPUs in your computer (typically 4-8 cores).
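In case it helps to see the idea, the change essentially replaces the serial loop over subjects with a parfor loop. A minimal sketch, assuming the Parallel Computing Toolbox is available (illustrative only; the file names and variables below are placeholders, and the attached spm_dcm_fit.m contains the actual code):

% Illustrative sketch of parallelizing model inversion over subjects
P   = cellstr(spm_select(Inf,'mat','Select subject DCM files'));
GCM = cell(numel(P),1);
parfor s = 1:numel(P)                    % runs on the local parallel pool
    tmp    = load(P{s},'DCM');
    GCM{s} = spm_dcm_estimate(tmp.DCM);  % invert each subject's model
end
save('GCM.mat','GCM','-v7.3');           % force the HDF5-based MAT format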

Out of interest, what kind of DCMs are these? How many DCMs do you have per subject? And which estimation option are you using? Six days is an unusually long time and I want to confirm that there isn't a quicker way to estimate your models.

Best
Peter

-----Original Message-----
From: SPM (Statistical Parametric Mapping) [mailto:[log in to unmask]] On Behalf Of Chun-Ting Hsu
Sent: 06 February 2017 17:55
To: [log in to unmask]
Subject: [SPM] DCM Estimation - GCM.mat cannot be saved; Parallelization

Dear SPM users,

I ran a group DCM estimation using the PEB-BMR option, which should result in a GCM.mat file containing the DCM structures of all subjects. I am using SPM12 (v6906) with Matlab 2016a on an HPC.
In the spm_defaults.m file, I have already changed the default MAT-file format:
defaults.mat.format     = '-v7.3';
However, during this estimation the following message appeared, and GCM.mat was not saved after six days of estimation.

[Warning: Variable 'GCM' cannot be saved to a MAT-file whose version is older than 7.3.
To save this variable, use the -v7.3 switch.
Skipping...]
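I assume that, as long as the estimated DCMs are still available in the workspace, the group array could be re-saved manually by forcing the switch, e.g.:

% assumed workaround: re-save the cell array of estimated DCMs
save(fullfile(pwd,'GCM.mat'), 'GCM', '-v7.3');

but I would still like to understand why the default was not used.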

Would you be so kind as to point me to where the code that sets the format of the GCM.mat file is, and to explain why GCM.mat was not saved with the -v7.3 format set in the SPM defaults file?

I am also wondering whether it is possible to parallelize the group DCM estimation process. Currently it is done subject by subject, sequentially, for the full model. I think it should be possible to run the estimation of the full model for all subjects in parallel, taking advantage of the HPC, but I am not sure where the code for the loop over subjects/models is.

Thank you very much.

Best regards,
Chun-Ting Hsu
Postdoctoral Scholar
Brain, Language, and Computation Lab
Department of Psychology
Pennsylvania State University



--
Chun-Ting Hsu, M.D., Ph.D.
Postdoctoral Scholar

Brain, Language, and Computation Lab
Department of Psychology
Pennsylvania State University
460 Moore Building, University Park, PA 16802
Phone (O): +1 814 867 3652