Dear Brian,
I think it would be good to try to diagnose why your model estimation isn't getting off the ground before turning to stochastic DCM. Some questions:
- What kind of task are your participants performing, and what's the experimental design? Do you think the neural activity you observe is caused by your experimental manipulations, or by endogenous activity? These considerations will have a big impact on the success of your models. E.g. if you had an autobiographical memory recall task over many seconds, it might be fair to argue that most activity is caused endogenously rather than by your cue, and thus stochastic DCM would be favourable.
- You say you get robust 1st level main effects. How are you defining your ROIs? Based on single-subject activation clusters? Or anatomically?
- I'm not sure whether you'll get an advantage from non-linear DCM - it depends on whether you hypothesise that activity in one region modulates a connection between others. Give it a try if you think it makes sense. You could also try two-state DCM, which has richer dynamics and so might stand a better chance of fitting your data.
Best,
Peter.
Peter Zeidman, PhD
Methods Group
Wellcome Trust Centre for Neuroimaging
12 Queen Square
London WC1N 3BG
[log in to unmask]
-----Original Message-----
From: SPM (Statistical Parametric Mapping) [mailto:[log in to unmask]] On Behalf Of Brian Haagensen
Sent: 31 May 2014 14:04
To: [log in to unmask]
Subject: [SPM] stochastic DCM
Dear DCM experts,
we apply bilinear DCM to event-related fMRI data, but often see that the model flat-lines, even with tighter hyperpriors than the defaults. This happens even when the inputs produce quite robust 1st level main effects.
What is the opinion among you on using stochastic DCMs in a case like this, i.e. where we also have known inputs to the system? My own view is that if we're interested in inference on model space, this approach would be OK, but I'm more doubtful about inference on model parameters: because noise explains so much of the data, the posterior parameter estimates are very small and, in our case, typically have posterior probabilities around 0.5. So I guess statistics and correlations on these would be problematic?
If we're interested in the parameters, a preferable option might be to use the nonlinear integration scheme with its shorter time step, because our modulatory input (multiplied on B) is event-related, more like neural activity multiplied on D. Do you have any thoughts on this?
Thanks for your time!
Best regards, Brian Haagensen