Dear Jeremy,

On Fri, Apr 16, 2010 at 10:48 PM, Jeremy Nelson <[log in to unmask]> wrote:
> Hi Vladimir,
>
>> I'm not familiar with this error, but it might have something to do
>> with version clashes. Could you just download SPM8 again from the
>> website (the version you get is already up-to-date, so you don't need
>> the updates), make sure you have the default path with just the main
>> SPM folder added to it, and if you still get the problem, let me know?
>
> I redownloaded SPM8 and did not apply any of the updates.  After
> attempting to run the DCM_ERP_example.m (it appears to now be called
> DCM_ERP_subject1.m), this is the error message I received:
>
> ??? Error using ==> read_vol at 37
> file 'D:\home\wpenny\spm8\canonical\single_subj_T1_EEG_BEM.mat' does not
> exist
>
> Error in ==> fileio_read_vol at 11
> [varargout{1:nargout}] = funhandle(varargin{:});
>
> Error in ==> spm_eeg_lgainmat at 84
>            vol = fileio_read_vol(vol);
>
> Error in ==> spm_dcm_erp_dipfit at 152
>        [L D]   = spm_eeg_lgainmat(D, [], D.chanlabels(DCM.xY.Ic));
>
> Error in ==> DCM_ERP_subject1 at 55
> DCM = spm_dcm_erp_dipfit(DCM);
>

OK, that's much clearer now. What you can do is press '3D source
reconstruction', load the example file, delete the source
reconstruction data that is there, and save. Then, when you load the
file into DCM, you'll be asked at some point to specify the head model
parameters from scratch. That should solve this problem.

>>> On a related note, I have not had success getting SPM to read my
>>> Neuroscan .AVG files - but that's not a huge deal, as I needed to
>>> stack and transpose my matrices into a single file, so I converted
>>> them all to .DAT.  However, when I did this I had to strip the header
>>> data from the .DAT files so MATLAB would read them.  Now I currently
>>> have a single .DAT file for each subject that is 64 (electrode sites)
>>> X 6001 (timepoints) X 3 (experiment conditions).  My questions are: is
>>> this the appropriate matrix type I need for DCM? And, what metadata do
>>> I need for each file to load it directly into DCM (by-passing all the
>>> SPM preprocessing)?  I was hoping to get this from the DCM_ERP_example
>>> (using mafdeMspm8_subject1.dat/mat as guides), but since those files
>>> do not seem to be working, I am hesitant to do so.
>>
>> This is a slightly complicated way to do things, and I'm afraid
>> you'll have to do a little programming to make it work. You can use the
>> attached example script and replace the random data there with your
>> data. If you can't make it work we'll help you when you come to the
>> course.
>
> Luckily, I've found someone on campus who is competent with MATLAB
> coding who is able to help me if needed.  I was able to get the script
> you sent me to work, except that I'm not sure what it's doing to my
> data exactly.  I had three experimental conditions: pre-TMS, post 1
> TMS, post 2 TMS.  I am trying to combine the auditory ERPs from these
> three conditions into a single file for DCM (to see the
> connectivity changes due to TMS).  So, my raw data going into the
> spm_eeg_convert_arbitrary_data script is a single matrix with the
> individual's pre, post 1, and post 2 ERPs [64 (electrode sites) X 6001
> (timepoints) X 3 (experiment conditions - pre, post 1, post 2)].  The
> matrix contains, essentially, 1 trial (for each experiment condition).
>  The header data from the .AVG file does have the number of accepted
> trials that went into creating the ERP, but I guess I'm not sure why
> that is necessary for DCM and how I would specify the different number
> of accepted trials for each condition.  Does that make sense?  The
> MMN paradigm example is different from our design because within a
> single block you can have labels for "standard" and "deviant".  I
> would like to take my pre, post 1, and post 2 blocks and combine them
> into a single "block" with the labels "pre", "post 1", and "post 2".
> Sorry I'm so confused as to how to make this happen.  If the script
> you sent me said something like "Nconditions = 3" instead of "Ntrials
> = 50;", I would be much less confused.  After running the script,
> in which I kept Ntrials at 50, my matrix is now 64x6001x50.  If the ERP
> itself cannot be fed into DCM, then using trials makes more sense -
> but it also means Neuroscan .AVG files do not work (at least for DCM).
>
> I hope that makes sense.
>

You should not put all the ERPs from all subjects in the same dataset.
You can either compute the grand average and run DCM on that, or run
DCM on each subject individually and then do statistics on the
parameters across subjects. Look at the published DCM papers of Marta
Garrido et al. for some examples. In any case, what you should input
to DCM are files with 3 trials, one per condition. DCM does not need
the number of repetitions that went into each average. You should
specify the condition labels by changing the line:

D = conditions(D, 1:Ntrials, 'Condition 1');  % Sets the condition label

in the script into something like:

D = conditions(D, 1, 'Pre');
D = conditions(D, 2, 'Post 1');
D = conditions(D, 3, 'Post 2');
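Equivalently (just a sketch, assuming the conversion script has built
the meeg object D with Ntrials set to 3, one trial per condition, and
that your labels are exactly 'Pre', 'Post 1', 'Post 2'), the labels
could be set in a loop:

```matlab
% Sketch: assumes D is the meeg object created by the conversion
% script, containing 3 trials (one per experimental condition).
labels = {'Pre', 'Post 1', 'Post 2'};
for i = 1:numel(labels)
    D = conditions(D, i, labels{i});  % label the i-th trial
end
save(D);  % write the updated header back to disk
```

These labels are what you will then select when specifying the
between-trial effects in the DCM GUI.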

> Thanks again for the help - I really do appreciate it!  Hopefully I
> can get this project working and possibly attend the next short course
> that is offered (in October?).

We have not decided yet whether to offer the second course in October
or next May. I'd suggest you contact Jean Reynolds
([log in to unmask]) and ask her to put you on the waiting list.
That will give you priority when registration for the next course
opens, and it will also give us a better idea of the level of demand.

Vladimir