Dear Oliver:

I have not tried to analyze any design that large using PPI. My guess is
that it is failing at the call to spm_PEB, but let me know if that's not
the case.

I can think of two ways to do this, though I have not tried them myself
(well, I tried the second, but it's harder to do, and it was for DCM rather
than PPI).

1) Calculate the PPI vectors (Y, P, and PPI) for each session, then include
each of the regressors in one design. You don't necessarily need to
concatenate them. If you do, then include block regressors for each session
(columns of ones), defined as "user defined" covariates. There isn't really
any specific code for concatenating the regressors; if you have 3 PPIs,
just load each one into a variable:

tmp1 = load('myPPI-1.mat');
tmp2 = load('myPPI-2.mat');
tmp3 = load('myPPI-3.mat');

If you really want to concatenate them, then:

bigPPI.Y   = [tmp1.PPI.Y;   tmp2.PPI.Y;   tmp3.PPI.Y];
bigPPI.P   = [tmp1.PPI.P;   tmp2.PPI.P;   tmp3.PPI.P];
bigPPI.ppi = [tmp1.PPI.ppi; tmp2.PPI.ppi; tmp3.PPI.ppi];

Then use these regressors in your model with the appropriate block effects,
or include them as separate sessions in the same model (in which case no
concatenation is needed).
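To make the block regressors concrete: once the three sessions are loaded
into tmp1, tmp2 and tmp3 as above, one column of ones per session can be
built in a few lines. This is just a sketch using standard Matlab (nothing
SPM-specific), and it assumes each session's scan count can be read off the
length of its Y vector:

```matlab
% Number of scans in each session, taken from the length of each Y vector
n1 = length(tmp1.PPI.Y);
n2 = length(tmp2.PPI.Y);
n3 = length(tmp3.PPI.Y);

% One column of ones per session, zeros elsewhere; these are the
% "user defined" block covariates for the concatenated design
B = blkdiag(ones(n1,1), ones(n2,1), ones(n3,1));
```

The concatenated PPI, Y and P vectors, together with the columns of B, can
then all be entered as user-specified covariates in the single concatenated
model.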

2) The other way is more difficult. At one point, before DCM estimation was
modified to use logdet, it used spm_PEB and would run out of memory with
large designs. I wrote some code to store the partially calculated PQ
matrices on disk and load them as needed. This works pretty well and
allowed us to calculate a 7-region DCM with 1300 scans per region.
Fortunately this is no longer necessary for DCM. Anyway, if there is some
reason to calculate all the scans at once, you can download the code from
the CBMG tools site (see the SPM2 extensions).

Regards,
Darren

At 11:24 AM 5/2/2005, Oliver Gruber wrote:
>Dear Dr. Gitelman,
>
>I just tried to perform a PPI analysis on an event-related fMRI data set
>using the SPM2 interface (on Windows platform) and your PPI code. In
>general, I got the impression that this analysis takes a lot of processing
>time even with smaller data sets. I now wanted to analyse a data set of
>over 2000 image volumes, but I always got the error message "out of
>memory" (well-known from similar problems with categorial analyses many
>years ago...). Our systems admin tried to fix the problem but gave up in
>the meantime. So, I wondered whether you have experience with this
>specific problem and whether you could make a suggestion to solve it.
>Since the analysis still works with 1 or 2 of the three sessions of the
>experiment (700-1400 volumes), I wonder whether it would be possible to
>simply concatenate the regressors (Y, P, PPI) of the single sessions using
>the matlab code, and to perform the statistical analysis using the
>concatenated regressors. However, because I am not familiar with this code
>I would be very happy if you - in case this could work - could give me the
>"cooking recipe" (the matlab commands) to perform such a concatenation
>(assuming that this procedure will not confound the analysis).
>
>Many thanks in advance and best regards,
>Oliver Gruber
>
>--
>--------------------------------------------------------------------------------
>Prof. Dr. Oliver Gruber, MD
>Professor of Cognitive Neuroscience in Psychiatry, Saarland University
>Psychiatrist and Psychotherapist, Dept. of Psychiatry and Psychotherapy,
>Saarland University Hospital
>
>Address:
>Neuroimaging Laboratory
>Department of Psychiatry and Psychotherapy
>Saarland University Hospital
>D-66421 Homburg (Saar)
>Germany
>
>Tel.: +49 6841 162 4245
>Fax:  +49 6841 162 4270
>Email: [log in to unmask]
>Homepage:
>http://wwwalt.uniklinik-saarland.de/psychiatrie/sub3/imaging/imaging.html
>--------------------------------------------------------------------------------
>