Dear Ferenc,
you cunningly fooled me into reading this SPM
question by putting "DCM" in the subject line... ;-)
I think your solution is fine. It is basically
the same as downsampling your measured data. As
long as you preserve the temporal order and
adjust the TR in the model specification, it should be OK.
Best wishes,
Klaas
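Ferenc's proposal (quoted below) amounts to keeping every k-th volume in temporal order and scaling the nominal TR by k in the model specification. A minimal sketch of the subsampling step, with hypothetical filenames standing in for the actual image files:

```python
# Sketch: subsample an fMRI run by keeping every k-th volume.
# The filename pattern is an assumption for illustration only.
volumes = [f"fM{i:05d}.img" for i in range(1, 1161)]  # 1160 volumes, TR = 2 s

def subsample(vols, k):
    """Keep every k-th volume, preserving temporal order."""
    return vols[::k]

half = subsample(volumes, 2)     # 580 volumes, effective TR = 4 s
quarter = subsample(volumes, 4)  # 290 volumes, effective TR = 8 s
print(len(half), len(quarter))
```

The effective TR entered in the design matrix must match the subsampling factor (2 s × k), and the selection must start from the first scan so that stimulus onsets still line up with the retained volumes.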
At 17:08 31/07/2006, you wrote:
>Hello,
>a common problem with an FFX analysis in SPM5 is
>the 'out of memory' error when using large
>datasets for a subject. In my case I failed to
>run a 6-subject FFX analysis with 1160 volumes
>per subject, 34 slices at 3x3 mm, nearly whole
>brain, TR = 2 s. There seems to be no way around
>the 'out of memory' problem, despite the fact that
>the workstation I am using has 3 GB of RAM.
>
>So, my question now is:
>Is it possible and valid to 'fool' SPM5 by
>creating a new dataset that contains every
>second volume, so that the total number of volumes
>is reduced to 580, and then creating a design matrix
>that specifies a TR of 4 seconds? In
>my case I could even use every 4th scan
>without violating my block design, so the total
>number of scans would be 290 per person and the
>TR would increase to 8 seconds.
>
>Thank you,
>Ferenc
>
>
>
>------------------------------------------------------------
>Ferenc Acs
>Lehrstuhl Prof. Dr. M. W. Greenlee
>Institut für Psychologie
>Universität Regensburg
>93040 Regensburg
>Tel. +49 (0)941 943 3582
>Fax +49 (0)941 943 3233
>http://www.psychologie.uni-regensburg.de/Greenlee/team/Acs/acs.html