Hello.
When I try to run a full first-level analysis for one of my data sets, I get
the following message:
Error: FILM did not complete - it probably ran out of memory
This is confusing to me, since I'm able to run another data set with no
memory problems on the same server.
Any ideas? Is there some way I can set up the analysis to reduce the memory
required? Also, the data set that goes through fine has 4 runs of about
9 minutes each, whereas the data set I'm having problems with has 6 runs of
about 6 minutes each. Both data sets have the same number of slices and the
same slice thickness.
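For what it's worth, here's a rough back-of-envelope estimate of the per-run
data size, just to show why I wouldn't expect the failing set to need more
memory. I'm assuming a TR of 2 s and a 64x64x30 acquisition matrix purely for
illustration, since I haven't listed the actual values above:

    # Rough per-run size estimate; TR and matrix dimensions below are
    # placeholder assumptions, not the real acquisition parameters.
    def run_size_mb(minutes, tr_s=2.0, nx=64, ny=64, nz=30, bytes_per_voxel=4):
        """Approximate in-memory size of one 4D run as float32, in MB."""
        volumes = int(minutes * 60 / tr_s)
        return nx * ny * nz * volumes * bytes_per_voxel / 1e6

    print(run_size_mb(9))  # dataset that works: ~133 MB per run
    print(run_size_mb(6))  # dataset that fails: ~88 MB per run

So each run in the problem data set is actually smaller, and the total scan
time (about 36 minutes) is the same for both.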
Thanks,
Dharol