Hello, Satoru:
Thanks for the reply. I will try down-sampling. I am wondering whether
there is a way to calculate how much memory is needed based on the image
resolution, so that I could upgrade the system to meet that need. Thanks a
lot!
Xue
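
[A minimal back-of-the-envelope sketch of the kind of estimate asked about
above, assuming the volumetric mode keeps every subject's volume in memory
as double-precision (8-byte) floats. The helper name data_matrix_bytes is
purely illustrative, not an SnPM function, and the real requirement depends
on how many working copies SnPM holds during the permutations, so treat
this as a lower bound rather than an exact figure.]

    # Rough footprint of one in-memory copy of the data matrix (illustrative only).
    def data_matrix_bytes(dims, n_subjects, bytes_per_voxel=8):
        voxels = 1
        for d in dims:
            voxels *= d
        return voxels * n_subjects * bytes_per_voxel

    full = data_matrix_bytes((199, 199, 199), 13)   # original resolution
    half = data_matrix_bytes((100, 100, 100), 13)   # voxel size roughly doubled
    print("full resolution: %.2f GiB per copy" % (full / 2.0**30))  # ~0.76 GiB
    print("down-sampled:    %.2f GiB per copy" % (half / 2.0**30))  # ~0.10 GiB

[Doubling the voxel size cuts each in-memory copy by roughly a factor of
eight, which is why down-sampling is usually the quickest way to get under
a given RAM limit.]
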
Quoting Satoru Hayasaka <[log in to unmask]>:
> Dear Xue and SnPMers,
>
> At 04:32 PM 7/25/2005 -0700, HUA,XUE wrote:
> >I posted a previous question regarding the memory problem I encountered
> >while using SnPM. The problem was solved by choosing the slice by slice
> >calculating mode. However, there is no similar option if you want to do
> >the variance smoothing (pseudo-t). I have 13 subjects and each scan has
> >the dimension of 199 x 199 x 199. I want to see how much memory is
> >needed for this calculation. And is there a good way to get around this
> >memory issue?
>
> If you are doing a pseudo-t analysis, then unfortunately there aren't
> good solutions to this problem. You could somehow reduce the amount of
> data (re-slicing / down-sampling to a larger voxel size, throwing away
> non-brain slices, etc.) so that your computer can handle the data in the
> volumetric mode.
>
> -Satoru
>
>
> Satoru Hayasaka ==============================================
> Post-Doctoral Fellow, MR Unit, UCSF / VA Medical Center
> Email: shayasak_at_itsa_dot_ucsf_dot_edu   Phone: (415) 221-4810 x4237
> Homepage: http://www.umich.edu/~hayasaka
> ==============================================================
>