Stephen, List,

Attached are two plots that show my smoothing times on a Windows XP PC
writing to a local disk, for both SPM2 and the most recent release of SPM5.
I'm running MATLAB 7.5.  I see similar problems on 64-bit Linux for both
SPM2 and SPM5.  The timing is even worse in a directory that already
contains 460 pairs each of f*, af*, and waf* files before smoothing, with
the swaf* output going by default into the same directory.

These plots show the results of calls to the smoothing function on the same
data set but with different numbers of files in the input/output directory.
MATLAB was stopped and restarted before each timed call to the smoothing
function.  Maybe it's a MATLAB 7.5 issue, since I'm seeing it on two
different hardware/software configurations for both SPM2 and SPM5.
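
For the record, each timed run looked roughly like this (a sketch: the
waf* pattern and the 8 mm kernel are just examples, and spm_select is the
SPM5 file picker; SPM2 used spm_get):

  % Time smoothing of every waf* volume in the current directory.
  P = spm_select('FPList', pwd, '^waf.*\.img$');   % input volumes
  tic
  for i = 1:size(P,1)
      [pth, nm, ext] = fileparts(deblank(P(i,:)));
      spm_smooth(deblank(P(i,:)), fullfile(pth, ['s' nm ext]), [8 8 8]);
  end
  t = toc;
  fprintf('%.1f s total, %.3f s per volume\n', t, t/size(P,1));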

Kathy Pearson
UAB Psychology Dept.

-----Original Message-----
From: SPM (Statistical Parametric Mapping) [mailto:[log in to unmask]] On
Behalf Of Stephen J. Fromm
Sent: Wednesday, April 23, 2008 9:01 AM
To: [log in to unmask]
Subject: Re: [SPM] slow smoothing

On Tue, 22 Apr 2008 16:42:46 +0100, Kathy Pearson <[log in to unmask]> wrote:

>I have resolved the problem of slow smoothing locally at the command line
>by calling Linux to create a new subdirectory each time to receive the
>smoothed output files.  Smoothing then takes 0.12 seconds per volume, or
>about one minute in total for 460 input volumes.  It takes an additional
>10 seconds for a Linux call to move the generated swaf* files into the
>same directory with the waf* files.
>
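>In script form, the workaround is roughly as follows (a sketch: the
>directory name and kernel are placeholders, and I shell out for the
>directory operations, though MATLAB's own mkdir/movefile should work the
>same way):
>
>  outdir = fullfile(pwd, 'swaf_tmp');                % fresh directory
>  system(['mkdir ' outdir]);
>  P = spm_select('FPList', pwd, '^waf.*\.img$');     % SPM5; spm_get in SPM2
>  for i = 1:size(P,1)
>      [pth, nm, ext] = fileparts(deblank(P(i,:)));
>      spm_smooth(deblank(P(i,:)), fullfile(outdir, ['s' nm ext]), [8 8 8]);
>  end
>  system(['mv ' fullfile(outdir, 'swaf*') ' ' pwd]); % back with the waf* files
>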
>Profiling the smoothing function reveals that much of its time is spent in
>the MATLAB exist function, so I'm not sure what a general solution across
>platforms might be for faster I/O.
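>
>For reference, the profiling was nothing exotic, just the standard MATLAB
>profiler wrapped around the smoothing call:
>
>  profile on
>  % ... run the smoothing function over the 460 volumes ...
>  profile viewer   % exist() dominates the self-time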

I'm not a high-level Linux guru, but I've played with Linux/Unix for some 
time now, and the idea that the speed depends on how many files are in the 
directory sounds very strange to me, unless there's something weird going 
on with your hard disk.

I often see things proceed more quickly at the beginning of a calculation,
and then a slowdown (I haven't plotted it, but I assume it's reaching
some kind of steady-state rate asymptotically).  I've always assumed it's
a virtual memory thing.  Something like pages getting sucked into RAM at
first, but then, in the steady state, pages having to be swapped both in
and out of RAM.

Again, I don't know enough to say for sure that it shouldn't depend on the 
number of files in the directory, but it sounds very strange to me.

Cheers

>
>Kathy Pearson
>UAB Psychology Dept.