Hi,
I am trying to smooth a big 4D .nii file (7.5 GB, 256x176x200x446). The
smoothing function gives me this error:
-------------------------------------------------------------------------------------
MATLAB Version 7.7.0.471 (R2008b)
MATLAB License Number: 129249
Operating System: Linux 2.6.31-18-generic #55-Ubuntu SMP Fri Jan 8
14:54:52 UTC 2010 x86_64
Java VM Version: Java 1.6.0_04 with Sun Microsystems Inc. Java
HotSpot(TM) 64-Bit Server VM mixed mode
-------------------------------------------------------------------------------------
MATLAB Version 7.7 (R2008b)
Genetic Algorithm and Direct Search Toolbox Version 2.4 (R2008b)
Image Processing Toolbox Version 6.2 (R2008b)
MATLAB Compiler Version 4.9 (R2008b)
Neural Network Toolbox Version 6.0.1 (R2008b)
Optimization Toolbox Version 4.1 (R2008b)
Parallel Computing Toolbox Version 4.0 (R2008b)
Signal Processing Toolbox Version 6.10 (R2008b)
Statistical Parametric Mapping Version 3684 (SPM8)
Statistics Toolbox Version 7.0 (R2008b)
Wavelet Toolbox Version 4.3 (R2008b)
SPM version: SPM8
SPM path: /home/filo/opt/spm8/spm.m
------------------------------------------------------------------------
Running job #1
------------------------------------------------------------------------
Running 'Smooth'
Failed 'Smooth'
Error using ==> spm_conv_vol
File too small.
In file "/home/filo/opt/spm8/spm_smooth.m" (v2794), function "smooth1" at line 105.
In file "/home/filo/opt/spm8/spm_smooth.m" (v2794), function "spm_smooth" at line 37.
In file "/home/filo/opt/spm8/config/spm_run_smooth.m" (v3534), function "spm_run_smooth" at line 20.
I have further traced the error message to get_map_file in
spm_mapping.c. Before crashing, the Smooth job creates a file
containing the first 120 smoothed volumes (about 2 GB in size). I have
ruled out memory allocation problems by running it on a 32 GB machine.
It's a 64-bit Linux machine with SPM8 r3684. Does anyone have a clue
what might be wrong?
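For reference, a quick back-of-the-envelope check (a sketch only; it assumes the data are stored as 16-bit integers, 2 bytes per voxel, which is the assumption that matches the reported 7.5 GB file size) shows that the crash point lines up suspiciously well with a 2 GiB (2^31-byte) offset:

```python
# Sanity check: does smoothing stalling around volume 120 line up with a
# 2 GiB file-offset limit?
# Assumption: int16 storage (2 bytes/voxel) -- consistent with the 7.5 GB file.
nx, ny, nz, nt = 256, 176, 200, 446
bytes_per_voxel = 2  # assumption, not stated in the original post

vol_bytes = nx * ny * nz * bytes_per_voxel      # bytes per 3D volume
total_bytes = vol_bytes * nt                    # whole 4D file
print(f"total file size: {total_bytes / 2**30:.2f} GiB")  # ~7.49 GiB

# How many whole volumes fit below a signed 32-bit offset (2 GiB)?
limit = 2**31
print(f"volumes below 2 GiB: {limit // vol_bytes}")       # 119
```

Only ~119 full volumes fit below 2^31 bytes, so a truncated output of "the first 120 volumes, about 2 GB" would be consistent with a signed 32-bit file-offset overflow somewhere in the mapping code; that is just a hypothesis on my part, though.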
Best regards,
Chris