Hi Matthew,
Thank you so much for your explanation. I'm calculating the size of my data based on the dimensions given in the dcm2niix output. For example, the particular BOLD run I'm looking at for a functional task is (64x64x40x192). I'm using the formula you provided to calculate the total bytes. What do the numbers within the parentheses mean? I should be able to reference the .json file and compare those values with the acquisition parameters to confirm that the file is the size it should be, is that right?
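For what it's worth, here is the quick sanity check I am doing, as a small Python sketch. It assumes an uncompressed NIfTI-1 file with a 352-byte header and a 16-bit (2 bytes per voxel) data type; the actual bytes-per-voxel should be confirmed against the bitpix field in the header, and the dimensions against dim[] or the .json sidecar.

```python
# Sketch: estimate the expected on-disk size of an uncompressed NIfTI-1 file
# from its image dimensions. Assumes a 352-byte NIfTI-1 header and 2 bytes
# per voxel (16-bit data) -- verify bitpix in the real header before relying
# on this.
def expected_nifti_bytes(x, y, z, t, bytes_per_voxel=2, header_bytes=352):
    return header_bytes + x * y * z * t * bytes_per_voxel

# The BOLD run mentioned above: 64 x 64 matrix, 40 slices, 192 volumes
print(expected_nifti_bytes(64, 64, 40, 192))  # -> 62914912
```

(For a .nii.gz file the on-disk size will of course be smaller, since the data are compressed; the formula applies to the uncompressed stream.)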
I had initial problems with these data because an exporting issue meant that not all of the volumes made it into the exported DICOMs. I re-exported the problematic files and, from what I can tell, we now have the full number of volumes. I re-ran the first-level FEAT analysis and can no longer reproduce the byte-size error I was receiving. Now, according to the log, the error occurs during preprocessing stage 2 and reads:
FATAL ERROR ENCOUNTERED:
COMMAND:
/usr/local/fsl/bin/susan prefiltered_func_data_thresh 0.0 2.12314225053 3 1 1 mean_func 0.0 prefiltered_func_data_smooth
ERROR MESSAGE:
child killed: segmentation violation
END OF ERROR MESSAGE
child killed: segmentation violation
while executing
"if { [ catch {
for { set argindex 1 } { $argindex < $argc } { incr argindex 1 } {
switch -- [ lindex $argv $argindex ] {
-I {
incr arginde..."
(file "/usr/local/fsl/bin/feat" line 312)
Error encountered while running in main feat script, halting.
child killed: segmentation violation
Does this error message provide any further insight into where the problem may be? Thank you so much for your help.
Diane