Hi FSL users,
I am attempting to run a very large command-line MELODIC analysis across 192 'subjects' - actually 48 subjects with 4 conditions - (186 volumes each at 2mm resolution), but I consistently get a terminate message with std::bad_alloc. I have tried modifying the file order (as a test to see whether any files were corrupted) and checked with the system administrators that we have sufficient memory resources, but the error persists.
Each of the 192 inputs I am currently using is the reg_standard/filtered_func_data image generated from another, partially successful GUI run. With these inputs the command-line MELODIC runs to the same point each time (the 158th input).
We are running with 65GB of memory and have the same amount of swap space (I don't, however, think swap is being utilised much). The analysis drops out (with the std::bad_alloc error) when it reaches ~30GB of memory usage.
I thought it might be a per-user process limit (or an open-file limit), but according to the admins this isn't the problem either.
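For reference, this is roughly how I sanity-checked the per-process limits myself in the shell session that launches MELODIC (assuming a bash environment; the admins may well have checked the same things):

```shell
# Show all per-process resource limits for the current shell
ulimit -a

# The two limits most likely to be relevant here:
ulimit -v   # maximum virtual memory (kbytes, or "unlimited")
ulimit -n   # maximum number of open file descriptors
```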
In my last attempt this afternoon I tried adding the --debug option (intending to send the output to the FSL group), but it didn't produce any files, which makes me think I didn't specify it correctly (or does the debug output go to the command window?).
Is it possible that this is simply too big to run as one analysis? Or, alternatively, could such an error result from corrupted files that I am somehow not detecting (the later files seemed to work in a smaller analysis)?
Any advice would be greatly appreciated.