We're considering a solution along these lines. The problem is the
user-education part: in most cases, if the user has to actually do
something, it doesn't get done.
> This is a problem that I have seen many times, especially with
> DMF-based backup systems...
>
> In any case, the most common problem we have is with DICOM
> directories; the best long-term solution we have found is .tar.gz.
> Not the most elegant, and sometimes a pain, as it requires a certain
> amount of user education. But it certainly works, especially when
> recalling a single .tar.gz file from a DMF as opposed to 1500 of
> them for one analysis.
>
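The .tar.gz approach described above can be sketched roughly as follows. The directory name `subj01.feat` is a hypothetical stand-in for a real analysis directory, and the stand-in files are just placeholders:

```shell
# Stand-in for a real FEAT output tree (names are hypothetical).
mkdir -p subj01.feat/stats
touch subj01.feat/design.fsf subj01.feat/stats/zstat1.nii.gz

# Pack the whole tree into one compressed archive, so the DMF has a
# single file to migrate and recall instead of hundreds.
tar -czf subj01.feat.tar.gz subj01.feat

# Only after the archive exists, drop the original tree.
rm -rf subj01.feat

# List the archive contents to confirm everything was captured.
tar -tzf subj01.feat.tar.gz
```

The user-education burden is the `tar -xzf subj01.feat.tar.gz` step needed before the data can be worked on again, which is exactly the "user has to actually do something" problem noted above.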
After some investigation, it's definitely the feat directories
producing the high ratio of file count to megabytes. How can I
determine which files FEAT creates that can be safely deleted and/or
recreated without rerunning the entire analysis?
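As a first step, it may help to confirm which directories are actually responsible for the file-count blowup before deciding what to prune. A minimal sketch, using a hypothetical directory layout (`example/`, `subj01.feat`, `rawdata` are stand-ins):

```shell
# Build a small stand-in layout: one many-small-files analysis
# directory and one few-large-files data directory.
mkdir -p example/subj01.feat example/rawdata
touch example/subj01.feat/a example/subj01.feat/b example/subj01.feat/c
touch example/rawdata/big.nii.gz

# Print "file-count  directory" for each top-level directory,
# worst offenders first; the .feat trees should float to the top.
for d in example/*/; do
    printf '%s %s\n' "$(find "$d" -type f | wc -l)" "$d"
done | sort -rn
```

Pairing this with `du -sh example/*/` would give the other half of the ratio (size per directory), making it easy to spot trees that are large in inodes but small in bytes.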