Morning,
Our backup server is limited by two factors:
-) Storage capacity (of course)
-) File counts
The system is database driven: each file receives a record. As the
ratio of file count to GB increases, the performance of the server
decreases. As such, a ratio of 1 million files per 1TB has been
determined as the maximum allowable for smooth backup operation.
We are currently backing up 2TB of data consisting of 5.1 million
files. As you can see, this is more than 2.5 times the maximum
generally allowed. The vast majority of this data is generated by FSL.
Questions:
1. Has anyone else had to address this issue? If so, what was your
solution?
2. I have heard that there are certain files that can safely be
deleted without drastically increasing subsequent analysis times. How
can I find out what these files are?
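In case it helps anyone characterize the same problem: before deleting anything, I have found it useful to see which directories hold the bulk of the files. A rough sketch (again, /data/backup is a placeholder):

```shell
# Rank directories by the number of files they directly contain,
# to spot where the millions of small files are concentrated.
find /data/backup -type f \
  | sed 's|/[^/]*$||' \
  | sort | uniq -c | sort -rn | head -20
```
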
Thank you for any input,
James A. Kyle