I am assembling a specification for a compute node for FSL processing.
Our researchers will be computing averages across perhaps 12-20 subjects,
and DTI analysis is also desired. How much memory should we be purchasing per
core? Our current processing indicates that 2 GB is not adequate, but
that 4 GB probably is. Given the nature of such work, I am thinking that
6 GB per core should see us into the mid-term future. Is this overkill?
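For what it's worth, one way to ground the per-core figure is to measure the peak resident memory of your current runs rather than guessing. A minimal sketch (the demo command below is only a stand-in; substitute whichever stage of your pipeline is heaviest, e.g. your actual FSL invocation):

```python
import resource
import subprocess

def peak_child_rss_mb(cmd):
    """Run cmd to completion and return the peak resident set size (MB)
    accumulated by child processes of this interpreter."""
    subprocess.run(cmd, check=True)
    usage = resource.getrusage(resource.RUSAGE_CHILDREN)
    # On Linux, ru_maxrss is reported in kilobytes.
    return usage.ru_maxrss / 1024

if __name__ == "__main__":
    # Placeholder workload: allocate and touch ~200 MB.
    # Replace with your real pipeline stage, e.g. an fslmaths or dtifit call.
    mb = peak_child_rss_mb(
        ["python3", "-c", "x = bytearray(200 * 1024 * 1024)"]
    )
    print(f"peak child RSS: {mb:.0f} MB")
```

Running that against your largest subject (or a full group average) gives a concrete high-water mark to multiply out per core, with headroom on top.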
The Robert and Beverly Lewis Center for Neuroimaging
University of Oregon