There is a general equation for calculating how much system RAM is needed to pre-load all particles into memory.
Is there a similar equation for calculating how much VRAM a GPU needs when running 2D or 3D classification? We have seen RELION crash when too many threads or MPI ranks are run on a GPU simultaneously, and it would be nice to know the maximum number of threads and/or MPI ranks that can be run on a single GPU device, given the box size and the number of particles (if relevant).
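In case it helps frame the question, here is a back-of-envelope sketch of how such an estimate might look. This is not RELION's actual allocator; the scaling factors (number of working copies per class, per-thread buffer count, complex single-precision voxels) are all assumptions chosen only to illustrate that the dominant terms scale as box^3 per 3D class and box^2 per thread.

```python
# Back-of-envelope VRAM estimate per MPI rank on a GPU.
# NOTE: this is NOT RELION's internal memory model -- every scaling
# factor below is an assumption for illustration only.

def estimate_vram_bytes(box, n_classes, dim=3, n_threads=1,
                        volumes_per_class=3, bytes_per_voxel=8):
    """Rough per-rank VRAM estimate.

    box               -- particle box size in pixels (assumed square/cubic)
    n_classes         -- number of 2D or 3D classes
    dim               -- 2 for 2D classification, 3 for 3D
    n_threads         -- threads per rank (each assumed to need buffers)
    volumes_per_class -- reference + weight/gradient copies (assumed)
    bytes_per_voxel   -- complex single precision = 8 bytes (assumed)
    """
    # Class references dominate: one box**dim array per working copy.
    refs = n_classes * volumes_per_class * box**dim * bytes_per_voxel
    # Per-thread working buffers for shifted/CTF-corrected particle
    # images; the count of 64 buffers is a placeholder assumption.
    per_thread = 64 * box**2 * bytes_per_voxel
    return refs + n_threads * per_thread

# Example: box=256, 4 classes, 3D refinement, 4 threads per rank.
gib = estimate_vram_bytes(256, 4, dim=3, n_threads=4) / 2**30
print(f"~{gib:.2f} GiB per rank (rough, see assumptions above)")
```

Dividing the card's free VRAM by such a per-rank estimate would then give an upper bound on how many ranks fit on one device, assuming the real per-rank footprint were known.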
David
--
David Hoover, Ph.D.
Computational Biologist
High Performance Computing Services,
Center for Information Technology,
National Institutes of Health
12 South Dr., Rm 2N207
Bethesda, MD 20892, USA
TEL: (+1) 301-435-2986
Email: [log in to unmask]
########################################################################
To unsubscribe from the CCPEM list, click the following link:
https://www.jiscmail.ac.uk/cgi-bin/WA-JISC.exe?SUBED1=CCPEM&A=1
This message was issued to members of www.jiscmail.ac.uk/CCPEM, a mailing list hosted by www.jiscmail.ac.uk, terms & conditions are available at https://www.jiscmail.ac.uk/policyandsecurity/