Hello,
A 3D refinement crashes with the following error:
++++ Executing new job on Fri Feb 1 10:53:31 2019
++++ with the following command(s):
`which relion_refine_mpi` --o Refine3D/job085/run --auto_refine --split_random_halves --i Select/job071/particles.star --ref InitialModel/job062/run_it300_class001.mrc --ini_high 30 --dont_combine_weights_via_disc --pool 20 --pad 2 --ctf --ctf_corrected_ref --particle_diameter 190 --flatten_solvent --zero_mask --oversampling 1 --healpix_order 2 --auto_local_healpix_order 4 --offset_range 5 --offset_step 2 --sym C1 --low_resol_join_halves 40 --norm --scale --j 4 --gpu "0:1:2:3"
++++
...
Expectation iteration 1
000/??? sec ~~(,_,"> [oo]
0.22/11.02 min .~~(,_,">
0.52/14.15 min ..~~(,_,">
WARNING: norm_correction= 92.7148 for particle 15824 in group 2153; Are your groups large enough? Or is the reference on the correct greyscale?
0.85/15.93 min ...~~(,_,">
KERNEL_ERROR: an illegal memory access was encountered in /programs/x86_64-linux/relion/3.0_beta_cu8.0/src/acc/acc_ml_optimiser_impl.h at line 415 (error-code 77)
KERNEL_ERROR: an illegal memory access was encountered in /programs/x86_64-linux/relion/3.0_beta_cu8.0/src/acc/acc_ml_optimiser_impl.h at line 415 (error-code 77)
KERNEL_ERROR: an illegal memory access was encountered in /programs/x86_64-linux/relion/3.0_beta_cu8.0/src/acc/acc_ml_optimiser_impl.h at line 2290 (error-code 77)
--------------------------------------------------------------------------
MPI_ABORT was invoked on rank 4 in communicator MPI_COMM_WORLD
with errorcode 1.
NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
You may or may not see output from other processes, depending on
exactly when Open MPI kills them.
Please advise.
Thank you,
Yehuda