Dear Prem,
You probably made an error when combining the STAR files from the different
classes, as rlnRandomSubset is not always 1 or 2. Be aware that the column
numbering may differ between STAR files: the header defines the order of the
columns!
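
For example, a minimal Python sketch along the following lines could reset
the subsets before re-running (this assumes a plain single-loop STAR file
such as Class-1-2-3-merg.star; the column index is read from the header,
since it may differ between files):

#!/usr/bin/env python
# Minimal sketch: reassign _rlnRandomSubset to 1/2 in a merged particle STAR file.
# Assumes a plain single-loop STAR file (e.g. Class-1-2-3-merg.star); the column
# index is taken from the header, as it may differ between STAR files.
import sys

def reset_random_subset(in_star, out_star):
    col = None      # 0-based index of _rlnRandomSubset, found in the header
    n = 0           # particle counter, used to alternate between the two halves
    with open(in_star) as fin, open(out_star, 'w') as fout:
        for line in fin:
            s = line.strip()
            if s.startswith('_rlnRandomSubset'):
                # header line looks like: _rlnRandomSubset #13  -> 0-based index 12
                col = int(s.split('#')[1]) - 1
                fout.write(line)
            elif col is not None and s and not s.startswith(('_', 'data_', 'loop_', '#')):
                # data row: overwrite the random-subset field, alternating 1 and 2
                fields = line.split()
                fields[col] = '1' if n % 2 == 0 else '2'
                n += 1
                fout.write('  '.join(fields) + '\n')
            else:
                fout.write(line)

if __name__ == '__main__':
    # e.g.: python reset_subset.py Class-1-2-3-merg.star Class-1-2-3-fixed.star
    reset_random_subset(sys.argv[1], sys.argv[2])

Alternatively, dropping the rlnRandomSubset column altogether and letting
relion_refine assign the halves itself may be simpler, if your version
supports that.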
HTH,
Sjors

> Dear all,
>
> I am refining a data set after successful 2D and 3D classification in
> RELION 1.4. My input command is:
>
> mpirun -pernode -n 30   `which relion_refine_mpi` --o Ref-class-1-2-3/run1
> --auto_refine --split_random_halves --i Class-1-2-3-merg.star
> --particle_diameter 430 --angpix 1.25 --ref run3_6.0A_class001.mrc
> --ini_high 50 --ctf --ctf_corrected_ref --flatten_solvent --zero_mask
> --oversampling 1 --healpix_order 2 --auto_local_healpix_order 4
> --offset_range 5 --offset_step 2 --sym C1 --low_resol_join_halves 40
> --norm --scale  --j 8 --memory_per_thread 4
> --------------
> It gives the following error:
> -------------------
> Warning: no access to tty (Bad file descriptor).
> Thus no job control in this shell.
> /usr15/data/psk01/DATA1
>  === RELION MPI setup ===
>  + Number of MPI processes             = 30
>  + Number of threads per MPI process  = 8
>  + Total number of threads therefore  = 240
>  + Master  (0) runs on host            = node54
>  + Slave     1 runs on host            = node53
>  + Slave     2 runs on host            = node52
>  + Slave     3 runs on host            = node51
>  + Slave     4 runs on host            = node50
>  + Slave     5 runs on host            = node49
>  + Slave     6 runs on host            = node48
>  + Slave     7 runs on host            = node47
>  + Slave     8 runs on host            = node46
>  + Slave     9 runs on host            = node45
>  + Slave    10 runs on host            = node44
>  + Slave    11 runs on host            = node43
>  + Slave    12 runs on host            = node42
>  + Slave    13 runs on host            = node41
>  + Slave    14 runs on host            = node40
>  + Slave    15 runs on host            = node39
>  + Slave    16 runs on host            = node38
>  + Slave    17 runs on host            = node37
>  + Slave    18 runs on host            = node36
>  + Slave    19 runs on host            = node35
>  + Slave    20 runs on host            = node34
>  + Slave    21 runs on host            = node33
>  + Slave    22 runs on host            = node32
>  + Slave    23 runs on host            = node31
>  + Slave    24 runs on host            = node30
>  + Slave    25 runs on host            = node29
>  + Slave    26 runs on host            = node28
>  + Slave    27 runs on host            = node27
>  + Slave    28 runs on host            = node26
>  + Slave    29 runs on host            = node25
>  =================
>  Running in double precision.
> ERROR Experiment::divideParticlesInRandomHalves: invalid number for random
> subset (i.e. not 1 or 2): 7
> File: src/exp_model.cpp line: 211
> --------------------------------------------------------------------------
> MPI_ABORT was invoked on rank 5 in communicator MPI_COMM_WORLD
> with errorcode 1.
>
> NOTE: invoking MPI_ABORT causes Open MPI to kill all MPI processes.
> You may or may not see output from other processes, depending on
> exactly when Open MPI kills them.
> --------------------------------------------------------------------------
> ERROR Experiment::divideParticlesInRandomHalves: invalid number for random
> subset (i.e. not 1 or 2): 7
> File: src/exp_model.cpp line: 211
> ERROR Experiment::divideParticlesInRandomHalves: invalid number for random
> subset (i.e. not 1 or 2): 7
> File: src/exp_model.cpp line: 211
> [node54:19017] 1 more process has sent help message help-mpi-api.txt /
> mpi-abort
> [node54:19017] Set MCA parameter "orte_base_help_aggregate" to 0 to see
> all help / error messages
> ---------------------
> Your advice and suggestions would be greatly appreciated.
>
> with regards
> -Prem


-- 
Sjors Scheres
MRC Laboratory of Molecular Biology
Francis Crick Avenue, Cambridge Biomedical Campus
Cambridge CB2 0QH, U.K.
tel: +44 (0)1223 267061
http://www2.mrc-lmb.cam.ac.uk/groups/scheres