Hi Ilian,

Thanks for the reply, somehow I didn't see it until I searched today. Sorry
for the delay.

I am sending you a link to the input files.

1. I believe this is true (same processor count and CONTROL as the runs under 2.20 and 3.09); I will check and confirm.

2. version:   4.08  /  march 2016
   compiler:  gfortran v5.1.0
   MPI:       v3.0
   MPI libs:  Open MPI v1.8.8, package: Open MPI root@cnode006 Distribution,
              ident: 1.8.8, repo rev: v1.8.7-20-g1d53995, Aug 05, 2015

3. I'm not sure if there's a specific "trace-back" method? The recommended
value (283) and above are the ones where it fails; 282 did proceed further,
giving output before stopping on warnings/errors (see the sketch of the
CONTROL line I am varying, below).
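
For reference, the only directive I am changing in CONTROL between runs is
densvar; everything else should be identical to the inputs that run under
2.20 and 3.09. A minimal sketch (the annotations are just my observations so
far, not output from the code):

    densvar 282    <- proceeds far enough to write OUTPUT, then stops on the warning/error quoted below
    densvar 283    <- value recommended by the code; this and anything higher I have tried (284, 600, ...) dies with an empty OUTPUT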

Much appreciated.
Anton

On Tue, 7 May 2019 at 13:14, Ilian Todorov - UKRI STFC <
[log in to unmask]> wrote:

> Hi Anton,
>
>
>
> 1)  Can you confirm that the jobs submitted to run with versions 4.07 and
> 4.08 use the same input (processor count and CONTROL) as those that run
> successfully with versions 2.20 and 3.09?
>
> 2)  Can you confirm which version of 4.08 you have used for this?
>
> 3)  Have you tried to trace back the problem when going to the
> recommended (still an estimate) densvar?
>
>
>
> Do send me a link with the input files so I could try and investigate.
>
>
>
> Regards,
>
>
>
> Ilian Todorov
>
>
>
> From: DL_POLY Mailing List <[log in to unmask]> On Behalf Of Anton Lopis
> Sent: 07 May 2019 11:16
> To: [log in to unmask]
> Subject: Empty OUTPUT for higher values of denvar
>
>
>
> Hi All,
>
>
>
> I'm trying to assist one of our users with scaling calculations and with
> moving from version 2 to version 4. Her inputs work on versions 2.20 and
> 3.09, but fail on 4.07 and 4.08. She has used densvar=600, but I need to
> drop the value significantly in order to see any output.
>
>
>
> The code (4.08) recommends using 283; however, if I use 282 I get the final
> part of the OUTPUT listed below. For 283 and above (I've tried 284 and
> others), the code starts running and dies without writing anything into
> OUTPUT (stderr and stdout seem unhelpful); in this case it takes a few
> seconds longer before the job stops than when I use 282.
>
>
>
> I can provide more info if needed, including the input files (tarred and
> zipped, via Google Drive perhaps). The system contains 717703 atoms, using
> Buckingham, core and Coulombic interactions.
>
>
>
> Please let me know what you think or need to know from me.
>
> Much appreciated,
>
> Anton
>
>
>
>
> I/O read method: parallel by using MPI-I/O (assumed)
>  I/O readers (assumed)                  15
>  I/O read batch size (assumed)     2000000
>  I/O read buffer size (assumed)      20000
>  I/O parallel read error checking off (assumed)
>
>  I/O write method: parallel by using MPI-I/O (assumed)
>  I/O write type: data sorting on (assumed)
>  I/O writers (assumed)                  60
>  I/O write batch size (assumed)    2000000
>  I/O write buffer size (assumed)     20000
>  I/O parallel write error checking off (assumed)
>
>
>  node/domain decomposition (x,y,z):      4     5     6
>
>  pure cutoff driven limit on largest possible decomposition: 117649 nodes/domains (49,49,49)
>
>  pure cutoff driven limit on largest balanced decomposition:  13824 nodes/domains (24,24,24)
>
>  cutoffs driven limit on largest possible decomposition:     103823 nodes/domains (47,47,47)
>
>  cutoffs driven limit on largest balanced decomposition:      12167 nodes/domains (23,23,23)
>
>  link-cell decomposition 1 (x,y,z):     11     9     7
>
>  *** warning - next error due to maximum number of atoms per domain set to : 90308
>  ***           but maximum & minumum numbers of atoms per domain asked for : 90454 & 0
>  ***           estimated denvar value for passing this stage safely is : 283
>
>  DL_POLY_4 terminated due to error    45
>
>  error - too many atoms in CONFIG file or per domain
>
>
>
>
> --
>
> Anton Lopis
> CHPC
> 021 658 2746 (W)
> 072 461 3794 (Cell)
> 021 658 2746 (Fax)


-- 
Anton Lopis
CHPC
021 658 2746 (W)
072 461 3794 (Cell)
021 658 2746 (Fax)

########################################################################

To unsubscribe from the DLPOLY list, click the following link:
https://www.jiscmail.ac.uk/cgi-bin/webadmin?SUBED1=DLPOLY&A=1