Dear all,
I'm a bit confused by the output of the CORRECT step in XDS. In one of the 
first tables I can read the mean I/sigma for each resolution shell, but these 
values differ considerably from the I/sigma reported in the table at the end 
of the output file, titled "completeness and quality of data set", for the 
full data range with signal/noise > -3.0. For example, the first table gives 
I/sigma = 2 at 3.6 A, while the second gives I/sigma = 2 at 2.8 A!
What exactly is the difference between the two values, and which one should 
be used to decide the resolution cutoff?
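
To illustrate what I mean by the per-shell statistic, here is a toy Python 
sketch with invented numbers (not real XDS output). The merging step at the 
end is only my guess at what might differ between the two tables: averaging 
N symmetry equivalents shrinks sigma by roughly sqrt(N), which would shift 
the I/sigma = 2 point to higher resolution.

import numpy as np

# Invented unmerged observations: each unique reflection is measured
# 'multiplicity' times with Gaussian noise of constant sigma.
rng = np.random.default_rng(0)
n_unique, multiplicity = 2000, 4
d = rng.uniform(2.5, 4.0, n_unique)          # resolution (A) of each unique hkl
true_I = 100.0 * (d / 4.0) ** 4              # signal falling off with resolution
sigma = np.full(n_unique, 10.0)

hkl = np.repeat(np.arange(n_unique), multiplicity)   # unique-reflection index
I_obs = rng.normal(np.repeat(true_I, multiplicity), sigma[hkl])
sig_obs = sigma[hkl]
d_obs = d[hkl]

def mean_i_over_sigma(I, sig, d, d_lo, d_hi):
    """Mean I/sigma(I) over reflections in the shell d_lo..d_hi (A)."""
    sel = (d >= d_hi) & (d < d_lo)
    return float((I[sel] / sig[sel]).mean())

# Per-shell statistic on unmerged observations (first-table style).
print("unmerged:", mean_i_over_sigma(I_obs, sig_obs, d_obs, 3.0, 2.8))

# Same shell after merging equivalents: sigma of the mean ~ sigma/sqrt(N)
# (my assumption for the final-table style).
I_merged = np.bincount(hkl, weights=I_obs) / multiplicity
sig_merged = sigma / np.sqrt(multiplicity)
print("merged:  ", mean_i_over_sigma(I_merged, sig_merged, d, 3.0, 2.8))

In this toy case the merged I/sigma in the same shell comes out about twice 
the unmerged value, which would be enough to move the I/sigma = 2 shell 
outward, much like the 3.6 A vs 2.8 A discrepancy I see.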


Thank you in advance,

Michele Lunelli
MPI for Infection Biology
Berlin - Germany
