Jim,
This is coming from someone who just got enlightened a few weeks ago on resolution cut-offs.
>>I am asked often: What value of CC1/2 should I cut my resolution at? <<
The K&D paper mentions that the CC(1/2) criterion loses statistical significance at around 9 according to a Student's t-test.
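For anyone curious what that significance test looks like in practice, here is a minimal sketch. It uses the standard t-statistic for testing whether a Pearson correlation differs from zero, t = r * sqrt((n - 2) / (1 - r^2)), applied to CC(1/2) quoted as a percentage. The function name, the example pair counts, and the large-sample critical value 1.645 (5%, one-sided) are my own illustrative choices, not from the paper; the point is just that the same CC(1/2) value can be significant or not depending on how many half-set pairs the shell contains.

```python
import math

def cc_half_significant(cc_half_percent, n_pairs, t_crit=1.645):
    """One-sided test: is CC(1/2) in a shell significantly greater than 0?

    Uses the standard t-statistic for a Pearson correlation r over n pairs:
        t = r * sqrt((n - 2) / (1 - r^2))
    t_crit = 1.645 approximates the 5% one-sided critical value for large n.
    cc_half_percent is CC(1/2) on the usual 0-100 scale.
    """
    r = cc_half_percent / 100.0
    t = r * math.sqrt((n_pairs - 2) / (1.0 - r * r))
    return t > t_crit

# Illustrative numbers: CC(1/2) = 9 is borderline, and which side of the
# line it falls on depends on the number of half-set pairs in the shell.
print(cc_half_significant(9, 350))   # just significant
print(cc_half_significant(9, 300))   # not significant
```

So "CC(1/2) ~9" is not a magic number; it is roughly where significance is lost for typical shell sizes, which is exactly why I would not treat it as a universal cutoff.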
I doubt that this can be a generally valid guideline for a resolution cut-off. The structures I am working on right now were cut off at CC(1/2) values between ~20 and ~80.
You probably do not want to repeat the mistake we all made before, cutting resolution based on Rmerge/Rmeas, do you?
>> What should I tell my students? I've got a course coming up and I am sure they will ask me again.<<
This is actually the most valuable insight I got from the K&D paper. You don't use CC(1/2) as an absolute indicator but rather as a suggestion. The resolution limit is determined by the refinement, not by the data processing.
I think I will handle my data in future as follows:
Bins with CC(1/2) less than 9 should be initially excluded.
The structure is then refined against all reflections in the file, and only those bins that add information to the map/structure are kept in the final rounds. In most cases this will probably end up above CC(1/2) 25. If the last shell (CC ~9) still adds information to the model, process the images again, e.g. until CC(1/2) drops to 0, and see if there is more useful information in there. You could also go ahead and use CC(1/2) 0 as the initial cut-off, but I think in most cases that would increase computation time rather than help your structure.
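The shell-by-shell decision above can be sketched as a small helper. This is not a procedure from the paper, just my own illustration: suppose you have run paired refinements and recorded, for each high-resolution shell, how much Rfree changed when the shell was included. All numbers and names here are made up for illustration.

```python
def keep_shells(shells, tolerance=0.0):
    """Decide which high-resolution shells to keep after paired refinement.

    shells: list of (high_res_limit, delta_rfree), ordered from low to high
    resolution, where delta_rfree = Rfree(with shell) - Rfree(without shell).
    A negative delta means the shell improved the model. Stop at the first
    shell that worsens Rfree by more than `tolerance`.
    """
    kept = []
    for res, delta in shells:
        if delta > tolerance:
            break
        kept.append(res)
    return kept

# Illustrative paired-refinement results (resolution in Angstrom, delta Rfree):
results = [(2.0, -0.004), (1.9, -0.002), (1.8, 0.001), (1.7, 0.003)]
print(keep_shells(results))  # the shells that still helped the model
```

The tolerance parameter reflects the judgment call in the text: a shell that leaves Rfree essentially unchanged may still be worth keeping if it improves the maps.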
So yes, I would feel comfortable giving true resolution limits based on the refinement of the model, not on any number derived from data processing. In the end, you can always say "I tried it and this was the highest resolution I could model" rather than "I cut at _numerical value X of this parameter_ because everybody else does so".