Dear Randy,
related to this - as you suggested, the latest version of XDSCONV now has a CCP4_I+F output format (documented at http://homes.mpimf-heidelberg.mpg.de/~kabsch/xds/html_doc/xdsconv_formats.html#CCP4_I+F ) which produces an MTZ file with
IMEAN SIGIMEAN FP SIGFP and (if FRIEDEL'S_LAW=FALSE was specified) also I(+),SIGI(+),I(-),SIGI(-),F(+),SIGF(+),F(-),SIGF(-) (and, if requested, a TEST flag for Rfree reflections). This makes it possible to provide intensities to those programs that already support their use and, at the same time, amplitudes (resulting from the French & Wilson calculation, with its various disadvantages) to those programs that don't.
In addition, it simplifies deposition of intensities at the PDB.
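As a minimal illustration, an XDSCONV.INP along these lines should produce it (file names and test-set fraction are just placeholders; the page above documents the exact keywords):

    INPUT_FILE=XDS_ASCII.HKL
    OUTPUT_FILE=temp.hkl CCP4_I+F
    FRIEDEL'S_LAW=FALSE
    GENERATE_FRACTION_OF_TEST_REFLECTIONS=0.05

XDSCONV then writes the reflection file together with an F2MTZ.INP, and running f2mtz/cad in the usual way yields the final MTZ.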
For phenix.refine, this new format requires the user to choose whether to use IMEAN SIGIMEAN or FP SIGFP for refinement. If intensities are chosen, they are (at the moment) internally converted to amplitudes, but in the future phenix.refine will presumably be able to use intensities directly, following the approach you published.
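If I remember the syntax correctly, the choice can be made explicit on the command line, e.g.

    phenix.refine model.pdb data.mtz refinement.input.xray_data.labels="FP,SIGFP"

(or "IMEAN,SIGIMEAN" to refine against intensities); with ambiguous columns, phenix.refine will otherwise stop and list the possible choices.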
I don't agree that the sigmas of weak high-resolution reflections are poorly estimated by data processing programs. I have tested XDS with simulated data, and the estimates are actually good - probably because the random (Poisson counting) error is easier to account for than the systematic errors, which dominate the error of the strong reflections.
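A toy calculation makes the point (pure counting statistics versus a 3% multiplicative systematic error; all numbers invented, this is not the actual test I ran):

    import numpy as np
    rng = np.random.default_rng(0)

    def rms_z(true_I, bg, sys_frac, n=200000):
        # multiplicative systematic error, e.g. scale fluctuations
        scale = 1.0 + sys_frac * rng.standard_normal(n)
        peak = rng.poisson(scale * true_I + bg, size=n)   # peak counts
        back = rng.poisson(bg, size=n)                    # background counts
        net = peak - back                                 # net intensity
        sigma = np.sqrt(peak + back)                      # counting-statistics sigma
        return ((net - true_I) / sigma).std()             # ~1 if sigma is realistic

    print(rms_z(true_I=5.0, bg=20.0, sys_frac=0.03))      # weak: close to 1
    print(rms_z(true_I=1e5, bg=20.0, sys_frac=0.03))      # strong: much larger than 1

For the weak reflection the counting-statistics sigma is realistic; for the strong one the systematic error dominates and the same sigma underestimates the true error roughly tenfold.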
best,
Kay
On Thu, 2 Jun 2016 19:37:25 +0100, Randy Read <[log in to unmask]> wrote:
>Dear John,
>
>We’ve had good success placing fragments with Phaser to solve Fab structures, along the lines that a number of people have mentioned already.
>
>In terms of your original question about weak data, the version of Phaser in current releases of CCP4 and Phenix now accounts much better for the effect of intensity measurement errors. (But make sure you give it intensities, not French & Wilson amplitudes!) If you have a good partial model, data to the highest possible resolution will help, and, as long as the intensity sigmas are reasonably accurate, pushing the resolution shouldn’t hurt. However, it’s probably fair to say that when the resolution limit is pushed to extremes, the integration and scaling programs do a poorer job of estimating the standard deviations.
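>(In a Phaser keyword script, the essential point is just to supply LABIN I and SIGI rather than F and SIGF; the file and ensemble names in this sketch are only placeholders:)
>
>    phaser << eof
>    MODE MR_AUTO
>    HKLIN data.mtz
>    LABIN I=IMEAN SIGI=SIGIMEAN
>    ENSEMBLE fab PDB fab_fragment.pdb IDENTITY 0.9
>    COMPOSITION PROTEIN SEQUENCE fab.seq NUMBER 1
>    SEARCH ENSEMBLE fab NUMBER 1
>    ROOT fab_mr
>    eof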
>
>Best wishes,
>
>Randy
>
>-----
>Randy J. Read
>Department of Haematology, University of Cambridge
>Cambridge Institute for Medical Research
>Wellcome Trust/MRC Building, Hills Road
>Cambridge CB2 0XY, U.K.
>Tel: +44 1223 336500 / Fax: +44 1223 336827
>E-mail: [log in to unmask]
>www-structmed.cimr.cam.ac.uk
>
>> On 2 Jun 2016, at 10:55, John R Helliwell <[log in to unmask]> wrote:
>>
>> Dear Colleagues,
>>
>> We would be grateful for suggestions on the following case of molecular replacement using antibody H and L chains (i.e. a high percentage of beta-sheet content) and weak X-ray diffraction data: <I/sigI> crosses 2 at 3.2 Angstrom, CC½ crosses 0.25 at 2.8 Angstrom, and the data were finally processed to 2 Angstrom, where the molecular transform (presumably) rises again to give a CC½ of 0.04. We recall that placing beta-sheet protein subunits can prove awkward, presumably because the hydrogen-bonding electron density, which runs perpendicular to the polypeptide chains, can prevent a successful placement.
>>
>> So, are there particularly successful MR programs or specific, non-default, MR program settings for such a case?
>>
>> Thank you,
>>
>> John and Emmanuel
>>
>>
>>
>> John R Helliwell & Emmanuel Saridakis