Gerard Kleywegt will remember that I once (back in the last millennium) wrote a program to quantify the information content of a diffraction data set. The reason only he will remember it is that it didn’t turn out to be brilliantly useful, so there didn’t seem much point in publishing it!
The program was based on the Shannon idea of information: how much the entropy of the probability distribution is reduced by having made the measurement. It compares the probability distribution of an intensity after making a measurement with the Wilson distribution that applies before the measurement. The more precise the measurement, the more bits of information you have gained.
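(To make that concrete, here is a minimal numerical sketch of the idea, not the original program: it assumes an acentric reflection, so the Wilson prior on the true intensity is an exponential with mean Sigma, it models the measurement as a Gaussian likelihood with the reported standard deviation, and all the numbers are made up for illustration.)

  import numpy as np

  def entropy_bits(p, x):
      """Differential entropy (in bits) of a density p sampled on grid x."""
      p = p / np.trapz(p, x)
      nz = p > 0
      return -np.trapz(p[nz] * np.log2(p[nz]), x[nz])

  # Wilson prior for an acentric reflection: exponential with mean Sigma
  Sigma = 100.0
  x = np.linspace(1e-3, 2000.0, 40000)        # grid of true intensities
  prior = np.exp(-x / Sigma) / Sigma

  # measurement I_obs with standard deviation sigma_I (Gaussian likelihood)
  I_obs, sigma_I = 250.0, 20.0
  posterior = prior * np.exp(-0.5 * ((x - I_obs) / sigma_I) ** 2)
  posterior /= np.trapz(posterior, x)

  # bits gained = entropy reduction from the Wilson prior to the posterior
  print(f"{entropy_bits(prior, x) - entropy_bits(posterior, x):.2f} bits gained")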
At the time, Gerard was looking at various quantities to see which is most predictive of the quality of model you will be able to get. I was imagining that the number of bits of information per parameter to be determined would be a really great measure. It was fine, but other things (like, I think, just the nominal resolution of the data: Gerard?) were actually better.
Looking back at it, I think I have a better understanding now than I did back then of how to deal with the statistical effects of measurement errors, and it’s probably fair to say that data processing programs are getting better at estimating the standard deviations of intensity measurements. So it might be worth revisiting this, though I don’t have great hopes for it being revolutionary!
Best wishes,
Randy
-----
Randy J. Read
Department of Haematology, University of Cambridge
Cambridge Institute for Medical Research
Wellcome Trust/MRC Building
Hills Road
Cambridge CB2 0XY, U.K.
Tel: +44 1223 336500
Fax: +44 1223 336827
E-mail: [log in to unmask]
www-structmed.cimr.cam.ac.uk
> On 4 Feb 2016, at 20:09, Keller, Jacob <[log in to unmask]> wrote:
>
> Dear Crystallographers,
>
> I have always wondered whether it would be possible generally and rigorously to quantify the amount of information in a series of measurements (crystallographic or otherwise), either absolutely (in bits?) or at least relatively. This would be especially useful in crystallography. For example, one could determine how much information is present in the dataset when integrated with no resolution limits, then see how the information content diminishes as a function of the cutoff. Also, in comparing two datasets with similar resolution but different B-factors, the information distribution would be different, which might have ramifications.
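>
> (As a rough sketch of what that could look like, assuming one already had a per-reflection information gain in bits, here faked with a toy falloff model and made-up numbers, the content remaining at a given cutoff is just a sorted running sum:)
>
>   import numpy as np
>
>   # toy data: random d-spacings and a made-up bits-per-reflection falloff
>   d = np.random.uniform(1.2, 50.0, size=100000)        # d-spacing, Angstrom
>   bits = np.log2(1.0 + 10.0 * np.exp(-20.0 / (2.0 * d ** 2)))
>
>   order = np.argsort(d)[::-1]                          # low resolution first
>   cum = np.cumsum(bits[order])
>   for cutoff in (3.0, 2.0, 1.7, 1.4, 1.2):
>       kept = d[order] >= cutoff
>       print(f"cutoff {cutoff} A: {cum[kept][-1]:.0f} of {cum[-1]:.0f} bits")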
>
> In trying generally to fit data to various functions, information quantification might be more complicated, since some data points are "worth" more than others, for example near and far from the Kd of a binding curve. In fitting a Fourier series to a 3D electron density function, however, this might be less important, since each of the reflections contributes to the entire 3D density function. I remember seeing a comment from James Holton here relating to this topic, in which he said that with very precise low-resolution data, one can use B-sharpening to produce maps similar to those from higher-resolution data. It seems, then, that at least both precision and resolution are important for determining the goodness of a dataset. But, as far as I know, there is no direct measure of information quantity in crystallography--perhaps there should be?
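>
> (For the binding-curve case, the relative "worth" of each point can be read straight off the sensitivity of the model to the parameter; a toy sketch with made-up concentrations, where the fraction bound is f = c/(c + Kd):)
>
>   import numpy as np
>
>   Kd = 1.0                                  # hypothetical dissociation constant
>   c = np.array([0.01, 0.1, 0.3, 1.0, 3.0, 10.0, 100.0])  # ligand concentrations
>   sens = c / (c + Kd) ** 2                  # |df/dKd| for f = c/(c + Kd)
>   for ci, si in zip(c, sens):
>       print(f"c = {ci:6.2f}  |df/dKd| = {si:.4f}")
>
> (The sensitivity peaks at c = Kd, matching the usual advice to sample near the Kd.)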
>
> The case I have before me right now is how to compare truncated data to fully measured data. Let's say, for the sake of argument, that a given crystal would have diffracted to 1.2 Angstrom, but the data were cut off at 1.7 Angstrom by the detector edge. How would the information content of this dataset compare to that of the fully measured dataset? I guess this would depend on the B-factor: the higher the B-factor, the more of the information would be in the lower-resolution bins. So, if most of the information is present before reaching the cutoff, perhaps the structure should be modelled similarly to a higher-resolution one, and perhaps with anisotropic B-factors?
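>
> (Rough arithmetic on that intuition: under Wilson statistics the mean intensity falls off roughly as exp(-B/(2d^2)), so how much signal survives out to a 1.7 Angstrom edge depends strongly on B; a toy comparison:)
>
>   import numpy as np
>
>   def falloff(B, d):
>       """Relative Wilson mean-intensity falloff, exp(-B / (2 d^2))."""
>       return np.exp(-B / (2.0 * d ** 2))
>
>   for B in (20.0, 60.0):
>       print(f"B = {B:4.0f}:  I(1.7 A) ~ {falloff(B, 1.7):.1e},"
>             f"  I(1.2 A) ~ {falloff(B, 1.2):.1e}  (relative)")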
>
> Another question: if data are measured with high multiplicity but truncated at 1.7 Angstrom resolution, how does this compare, in terms of information content, to a 1.2 Angstrom but less precisely measured dataset?
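>
> (One crude way to put numbers on that: merging N independent observations shrinks sigma by sqrt(N), and for a roughly Gaussian posterior each halving of sigma is worth about one extra bit, so multiplicity buys information only logarithmically:)
>
>   import numpy as np
>
>   sigma_single = 20.0
>   for N in (1, 2, 4, 8, 16):
>       sigma_merged = sigma_single / np.sqrt(N)
>       extra_bits = 0.5 * np.log2(N)         # = log2(sigma_single / sigma_merged)
>       print(f"multiplicity {N:2d}: sigma = {sigma_merged:5.1f}, "
>             f"~{extra_bits:.1f} extra bits per reflection")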
>
> It seems to me that the oft-rehearsed requirement of certain data:parameter ratios depends highly on the precision of the measurements (nothing novel here), so a measure of "information," rather than either a simple ratio or an empirically based rule of thumb, might be the best guide in deciding which parameters to model.
>
> JPK
>
>
> *******************************************
> Jacob Pearson Keller, PhD
> Looger Lab/HHMI Janelia Research Campus
> 19700 Helix Dr, Ashburn, VA 20147
> email: [log in to unmask]
> *******************************************