In this case, I'm more on ZO's side. Let's say that the refinement program
can't get an atom to the right position (for instance, to pick a reasonably
realistic example, because you've put a leucine side chain in backwards).
In that case, the B-factor of the modeled atom nearest to where the true
atom should be will get larger, smearing out its density so that some of
it lands in the right place. To a good approximation, the optimal increase
in the B-factor will be the one you'd expect for a Gaussian probability
distribution, i.e. 8*pi^2/3 times the positional error squared. So a refined
B-factor does include a measure of the uncertainty or error in the atom's
position.
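That rule of thumb is easy to check numerically. A minimal sketch (the
function name and the 1 A example error are mine, for illustration):

```python
import math

def b_factor_increase(sigma):
    """Isotropic B-factor increase (A^2) that optimally smears out a
    Gaussian positional error of standard deviation sigma (A),
    i.e. the 8*pi^2/3 * sigma^2 rule described above."""
    return (8.0 * math.pi ** 2 / 3.0) * sigma ** 2

# a 1 A positional error inflates B by ~26 A^2
print(round(b_factor_increase(1.0), 1))
```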
Best wishes,
Randy Read
On Apr 1 2011, James Holton wrote:
>I'm not sure I entirely agree with ZO's assessment that a B factor is
>a measure of uncertainty. Pedantically, all it really is is an
>instruction to the refinement program to "build" some electron density
>with a certain width and height at a certain location. The result is
>then compared to the data, parameters are adjusted, etc. I don't
>think the B factor is somehow converted into an "error bar" on the
>calculated electron density, is it?
>
>For example, a B-factor of 500 on a carbon atom just means that the
>"peak" to build is ~0.02 electrons/A^3 tall and ~3 A wide (full width
>at half maximum). By comparison, a carbon with B=20 is 1.6
>electrons/A^3 tall and ~0.7 A wide (FWHM). One of the "bugs" that
>Dale referred to is the fact that most refinement programs do not
>"plot" electron density more than 3 A away from each atomic center, so
>a substantial fraction of the 6 electrons represented by a carbon with
>B=500 will be sharply "cut off", and missing from the FC calculation.
>Then again, all 6 electrons will be missing if the atoms are simply
>not modeled, or if the occupancy is zero.
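Those peak heights can be reproduced from the atomic form factor. A
sketch in the Gaussian-atom approximation, assuming the standard
four-Gaussian Cromer-Mann coefficients for carbon (values as tabulated
in International Tables Vol. C; the constant term is treated as a point
contribution smeared only by B):

```python
import math

# Cromer-Mann form-factor coefficients for carbon
# (assumed from International Tables Vol. C)
A_CM = [2.3100, 1.0200, 1.5886, 0.8650]
B_CM = [20.8439, 10.2075, 0.5687, 51.6512]
C_CM = 0.2156

def peak_density(biso):
    """Electron density (electrons/A^3) at the center of an isolated
    carbon atom with isotropic B-factor biso.  Each reciprocal-space
    Gaussian a*exp(-(b+B)*s^2) Fourier-transforms to a real-space
    Gaussian of peak height a*(4*pi/(b+B))**1.5; the constant term
    is broadened by B alone."""
    rho = sum(a * (4 * math.pi / (b + biso)) ** 1.5
              for a, b in zip(A_CM, B_CM))
    rho += C_CM * (4 * math.pi / biso) ** 1.5
    return rho

print(round(peak_density(20), 2))   # ~1.6 electrons/A^3
print(round(peak_density(500), 2))  # ~0.02 electrons/A^3
```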
>
>The point I am trying to make here is that there is no B factor that
>will make an atom "go away", because the way B factors are implemented
>is to always conserve the total number of electrons in the atom, but
>just spread them out over more space.
>
>Now, a peak height of 0.02 electrons/A^3 may sound like it might as
>well be zero, especially when sitting next to a B=20 atom, but what if
>all the atoms have high B factors? For example, if the average
>(Wilson) B factor is 80 (like it typically is for a ~4A structure),
>then the average peak height of a carbon atom is 0.3 electrons/A^3,
>and then 0.02 electrons/A^3 starts to become more significant. If we
>consider an ~11 A structure, then the average atomic B factor will be
>around 500. This "B vs resolution" relationship is something I
>derived empirically from the PDB (Holton JSR 2009). Specifically, the
>average B factor for PDB files at a given resolution "d" is: B =
>4*d^2+12. Admittedly, this is "on average", but the trend does make
>physical sense: atoms with high B factors don't contribute very much
>to high-angle spots.
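The empirical B-vs-resolution fit quoted above is simple to tabulate
(a sketch; resolutions chosen for illustration):

```python
def average_b_at_resolution(d):
    """Average B-factor (A^2) of PDB entries at resolution d (A),
    using the empirical fit B = 4*d^2 + 12 (Holton, JSR 2009)."""
    return 4 * d ** 2 + 12

for d in (1.5, 2.0, 4.0, 11.0):
    print(d, average_b_at_resolution(d))
# d = 4 A gives B = 76, close to the typical Wilson B of ~80;
# d = 11 A gives B = 496, i.e. around 500 as stated above.
```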
>
>More formally, the problem with using a high B-factor as a "flag" is
>that it is not resolution-general. Dale has already pointed this out.
>
>Personally, I prefer to think of B factors as an atom-by-atom
>"resolution" rather than an "error bar", and this is how I tell
>students to interpret them (using the B = 4*d^2+12 formula). The
>problem I have with the "error bar" interpretation is that
>heterogeneity and uncertainty are not the same thing. That is, just
>because the atom is "jumping around" does not mean you don't know
>where the centroid of the distribution is. The "u_x" in
>B=8*pi^2*<u_x^2> does reflect the spread (standard deviation) of atomic
>position in a GIVEN unit cell, but since we are averaging over
>trillions of cells, the "error bar" on the AVERAGE atomic position is
>actually a great deal smaller than "u_x". I think this distinction is
>important because
>what we are building is a model of the AVERAGE electron density, not a
>single molecule.
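The spread-versus-error-of-the-mean distinction can be made concrete
(a sketch; treating the displacements in different cells as independent
is an illustrative assumption):

```python
import math

def u_rms(b):
    """One-dimensional RMS displacement u_x (A) implied by
    B = 8*pi^2*<u_x^2>."""
    return math.sqrt(b / (8 * math.pi ** 2))

def error_of_mean_position(b, n_cells):
    """Standard error of the AVERAGE atomic position over n_cells
    unit cells, assuming independent displacements per cell."""
    return u_rms(b) / math.sqrt(n_cells)

print(round(u_rms(80), 2))                 # ~1.0 A spread in each cell
print(error_of_mean_position(80, 1e12))    # ~1e-6 A on the average
```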
>
>Just my 0.02 electrons
>
>-James Holton
>MAD Scientist
>
>
>
>On Fri, Apr 1, 2011 at 10:57 AM, Zbyszek Otwinowski wrote:
>> The meaning of the B-factor is the (scaled) sum of all positional
>> uncertainties, not just one of its contributors, the Atomic
>> Displacement Parameter that describes the relative displacement of an
>> atom in the crystal lattice by a Gaussian function. That meaning (the
>> sum of all contributions) comes from the procedure that calculates the
>> B-factor in all PDB X-ray deposits, not from an arbitrary decision by
>> a committee. All programs that refine B-factors calculate an estimate of
>> positional uncertainty, where contributors can be both Gaussian and
>> non-Gaussian. For a non-Gaussian contributor, e.g. multiple occupancy,
>> the exact numerical contribution is rather a complex function, but
>> conceptually it is still an uncertainty estimate. Given the resolution
>> of the typical data, we do not have a procedure to decouple Gaussian and
>> non-Gaussian contributors, so we have to live with the B-factor being
>> defined by the refinement procedure. However, we should still improve
>> the estimates of the B-factor, e.g. by changing the restraints. In my
>> experience, Refmac's default restraints on B-factors in side chains
>> are too tight, so I adjust them. Still, my preference would be to have
>> harmonic restraints on u (which scales as the square root of B) rather
>> than on the Bs themselves. It is not we who cram too many meanings onto
>> the B-factor; it is a quite fundamental limitation of crystallographic
>> refinement.
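The restraint preference can be illustrated with a toy comparison (a
sketch only: the quadratic penalties below are illustrative and not
Refmac's actual restraint functional form):

```python
import math

def b_restraint(b1, b2):
    """Harmonic penalty on the B-factor difference between two
    bonded atoms."""
    return (b1 - b2) ** 2

def u_restraint(b1, b2):
    """Harmonic penalty on the difference in u ~ sqrt(B).  For the
    same B gap, the penalty is softer at high B and tighter at low B,
    since u grows only as the square root of B."""
    return (math.sqrt(b1) - math.sqrt(b2)) ** 2

# A 20 A^2 gap costs the same under the B restraint regardless of
# baseline, but under the u restraint the cost falls as B grows:
print(b_restraint(20, 40), b_restraint(180, 200))
print(round(u_restraint(20, 40), 2), round(u_restraint(180, 200), 2))
```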
>>
>> Zbyszek Otwinowski
>>
>>> The fundamental problem remains: we're cramming too many meanings into
>>> one number [B factor]. This the PDB could indeed solve, by giving us
>>> another column. (He said airily, blithely launching a totally new flame
>>> war.)
>>> phx.
>>>
>>
>