CCP4BB Archives

CCP4BB@JISCMAIL.AC.UK



Subject:

Re: CCP4BB Digest - 3 Mar 2013 to 4 Mar 2013 (#2013-65) B factor of the loop

From:

"Feld, Geoffrey Keith" <[log in to unmask]>

Reply-To:

Feld, Geoffrey Keith

Date:

Tue, 5 Mar 2013 19:37:46 +0000

Content-Type:

text/plain

Parts/Attachments:

text/plain (1965 lines)

Sorry to join the conversation so late…

I would like to weigh in and "marry" the recommendations of James and
Jacob/Jürgen:

We did a B-factor comparison of a "high" resolution structure at ~2 A and
a "low" resolution structure at ~3 A to explore the "ordering" of
substructures in a bound vs. unbound state of a protein-protein
interaction:

http://dx.doi.org/10.1038/nsmb.1923

For this comparison, we divided the B-factors for each structure by its
average B-factor and then compared the "normalized" B-factors of the
higher res structure to the "normalized" B-factors of the lower res
structure by subtraction (or addition if you did the reverse). The goal of
this study was to show how binding at the interface is stabilized while
binding of a distant region is destabilized due to protein unfolding, so I
wouldn't consider it a quantitative measurement (the differences were
extreme, so quite apparent), but it satisfied our purposes and correlated
quite nicely with the differences we saw experimentally by fluorescence
anisotropy.
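
A minimal Python sketch of this normalize-then-subtract comparison
(illustrative only; Biopython is assumed and the file names are
hypothetical):

# Illustrative sketch: divide each structure's per-residue (CA) B factors
# by that structure's mean B, then subtract the normalized values.
from Bio.PDB import PDBParser
import numpy as np

def ca_bfactors(path, name):
    """Return {(chain, resseq): CA B factor} for the first model."""
    model = PDBParser(QUIET=True).get_structure(name, path)[0]
    return {(chain.id, res.id[1]): res["CA"].get_bfactor()
            for chain in model for res in chain if "CA" in res}

b_high = ca_bfactors("high_res.pdb", "high")   # hypothetical file names
b_low = ca_bfactors("low_res.pdb", "low")

mean_high = np.mean(list(b_high.values()))
mean_low = np.mean(list(b_low.values()))

# Positive values: residue relatively more mobile in the low-res structure.
for key in sorted(set(b_high) & set(b_low)):
    delta = b_low[key] / mean_low - b_high[key] / mean_high
    print(key, round(delta, 2))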

Cheers,
Geoffrey K. Feld, PhD

Physical & Life Sciences Directorate
Biosciences & Biotechnology Division
Lawrence Livermore National Laboratory




On 3/4/13 4:00 PM, "CCP4BB automatic digest system"
<[log in to unmask]> wrote:

>There are 24 messages totaling 3323 lines in this issue.
>
>Topics of the day:
>
>  1. B factor of the loop
>  2. Qt PISA text copy? (3)
>  3. compiling refmac5 on Ubuntu 12.04 (9)
>  4. CCP4MG Version 2.7.2
>  5. Postdoctoral Position in Mechanism and Regulation of Transposition
>at EMBL
>     Heidelberg
>  6. Off-topic PyMOL Issue
>  7. WORKSHOP Glycoproteins: From structure to disease
>  8. How to compare B-factors between structures? (6)
>  9. Reminder: CCP4 summer school at APS, USA
>
>----------------------------------------------------------------------
>
>Date:    Sun, 3 Mar 2013 17:58:32 -0600
>From:    John Fisher <[log in to unmask]>
>Subject: Re: B factor of the loop
>
>Indeed! If the B factors are rather large compared to the globular protein
>core (assuming there is a globular core, given that the protein
>crystallized), one can make the assumption, especially within a loop
>region, that this is an indirect measurement of flexibility. However,
>as Jürgen pointed out, it IS imperative to take a close look at the
>crystal packing in the unit cell. For instance, if the loop region were to
>make hydrogen bond or electrostatic interactions with a symmetry mate, you
>must be careful in your conclusion.
>Might I recommend a paper that uses B factors as a direct correlation with
>heteronuclear NOEs to compare two almost identical structures (both of
>which contain disordered regions)?
>
>
>Wang, Y., Fisher, J.C., Assem, M., Matthew, R., Sublet, J., Xiao, L.,
>Roussel, M.F., and Kriwacki, R.W. (2011). Structural basis for the diverse
>cell cycle regulatory functions of the intrinsically disordered protein,
>p21
>Cip1. Nature Chemical Biology, 7, 214-221.
>
>You can always confirm disordered or flexible loop segments using limited
>proteolysis.
>Best,
>John
>
>John Fisher, M.D./PhD
>St. Jude Children's Research Hospital
>Department of Structural Biology
>Department of Oncology
>
>On Sun, Mar 3, 2013 at 3:52 PM, Bosch, Juergen <[log in to unmask]> wrote:
>
>> yes, but keep in mind your protein is in the context of the crystal
>> lattice, so flexible regions in solution are likely to be stabilized in
>>the
>> crystal lattice. So if you color by B also look at the symmetry mates.
>> And you should also submit both structures to the TLSMD server and look
>>at
>> those results.
>> http://skuld.bmsc.washington.edu/~tlsmd/
>>
>> Jürgen
>>
>> On Mar 3, 2013, at 4:35 PM, Faisal Tarque wrote:
>>
>> > Dear all
>> >
>> > Can the B factor in the crystal structure be the criterion to look into the
>> flexibility of a region or domain? Also if two structures are at
>> different resolutions.
>> >
>>> Faisal
>> > --
>> >
>>
>> ......................
>> Jürgen Bosch
>> Johns Hopkins University
>> Bloomberg School of Public Health
>> Department of Biochemistry & Molecular Biology
>> Johns Hopkins Malaria Research Institute
>> 615 North Wolfe Street, W808
>> Baltimore, MD 21205
>> Office: +1-410-614-4742
>> Lab:      +1-410-614-4894
>> Fax:      +1-410-955-2926
>> http://lupo.jhsph.edu
>>
>
>-----------------------------
>
>Date:    Sun, 3 Mar 2013 23:13:48 -0800
>From:    Engin Özkan <[log in to unmask]>
>Subject: Qt PISA text copy?
>
>Hi everybody,
>
>I was just trying out the new QT PISA interface in CCP4 6.3.0 (updated
>to -017). I realized none of the beautiful tables produced can be easily
>extracted/copied/captured into a human readable form. There are many
>output options, including PDB files, a binary-formatted .pisa file,
>XMLs, and a very short summary text file. I can copy the tables line by
>line by command-C (on Mac 10.6), but any means of selecting and copying
>all rows have failed. I had to resort to running pisa command-line, and
>thankfully got the tables there.
>
>This can also be accomplished simply by a copy and paste from a browser
>when using the webserver. I must clearly be missing something using the
>new and beautiful Qt interface (otherwise why produce these tables?).
>Could someone please direct me to how to do this?
>
>Cheers,
>Engin
>
>-- 
>Engin Özkan
>Post-doctoral Scholar
>Laboratory of K. Christopher Garcia
>Howard Hughes Medical Institute
>Dept of Molecular and Cellular Physiology
>279 Campus Drive, Beckman Center B173
>Stanford School of Medicine
>Stanford, CA 94305
>w ph: (650)-498-7111
>cell: (650)-862-8563
>
>------------------------------
>
>Date:   Mon, 4 Mar 2013 09:56:35 +0000
>From:    Adam Ralph <[log in to unmask]>
>Subject: Re: compiling refmac5 on Ubuntu 12.04
>
>Dear Ed,
>
>
>   The error does indeed happen in ccp4lib. One of the first routines
>called by CCP4
>progs is "ccp4fyp". This initialises the CCP4 environment. See
>lib/cctbx/cctbx_sources/ccp4io/lib/src/ccp4_general.c.
>
>
>If you look at the code you can see that $CINCL is determined at
>run-time. You are
>right that this environment var is not needed at compile time. Files like
>environ.def and
>default.def are read at this time. Perhaps there has been a corruption of
>one of these
>files or you are pointing to an earlier version of $CINCL. Does the error
>occur with refmac alone or with every CCP4 prog?
>
>
>Adam
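
A minimal Python sketch of that environment check (illustrative only; it
assumes a sourced CCP4 setup in which $CINCL points at the directory
holding environ.def and default.def):

# Illustrative check: is CINCL set, and are the .def files it should
# point to actually present?
import os

cincl = os.environ.get("CINCL")
if cincl is None:
    print("CINCL is not set - source the CCP4 setup script first")
else:
    for fname in ("environ.def", "default.def"):
        path = os.path.join(cincl, fname)
        print(path, "ok" if os.path.isfile(path) else "MISSING")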
>
>
>
>
>-----------------------------
>
>Date:    Mon, 4 Mar 2013 10:04:51 +0000
>From:    Garib  Murshudov <[log in to unmask]>
>Subject: Re: compiling refmac5 on Ubuntu 12.04
>
>Dear all
>
>I think this error has been dealt with (Ed will correct me if I am
>wrong). The problem was -static in compilation. For whatever reason in
>some gcc (gfortran) -static does not work (it compiles but has problems
>in running, what is the reason is not clear to me). Sometimes in later
>gcc -static-libgcc -static-libgfortran works but not always. These flags
>are needed for distribution purposes. If you are compiling and using on
>the same computer then you should not need it.
>regards
>Garib
>
>
>On 4 Mar 2013, at 09:56, Adam Ralph wrote:
>
>> Dear Ed,
>> 
>> 
>>    The error does indeed happen in ccp4lib. One of the first routines
>>called by CCP4
>> progs is "ccp4fyp". This initialises the CCP4 environment. See
>> lib/cctbx/cctbx_sources/ccp4io/lib/src/ccp4_general.c.
>> 
>> 
>> If you look at the code you can see that $CINCL is determined at
>>run-time. You are
>> right that this environment var is not needed at compile time. Files
>>like environ.def and
>> default.def are read at this time. Perhaps there has been a corruption
>>of one of these
>> files or you are pointing to an earlier version of $CINCL. Does the
>>error occur with
>> refmac alone or with every CCP4 prog?
>> 
>> 
>> Adam
>> 
>> 
>
>Dr Garib N Murshudov
>Group Leader, MRC Laboratory of Molecular Biology
>Hills Road 
>Cambridge 
>CB2 0QH UK
>Email: [log in to unmask]
>Web http://www.mrc-lmb.cam.ac.uk
>
>
>
>
>
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 11:13:48 +0000
>From:    [log in to unmask]
>Subject: Re: Qt PISA text copy?
>
>Dear Engin,
>
>QtPISA outputs the same textual information as the previous
>command-prompt routine, check "File/Export plain text". What is probably
>confusing is that it does not export everything in one go, but rather
>precisely the portion that is currently displayed in the right-hand side
>of the window. E.g., if you'd like to export all tables for 1st stable
>assembly, navigate to that assembly in the result tree on the left (that
>means, open Assemblies/Stable and highlight 1st assembly), and then
>choose "File/Export plain text" from windows' menu.
>
>I hope that this helps,
>
>Eugene
>
>
>On 4 Mar 2013, at 07:13, Engin Özkan wrote:
>
>> Hi everybody,
>> 
>> I was just trying out the new QT PISA interface in CCP4 6.3.0 (updated
>>to -017). I realized none of the beautiful tables produced can be easily
>>extracted/copied/captured into a human readable form. There are many
>>output options, including PDB files, a binary-formatted .pisa file,
>>XMLs, and a very short summary text file. I can copy the tables line by
>>line by command-C (on Mac 10.6), but any means of selecting and copying
>>all rows have failed. I had to resort to running pisa command-line, and
>>thankfully got the tables there.
>> 
>> This can also be accomplished simply by a copy and paste from a browser
>>when using the webserver. I must clearly be missing something using the
>>new and beautiful Qt interface (otherwise why produce these tables?).
>>Could someone please direct me to how to do this?
>> 
>> Cheers,
>> Engin
>> 
>> -- 
>> Engin Özkan
>> Post-doctoral Scholar
>> Laboratory of K. Christopher Garcia
>> Howard Hughes Medical Institute
>> Dept of Molecular and Cellular Physiology
>> 279 Campus Drive, Beckman Center B173
>> Stanford School of Medicine
>> Stanford, CA 94305
>> w ph: (650)-498-7111
>> cell: (650)-862-8563
>
>
>
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 11:37:28 +0000
>From:    Marcin Wojdyr <[log in to unmask]>
>Subject: Re: compiling refmac5 on Ubuntu 12.04
>
>One reason to compile refmac on Linux is that it can be faster.
>I've just run $CEXAM/unix/runnable/refmac5-simple.exam example
>with refmac from CCP4 6.3.0, from Garib's website and compiled with
>GCC 4.7.2 only with -O3 option (all are 64-bit versions).
>Running times were, correspondingly, 32.2s, 35.1s and 18.7s.
>I'd speculate that most of the difference was caused by different
>compiler version (GCC 4.7 vs 4.4). Times are avg of two runs, on i7
>(sandy bridge), Fedora 18.
>
>
>It's good for us if some users test new versions before they are in
>ccp4 release, so we try to make compilation of individual programs easy.
>If you like to try what's in bazaar repository, here is instruction
>http://devtools.fg.oisin.rc-harwell.ac.uk/
>I'm also putting repository snapshot here:
>http://devtools.fg.oisin.rc-harwell.ac.uk/nightly/ccp4-latest-source.tar.b
>z2
>with all programs/modules in a separate top-level directory.
>Have a look at the build-all.sh script there to see how it can be built.
>For example, in case of refmac, with $HOME local as installation dir,
>in mmdb and libccp4 subdirectories do:
>./configure --prefix=$HOME/local && make install
>and in lapack and refmac directories:
>cmake -DCMAKE_INSTALL_PREFIX=$HOME/local . && make install
>
>compiler options can be passed through env vars, e.g.
>export CFLAGS="-O2"
>export CXXFLAGS="-O2"
>export FFLAGS="-O2"
>
>Marcin
>
>
>On Fri, Mar 01, 2013 at 10:39:30PM +0100, Tim Gruene wrote:
>> -----BEGIN PGP SIGNED MESSAGE-----
>> Hash: SHA1
>> 
>> Hello Ed,
>> 
>> did you try creating a ccp4 tree from source and replacing the refmac
>> source? Maybe that ccp4 environment will help you compile refmac - at
>> least with refmac I had the least trouble compiling when I got the
>> ccp4 source.
>> 
>> Why are you trying to do this anyhow? Do Garib's binaries not work
>> under this Ubuntu installation?
>> 
>> Best,
>> Tim
>> 
>> On 02/27/2013 06:27 PM, Ed Pozharski wrote:
>> > I am trying to compile refmac from source on a machine running
>> > Ubuntu 12.04.  In a nutshell, after some troubleshooting I end up
>> > with executable that generates a segmentation fault.  Log-file
>> > states that
>> > 
>> >>>>>>> CCP4 library signal ccp4_parser:Failed to open
>> >>>>>>> external command
>> > file (Success) raised in ccp4_parser <<<<<<
>> > 
>> > (hardly a success).  Potentially relevant details are that I had to
>> > compile libccp4 and libmmdb to get to this point.  If I don't
>> > configure the CCP4, I get this when trying to run refmac
>> > 
>> >>>>>>> CCP4 library signal ccp4_general:Cannot open
>> >>>>>>> environ.def (Error)
>> > raised in ccp4fyp <<<<<< refmacgfortran:  Cannot open environ.def
>> > refmacgfortran:  Cannot open environ.def
>> > 
>> > So perhaps it's some incompatibility between libccp4/libmmdb that I
>> > compiled and those that came with CCP4 installation (by the way,
>> > the new update feature rocks indeed).  But I tried lifting these
>> > libraries from CCP4 installation when compiling refmac and I get
>> > the same segmentation fault.
>> > 
>> > Any suggestions for troubleshooting/advice on how to compile
>> > refmac from source are appreciated.
>> > 
>> > Refmac version is 5.7.0032.
>> > 
>> > Cheers,
>> > 
>> > Ed.
>> > 
>> 
>> - -- 
>> Dr Tim Gruene
>> Institut fuer anorganische Chemie
>> Tammannstr. 4
>> D-37077 Goettingen
>> 
>> GPG Key ID = A46BEE1A
>> -----BEGIN PGP SIGNATURE-----
>> Version: GnuPG v1.4.12 (GNU/Linux)
>> Comment: Using GnuPG with Mozilla - http://enigmail.mozdev.org/
>> 
>> iD8DBQFRMSASUxlJ7aRr7hoRAndOAKCSJo2xNOPnFQtXZVwVmtlozDnx2ACgjfoz
>> EcnwFhUyH5ueOoI5LW5IVxg=
>> =Gby6
>> -----END PGP SIGNATURE-----
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 11:36:49 +0000
>From:    [log in to unmask]
>Subject: Re: Qt PISA text copy?
>
>Apologies, I'd like to retract my post, it is indeed that only general
>lists of interfaces and assemblies are exported in plain text.
>We will try to fix this as soon as feasible.
>
>Eugene
>
>
>On 4 Mar 2013, at 11:13, <[log in to unmask]>
> <[log in to unmask]> wrote:
>
>> Dear Engin,
>> 
>> QtPISA outputs the same textual information as the previous
>>command-prompt routine, check "File/Export plain text". What is probably
>>confusing is that it does not export everything in one go, but rather
>>precisely the portion that is currently displayed in the right-hand side
>>of the window. E.g., if you'd like to export all tables for 1st stable
>>assembly, navigate to that assembly in the result tree on the left (that
>>means, open Assemblies/Stable and highlight 1st assembly), and then
>>choose "File/Export plain text" from windows' menu.
>> 
>> I hope that this helps,
>> 
>> Eugene
>> 
>> 
>> On 4 Mar 2013, at 07:13, Engin Özkan wrote:
>> 
>>> Hi everybody,
>>> 
>>> I was just trying out the new QT PISA interface in CCP4 6.3.0 (updated
>>>to -017). I realized none of the beautiful tables produced can be
>>>easily extracted/copied/captured into a human readable form. There are
>>>many output options, including PDB files, a binary-formatted .pisa
>>>file, XMLs, and a very short summary text file. I can copy the tables
>>>line by line by command-C (on Mac 10.6), but any means of selecting and
>>>copying all rows have failed. I had to resort to running pisa
>>>command-line, and thankfully got the tables there.
>>> 
>>> This can also be accomplished simply by a copy and paste from a
>>>browser when using the webserver. I must clearly be missing something
>>>using the new and beautiful Qt interface (otherwise why produce these
>>>tables?). Could someone please direct me to how to do this?
>>> 
>>> Cheers,
>>> Engin
>>> 
>>> -- 
>>> Engin Özkan
>>> Post-doctoral Scholar
>>> Laboratory of K. Christopher Garcia
>>> Howard Hughes Medical Institute
>>> Dept of Molecular and Cellular Physiology
>>> 279 Campus Drive, Beckman Center B173
>>> Stanford School of Medicine
>>> Stanford, CA 94305
>>> w ph: (650)-498-7111
>>> cell: (650)-862-8563
>> 
>> 
>> 
>
>
>
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 12:08:27 +0000
>From:    Adam Ralph <[log in to unmask]>
>Subject: Re: compiling refmac5 on Ubuntu 12.04
>
>
>
>
>Following what Marcin said, if you have a compiler other than gcc/gfortran
>then I would definitely recommend compiling from source. Generally they
>give much better performance and in addition there
>might be optimised maths functions.
>
>
>Adam
>
>
>
>
>
>
>
>> One reason to compile refmac on Linux is that it can be faster.
>> I've just run $CEXAM/unix/runnable/refmac5-simple.exam example
>> with refmac from CCP4 6.3.0, from Garib's website and compiled with
>> GCC 4.7.2 only with -O3 option (all are 64-bit versions).
>> Running times were, correspondingly, 32.2s, 35.1s and 18.7s.
>> I'd speculate that most of the difference was caused by different
>> compiler version (GCC 4.7 vs 4.4). Times are avg of two runs, on i7
>> (sandy bridge), Fedora 18.
>
>
>> It's good for us if some users test new versions before they are in
>> ccp4 release, so we try to make compilation of individual programs easy.
>> If you like to try what's in bazaar repository, here is instruction
>> http://devtools.fg.oisin.rc-harwell.ac.uk/
>> I'm also putting repository snapshot here:
>> 
>>http://devtools.fg.oisin.rc-harwell.ac.uk/nightly/ccp4-latest-source.tar.
>>bz2
>> with all programs/modules in a separate top-level directory.
>> Have a look at the build-all.sh script there to see how it can be built.
>> For example, in case of refmac, with $HOME local as installation dir,
>> in mmdb and libccp4 subdirectories do:
>> ./configure --prefix=$HOME/local && make install
>> and in lapack and refmac directories:
>> cmake -DCMAKE_INSTALL_PREFIX=$HOME/local . && make install
>
>> compiler options can be passed through env vars, e.g.
>> export CFLAGS="-O2"
>> export CXXFLAGS="-O2"
>> export FFLAGS="-O2"
>
>> Marcin
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 08:45:42 -0500
>From:    Ed Pozharski <[log in to unmask]>
>Subject: Re: compiling refmac5 on Ubuntu 12.04
>
>Adam,
>
>On Mon, 2013-03-04 at 09:56 +0000, Adam Ralph wrote:
>> One of the first routines called by CCP4
>> progs is "ccp4fyp". This initialises the CCP4 environment.
>
>I think you might have missed in my original post that I get an error
>when I *do* source ccp4 environment.
>> 
>> Does the error occur with
>> refmac alone or with every CCP4 prog?
>
>I cannot tell because I am compiling only refmac, not all of the ccp4.
>
>-- 
>Oh, suddenly throwing a giraffe into a volcano to make water is crazy?
>                                                Julian, King of Lemurs
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 08:52:11 -0500
>From:    Ed Pozharski <[log in to unmask]>
>Subject: Re: compiling refmac5 on Ubuntu 12.04
>
>Indeed, the problem goes away when -static flag is omitted.
>Interestingly, the resulting binary dependencies do not include any
>ccp4-related libraries.  For those interested, I was able to track the
>segfault down to the close() operator - so basically it fails when
>closing a file opened with ccpdpn routine.  At that point I had a lucky
>guess of removing the flag (somewhat inspired by noticing that refmac5
>binary from ccp4-6.3.0 is dynamic and after trying to compile separately
>a short piece of code that only opened and closed a file), so the issue
>is solved as far as my goals are concerned.
>
>On Mon, 2013-03-04 at 10:04 +0000, Garib N Murshudov wrote:
>> Dear all
>> 
>> 
>> I think this error has been dealt with (Ed will correct me if I am
>> wrong). The problem was -static in compilation. For whatever reason in
>> some gcc (gfortran) -static does not work (it compiles but has
>> problems in running, what is the reason is not clear to me). Sometimes
>> in later gcc -static-libgcc -static-libgfortran works but not always.
>> These flags are needed for distribution purposes. If you are compiling
>> and using on the same computer then you should not need it.
>> 
>> 
>> regards
>> Garib
>> 
>> 
>> 
>> On 4 Mar 2013, at 09:56, Adam Ralph wrote:
>> 
>> > Dear Ed,
>> > 
>> > 
>> >    The error does indeed happen in ccp4lib. One of the first
>> > routines called by CCP4
>> > progs is "ccp4fyp". This initialises the CCP4 environment. See
>> > lib/cctbx/cctbx_sources/ccp4io/lib/src/ccp4_general.c.
>> > 
>> > 
>> > If you look at the code you can see that $CINCL is determined at
>> > run-time. You are
>> > right that this environment var is not needed at compile time. Files
>> > like environ.def and
>> > default.def are read at this time. Perhaps there has been a
>> > corruption of one of these
>> > files or you are pointing to an earlier version of $CINCL. Does the
>> > error occur with
>> > refmac alone or with every CCP4 prog?
>> > 
>> > 
>> > Adam
>> > 
>> > 
>> 
>> Dr Garib N Murshudov
>> Group Leader, MRC Laboratory of Molecular Biology
>> Hills Road 
>> Cambridge 
>> CB2 0QH UK
>> Email: [log in to unmask]
>> Web http://www.mrc-lmb.cam.ac.uk
>> 
>> 
>> 
>> 
>> 
>> 
>> 
>> 
>> 
>> 
>> 
>
>-- 
>Edwin Pozharski, PhD, Assistant Professor
>University of Maryland, Baltimore
>----------------------------------------------
>When the Way is forgotten duty and justice appear;
>Then knowledge and wisdom are born along with hypocrisy.
>When harmonious relationships dissolve then respect and devotion arise;
>When a nation falls to chaos then loyalty and patriotism are born.
>------------------------------   / Lao Tse /
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 08:55:08 -0500
>From:    Ed Pozharski <[log in to unmask]>
>Subject: Re: compiling refmac5 on Ubuntu 12.04
>
>On Mon, 2013-03-04 at 11:37 +0000, Marcin Wojdyr wrote:
>> Running times were, correspondingly, 32.2s, 35.1s and 18.7s.
>> 
>Numbers are almost too impressive to believe :)
>
>How does it compare with ifort (which I thought should be the fastest
>option on intel processors and thus unavailable (not free) for most DIY
>compilation given licensing issues)?
>
>-- 
>Oh, suddenly throwing a giraffe into a volcano to make water is crazy?
>                                                Julian, King of Lemurs
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 13:56:01 +0000
>From:    Stuart McNicholas <[log in to unmask]>
>Subject: CCP4MG Version 2.7.2
>
>Dear all,
>   Version 2.7.2 of CCP4MG is now available for download from:
>
>http://www.ccp4.ac.uk/MG/download/
>
>This version fixes two critical problems on Windows (failure to create
>movies and complete failure to work at all with many Intel graphics
>chips) and one critical problem on Linux (crash/hang at program start
>with new nVidia graphics cards).
>
>The complete list of changes is given below.
>
>Best wishes,
>CCP4MG team.
>
>
>OpenGL/ Graphics Drivers
>  * Disable use of Vertex Arrays on all Intel Graphics chips on Windows
>  for now, but allow user to override this behaviour. Should now work on
>all Windows machines, if not optimally.
>  * Make sure shadows are smooth when doing screenshot.
>  * Fix dark line styles when shadows are on.
>  * Shadows work with software Mesa (gallium/llvmpipe, Mesa 9.0 at least).
>  * Shaders/shadows, etc. work with some Intel graphics chips.
>  * Shaders work with older NVIDIA shader version (1.20).
>  * Fix reversed z-buffer with screenshots on Windows/Intel.
>  * Fix hang/crash at start-up with new nVidia chips on Linux.
>  * Fix dashed lines in non-VBO/VA case (e.g. MS OpenGL 1.1).
>  * Fix line colours in non-VBO/VA case (e.g. MS OpenGL 1.1).
>  * Fix symmetry for lines in non-VBO/VA case (e.g. MS OpenGL 1.1).
>  * EXPERIMENTAL: "Perfect spheres" style. "Ray-traced" spheres in
>graphics window. Not antialiased however, so screenshot with 2X/4X size
>recommended. Currently do nothing in "Render".
>
>User Interface
>  * Fix critical bug creating movies on Windows.
>  * Restore from XML works on Windows.
>  * Fix loading of XML status files with selections with single quotes,
>e.g. C5' .
>  * Fix "Apply all" to Annotation fonts.
>  * Sequence viewer option for "traffic light" or Consurf style colouring.
>  * Two new site+(broken)ribbons wizards which colour by molecule
>instead of chain.
>  * Change back/forward to undo/redo.
>  * Fix undo/redo by always forcing reload of data files.
>  * Add download coordinates and screenshot to default toolbar buttons.
>  * Only draw half a bond if drawing external to another bond scheme
>(i.e. not work/ribbon).
>  * Selection browser: Allow simple exclusion of CA when group==side
>(exclude=alpha).
>
>General
>  * Fix R32 symmetry generation from maps.
>  * Allow shorter CN bond.
>  * Symmetry off by default when an MTZ is loaded.
>  * Fixes for running Coot/refmac from CCP4MG.
>
>Other
>  * Fixes for compatibility with CCP4 build systems (no user impact,
>currently).
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 14:10:18 +0000
>From:    Orsolya Barabas <[log in to unmask]>
>Subject: Postdoctoral Position in Mechanism and Regulation of
>Transposition at EMBL Heidelberg
>
>Dear All,
>
>We are looking for a postdoc candidate to join our lab. The ideal
>candidate should have some experience in crystallography and be
>enthusiastic to discover molecular mechanisms of complex nucleic acid -
>protein machines. Please see the full ad below.
>To apply please go to:  www.embl.org/jobs
>
>Best regards, 
>Orsolya
>
>Postdoctoral Fellowship in Structural Biology - Mechanism and Regulation
>of Transposition
>
>Location: 	Heidelberg, Germany
>Staff Category: 	Postdoctoral Fellow
>Contract Duration: 	2 years
>Grading: 	N/A
>Closing Date: 	20 March 2013
>Reference number: 	HD_00330
>
>Job Description
>The European Molecular Biology Laboratory (EMBL) is one of the highest
>ranked scientific research organisations in the world. The Headquarters
>Laboratory is located in Heidelberg (Germany), with additional sites in
>Grenoble (France), Hamburg (Germany), Hinxton (UK) and Monterotondo
>(Italy).
>
>EMBL pursues interdisciplinary research in a strongly collaborative
>environment (see www.embl.de).
>A postdoctoral position is available in the Barabas Group at the
>Structural and Computational Biology Unit, EMBL, Heidelberg. The Barabas
>Group studies the mechanism of movement and regulation of transposons
>using molecular, structural, and cell biology techniques.
>
>Transposons pose a great threat to all living organisms because they can
>hop from one location to another in genomes and thereby create mutations,
>disrupt regulatory patterns and jeopardize vital functions. On the other
>hand, transposons have shaped evolution for millions of years and
>contributed many novel functions and regulatory patterns to genomes. They
>also provide attractive tools for genetic research (e.g. transgenesis and
>functional genomics), genome engineering (e.g. iPSC generation), and gene
>therapy. To understand the intricate symbiosis between transposons and
>their host cells, we investigate how transposons move and how they are
>controlled by host proteins, small RNA pathways, and DNA methylation. In
>addition, we are exploring and exploiting the potential of
>transposon-based tools for genetic research and synthetic biology.
>
>We seek a skilled and passionate biochemist or structural biologist who
>is enthusiastic to venture into our area of research. He/she will be
>involved in a project that uses a combination of structural biology
>(mainly X-ray crystallography), biochemistry (nucleic acid enzymology),
>biophysics, and possibly cell biology to discover mechanistic principles
>of transposition and/or transposon regulation. The successful candidate
>will join a dynamic young research group in the resourceful environment
>of EMBL. He/she will be encouraged to work independently, but well
>integrated into a highly collaborative team. He/she will be offered
>expert training in missing experimental techniques, as well as practice
>in training students and career mentoring through institutional and
>individual means.
>Qualifications and Experience
>
>Applicants should hold a Ph.D. in structural biology, biochemistry,
>molecular biology, cell biology, chemistry, biophysics, or in a closely
>related area. Research experience in X-ray crystallography, biochemistry,
>and basic molecular biology is desired. Good knowledge of standard
>cloning techniques and protein expression and purification are required.
>Moreover, an ability to work independently, as well as in a team, good
>communication skills, and advanced knowledge of English are also
>essential.
>Application Instructions
>Please apply online through www.embl.org/jobs
>Additional Information
>EMBL is an inclusive, equal opportunity employer offering attractive
>conditions and benefits appropriate to an international research
>organisation.
>
>Please note that appointments on fixed term contracts can be renewed,
>depending on circumstances at the time of the review.
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 09:09:11 -0600
>From:    Jason Vertrees <[log in to unmask]>
>Subject: Re: Off-topic PyMOL Issue
>
>Hi Greg,
>
>Takanori Nakane's suggestion should work for you, which is our workaround
>for non-compliant video cards/drivers. If it doesn't or you see
>performance
>or stability problems, please try these two settings:
>
>unset use_shaders
>unset sphere_mode
>
>These two settings are usually only necessary for Intel-based mobile
>chipsets. Furthermore, PyMOL v1.5.0.5 and later should automatically
>fallback to the most stable settings.
>
>Cheers,
>
>-- Jason
>
>
>On Sat, Mar 2, 2013 at 6:12 PM, Greg Costakes <[log in to unmask]> wrote:
>
>> Hello Everyone,
>>
>> I am having some difficulties with PyMOL on my windows machine and was
>> wondering if anyone has come across this problem.... When I load a pdb,
>> nothing shows up except for the waters. Selecting cartoon representation
>> will show the protein, however if I try to show residues as sticks or
>>lines
>> they appear disconnected. I have attached a figure for reference. When I
>> ray-trace the protein, the residues in stick representation appear
>>normal.
>> I tried re-installing PyMOL, but that did not fix the problem. My
>>graphics
>> card drivers are up to date. Any suggestions would be greatly
>>appreciated.
>> Thank you!
>>
>>
>> 
>>-------------------------------------------------------------------------
>>------
>> Greg Costakes
>> PhD Candidate
>> Department of Structural Biology
>> Purdue University
>> Hockmeyer Hall, Room 320
>> 240 S. Martin Jischke Drive, West Lafayette, IN 47907
>>
>>
>> 
>>-------------------------------------------------------------------------
>>-------
>>
>>
>>
>
>
>-- 
>Jason Vertrees, PhD
>Director of Core Modeling Products
>Schrödinger, Inc.
>
>(e) [log in to unmask]
>(o) +1 (603) 374-7120
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 15:02:38 +0000
>From:    Marcin Wojdyr <[log in to unmask]>
>Subject: Re: compiling refmac5 on Ubuntu 12.04
>
>On 4 March 2013 13:55,  <[log in to unmask]> wrote:
>> On Mon, 2013-03-04 at 11:37 +0000, Marcin Wojdyr wrote:
>>> Running times were, correspondingly, 32.2s, 35.1s and 18.7s.
>>>
>> Numbers are almost too impressive to believe :)
>
>It also puzzled me, but I haven't done more careful benchmarking yet.
>What did you get after compiling refmac?
>
>If anyone wants to try it without compilation, daily built $CCP4/bin
>directory for Linux x64 is at:
>http://devtools.fg.oisin.rc-harwell.ac.uk/nightly/
>file linux-x64-cbin-*.tar.bz2 with the latest date.
>It should work if you temporarily replace $CCP4/bin with the bin
>directory from there.
>But then move the original directory back, this new build is untested,
>some programs may be missing or not working.
>
>Regards
>Marcin
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 10:19:38 -0500
>From:    "Fischmann, Thierry" <[log in to unmask]>
>Subject: Re: compiling refmac5 on Ubuntu 12.04
>
>Ed,
>
>Are the numerical results the same ? Not likely that there is a problem.
>But if you haven't done it already it is worth checking by running the
>tests provided with the suite. Aggressive optimization can be a source of
>bugs.
>
>Best regards
>Thierry
>
>-----Original Message-----
>From: CCP4 bulletin board [mailto:[log in to unmask]] On Behalf Of Ed
>Pozharski
>Sent: Monday, March 04, 2013 8:55 AM
>To: [log in to unmask]
>Subject: Re: [ccp4bb] compiling refmac5 on Ubuntu 12.04
>
>On Mon, 2013-03-04 at 11:37 +0000, Marcin Wojdyr wrote:
>> Running times were, correspondingly, 32.2s, 35.1s and 18.7s.
>> 
>Numbers are almost too impressive to believe :)
>
>How does it compare with ifort (which I thought should be the fastest
>option on intel processors and thus unavailable (not free) for most DIY
>compilation given licensing issues)?
>
>-- 
>Oh, suddenly throwing a giraffe into a volcano to make water is crazy?
>                                                Julian, King of Lemurs
>
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 17:47:40 +0100
>From:    giovanna avella <[log in to unmask]>
>Subject: WORKSHOP Glycoproteins: From structure to disease
>
>We are currently organizing an International workshop in Mallorca on the
>theme of Glycoproteins and disease.
>
>http://events.embo.org/13-glycoproteins/
>
>Please apply now if you would like to participate.
>We still have some slots available for
>contributions of young investigators.
>
>Prof. Annalisa Pastore
>The National Institute for Medical Research
>The Ridgeway
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 11:04:00 -0800
>From:    James Holton <[log in to unmask]>
>Subject: Re: How to compare B-factors between structures?
>
>Formally, the "best" way to compare B factors in two structures with 
>different average B is to add a constant to all the B factors in the 
>low-B structure until the average B factor is the same in both 
>structures.  Then you can compare "apples to apples" as it were.  The 
>"extra B" being added is equivalent to "blurring" the more well-ordered 
>map to make it match the less-ordered one. Subtracting a B factor from 
>the less-ordered structure is "sharpening", and the reason why you 
>shouldn't do that here is because you'd be assuming that a sharpened map 
>has just as much structural information as the better diffracting 
>crystal, and that's obviously not true (not as many spots).   In reality, 
>your comparison will always be limited by the worst-resolution data you 
>have.
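
A minimal numeric sketch of this add-a-constant approach (illustrative
only; the B-factor arrays below are hypothetical):

# Illustrative sketch: "blur" the better-ordered structure by adding a
# constant B until the two mean B factors match, then compare per atom.
import numpy as np

b_well = np.array([10.0, 15.0, 12.0, 11.0])   # structure with low average B
b_poor = np.array([78.0, 83.0, 80.0, 79.0])   # structure with high average B

extra_b = b_poor.mean() - b_well.mean()       # the constant "extra B" to add
b_well_blurred = b_well + extra_b             # mean B now matches b_poor
print(b_well_blurred - b_poor)                # differences on a common scale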
>
>Another reason to add rather than subtract a B factor is because B 
>factors are not really "linear" with anything sensible.  Yes, B=50 is 
>"more disordered" than B=25, but is it "twice as disordered"? That 
>depends on what you mean by "disorder", but no matter how you look at 
>it, the answer is generally "no".
>
>One way to define the "degree of disorder" is the volume swept out by 
>the atom's nucleus as it "vibrates" (or otherwise varies from cell to 
>cell).  This is NOT proportional to the B-factor, but rather the 3/2 
>power of the B factor.   Yes, 3/2 power.  The value of "B", is 
>proportional to the SQUARE of the width of the probability distribution 
>of the nucleus, so to get the volume of space swept out by it you have 
>to take the square root to get something proportional to the width and 
>then you take the 3rd power to get something proportional to the volume.
>
>And then, of course, if you want to talk about the electron cloud (which 
>is what x-rays "see") and not the nuclear position (which you can only 
>see if you are a neutron person), then you have to "add" a B factor of 
>about 8 to every atom to account for the intrinsic width of the electron 
>cloud.  Formally, the B factor is "convoluted" with the intrinsic atomic 
>form factor, but a "native" B factor of 8 is pretty close for most atoms.
>
>For those of you who are interested in something more exact than 
>"proportional" the equation for the nuclear probability distribution 
>generated by a given B factor is:
>kernel_B(r) = (4*pi/B)^1.5*exp(-4*pi^2/B*r^2)
>where "r" is the distance from the "average position" (aka the x-y-z 
>coordinates in the PDB file).  Note that the width of this distribution 
>of atomic positions is not really an "error bar", it is a "range".  
>There's a difference between an atom actually being located in a variety 
>of places vs not knowing the centroid of all these locations.  Remember, 
>you're averaging over trillions of unit cells.  If you collect a 
>different dataset from a similar crystal and re-refine the structure the 
>final x-y-z coordinate assigned to the atom will not change all that much.
>
>   The full-width at half-maximum (FWHM) of this kernel_B distribution is:
>  fwhm = 0.1325*sqrt(B)
>and the probability of finding the nucleus within this radius is 
>actually only about 29%.  The radius that contains the nucleus half the 
>time is about 1.3 times wider, or:
>r_half = 0.1731*sqrt(B)
>
>That is, for B=25, the atomic nucleus is within 0.87 A of its average 
>position 50% of the time (a volume of 2.7 A^3).  Whereas for B=50, it is 
>within 1.22 A 50% of the time (7.7 A^3).  Note that although B=50 is 
>twice as big as B=25, the half-occupancy radius 0.87 A is not half as 
>big as 1.22 A, nor are the volumes 2.7 and 7.7 A^3 related by a factor 
>of two.
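
A minimal Python check of the numbers above (illustrative only):

# fwhm, r_half and the 50%-occupancy volume for the two B values discussed.
from math import pi, sqrt

def fwhm(B):
    return 0.1325 * sqrt(B)   # full width at half maximum of kernel_B

def r_half(B):
    return 0.1731 * sqrt(B)   # radius containing the nucleus 50% of the time

for B in (25.0, 50.0):
    r = r_half(B)
    volume = 4.0 / 3.0 * pi * r ** 3
    print(f"B={B:.0f}: fwhm={fwhm(B):.2f} A, r_half={r:.2f} A, "
          f"volume={volume:.1f} A^3")
# B=25: r_half ~0.87 A, volume ~2.7 A^3; B=50: r_half ~1.22 A, volume ~7.7 A^3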
>
>Why is this important for comparing two structures?   Since the B factor 
>is non-linear with disorder, it is important to have a common reference 
>point when comparing them.  If the low-B structure has two atoms with 
>B=10 and B=15 with average overall B=12, that might seem to be 
>"significant" (almost a factor of two in the half-occupancy volume) but 
>if the other structure has an average B factor of 80, then suddenly 78 
>vs 83 doesn't seem all that different (only a 10% change).  Basically, a 
>difference that would be "significant" in a high-resolution structure is 
>"washed out" by the overall crystallographic B factor of the 
>low-resolution structure in this case.
>
>Whether or not a 10% difference is "significant" depends on how accurate 
>you think your B factors are.  If you "kick" your coordinates (aka using 
>"noise" in PDBSET) and re-refine, how much do the final B factors change?
>
>-James Holton
>MAD Scientist
>
>On 2/25/2013 12:08 PM, Yarrow Madrona wrote:
>> Hello,
>>
>> Does anyone know a good method to compare B-factors between structures? 
>>I
>> would like to compare mutants to a wild-type structure.
>>
>> For example, structure2 has a higher B-factor for residue X but how can 
>>I
>> show that this is significant if the average B-factor is also higher?
>> Thank you for your help.
>>
>>
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 14:16:16 -0500
>From:    Jacob Keller <[log in to unmask]>
>Subject: Re: How to compare B-factors between structures?
>
>You only entertain addition+subtraction--why not use
>multiplication/division to normalize the b-factors?
>
>JPK
>
>On Mon, Mar 4, 2013 at 2:04 PM, James Holton <[log in to unmask]> wrote:
>
>> Formally, the "best" way to compare B factors in two structures with
>> different average B is to add a constant to all the B factors in the 
>>low-B
>> structure until the average B factor is the same in both structures.  
>>Then
>> you can compare "apples to apples" as it were.  The "extra B" being 
>>added
>> is equivalent to "blurring" the more well-ordered map to make it match 
>>the
>> less-ordered one. Subtracting a B factor from the less-ordered 
>>structure is
>> "sharpening", and the reason why you shouldn't do that here is because
>> you'd be assuming that a sharpened map has just as much structural
>> information as the better diffracting crystal, and that's obviously not 
>>true
>> (not as many spots).   In reality, your comparison will always be 
>>limited
>> by the worst-resolution data you have.
>>
>> Another reason to add rather than subtract a B factor is because B 
>>factors
>> are not really "linear" with anything sensible.  Yes, B=50 is "more
>> disordered" than B=25, but is it "twice as disordered"? That depends on
>> what you mean by "disorder", but no matter how you look at it, the 
>>answer
>> is generally "no".
>>
>> One way to define the "degree of disorder" is the volume swept out by 
>>the
>> atom's nucleus as it "vibrates" (or otherwise varies from cell to cell).
>>  This is NOT proportional to the B-factor, but rather the 3/2 power of 
>>the
>> B factor.   Yes, 3/2 power.  The value of "B", is proportional to the
>> SQUARE of the width of the probability distribution of the nucleus, so 
>>to
>> get the volume of space swept out by it you have to take the square 
>>root to
>> get something proportional to the width and then you take the 3rd 
>>power to
>> get something proportional to the volume.
>>
>> And then, of course, if you want to talk about the electron cloud (which 
>>is
>> what x-rays "see") and not the nuclear position (which you can only see 
>>if
>> you are a neutron person), then you have to "add" a B factor of about 8 
>>to
>> every atom to account for the intrinsic width of the electron cloud.
>>  Formally, the B factor is "convoluted" with the intrinsic atomic form
>> factor, but a "native" B factor of 8 is pretty close for most atoms.
>>
>> For those of you who are interested in something more exact than
>> "proportional" the equation for the nuclear probability distribution
>> generated by a given B factor is:
>> kernel_B(r) = (4*pi/B)^1.5*exp(-4*pi^2/B*r^**2)
>> where "r" is the distance from the "average position" (aka the x-y-z
>> coordinates in the PDB file).  Note that the width of this distribution 
>>of
>> atomic positions is not really an "error bar", it is a "range".  
>>There's a
>> difference between an atom actually being located in a variety of 
>>places vs
>> not knowing the centroid of all these locations.  Remember, you're
>> averaging over trillions of unit cells.  If you collect a different 
>>dataset
>> from a similar crystal and re-refine the structure the final x-y-z
>> coordinate assigned to the atom will not change all that much.
>>
>>   The full-width at half-maximum (FWHM) of this kernel_B distribution 
>>is:
>>  fwhm = 0.1325*sqrt(B)
>> and the probability of finding the nucleus within this radius is 
>>actually
>> only about 29%.  The radius that contains the nucleus half the time is
>> about 1.3 times wider, or:
>> r_half = 0.1731*sqrt(B)
>>
>> That is, for B=25, the atomic nucleus is within 0.87 A of its average
>> position 50% of the time (a volume of 2.7 A^3).  Whereas for B=50, it is
>> within 1.22 A 50% of the time (7.7 A^3).  Note that although B=50 is 
>>twice
>> as big as B=25, the half-occupancy radius 0.87 A is not half as big as 
>>1.22
>> A, nor are the volumes 2.7 and 7.7 A^3 related by a factor of two.
>>
>> Why is this important for comparing two structures?   Since the B factor
>> is non-linear with disorder, it is important to have a common reference
>> point when comparing them.  If the low-B structure has two atoms with 
>>B=10
>> and B=15 with average overall B=12, that might seem to be "significant"
>> (almost a factor of two in the half-occupancy volume) but if the other
>> structure has an average B factor of 80, then suddenly 78 vs 83 doesn't
>> seem all that different (only a 10% change).  Basically, a difference 
>>that
>> would be "significant" in a high-resolution structure is "washed out" by
>> the overall crystallographic B factor of the low-resolution structure in
>> this case.
>>
>> Whether or not a 10% difference is "significant" depends on how accurate
>> you think your B factors are.  If you "kick" your coordinates (aka using
>> "noise" in PDBSET) and re-refine, how much do the final B factors 
>>change?
>>
>> -James Holton
>> MAD Scientist
>>
>>
>> On 2/25/2013 12:08 PM, Yarrow Madrona wrote:
>>
>>> Hello,
>>>
>>> Does anyone know a good method to compare B-factors between 
>>>structures? I
>>> would like to compare mutants to a wild-type structure.
>>>
>>> For example, structure2 has a higher B-factor for residue X but how 
>>>can I
>>> show that this is significant if the average B-factor is also higher?
>>> Thank you for your help.
>>>
>>>
>>>
>
>
>-- 
>*******************************************
>Jacob Pearson Keller, PhD
>Postdoctoral Associate
>HHMI Janelia Farms Research Campus
>email: [log in to unmask]
>*******************************************
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 14:19:30 -0500
>From:    "Bosch, Juergen" <[log in to unmask]>
>Subject: Re: How to compare B-factors between structures?
>
>Yep, I agree calculate the average B per structure and divide each B by 
>this value, then multiply it by any value that is reasonable so you can 
>visualize color differences :-)
>Jürgen
>
>On Mar 4, 2013, at 2:16 PM, Jacob Keller wrote:
>
>You only entertain addition+subtraction--why not use 
>multiplication/division to normalize the b-factors?
>
>JPK
>
>On Mon, Mar 4, 2013 at 2:04 PM, James Holton 
><[log in to unmask]<mailto:[log in to unmask]>> wrote:
>Formally, the "best" way to compare B factors in two structures with 
>different average B is to add a constant to all the B factors in the 
>low-B structure until the average B factor is the same in both 
>structures.  Then you can compare "apples to apples" as it were.  The 
>"extra B" being added is equivalent to "blurring" the more well-ordered 
>map to make it match the less-ordered one. Subtracting a B factor from 
>the less-ordered structure is "sharpening", and the reason why you 
>shouldn't do that here is because you'd be assuming that a sharpened map 
>has just as much structural information as the better diffracting 
>crystal, and that's obviously not true (not as many spots).   In reality, 
>your comparison will always be limited by the worst-resolution data you 
>have.
>
>Another reason to add rather than subtract a B factor is because B 
>factors are not really "linear" with anything sensible.  Yes, B=50 is 
>"more disordered" than B=25, but is it "twice as disordered"? That 
>depends on what you mean by "disorder", but no matter how you look at it, 
>the answer is generally "no".
>
>One way to define the "degree of disorder" is the volume swept out by the 
>atom's nucleus as it "vibrates" (or otherwise varies from cell to cell).  
>This is NOT proportional to the B-factor, but rather the 3/2 power of the 
>B factor.   Yes, 3/2 power.  The value of "B", is proportional to the 
>SQUARE of the width of the probability distribution of the nucleus, so to 
>get the volume of space swept out by it you have to take the square root 
>to get something proportional to the width and then you take the 3rd 
>power to get something proportional to the volume.
>
>And then, of course, if you want to talk about the electron cloud (which 
>is what x-rays "see") and not the nuclear position (which you can only 
>see if you are a neutron person), then you have to "add" a B factor of 
>about 8 to every atom to account for the intrinsic width of the electron 
>cloud.  Formally, the B factor is "convoluted" with the intrinsic atomic 
>form factor, but a "native" B factor of 8 is pretty close for most atoms.
>
>For those of you who are interested in something more exact than 
>"proportional" the equation for the nuclear probability distribution 
>generated by a given B factor is:
>kernel_B(r) = (4*pi/B)^1.5*exp(-4*pi^2/B*r^2)
>where "r" is the distance from the "average position" (aka the x-y-z 
>coordinates in the PDB file).  Note that the width of this distribution 
>of atomic positions is not really an "error bar", it is a "range".  
>There's a difference between an atom actually being located in a variety 
>of places vs not knowing the centroid of all these locations.  Remember, 
>you're averaging over trillions of unit cells.  If you collect a 
>different dataset from a similar crystal and re-refine the structure the 
>final x-y-z coordinate assigned to the atom will not change all that much.
>
>  The full-width at half-maximum (FWHM) of this kernel_B distribution is:
> fwhm = 0.1325*sqrt(B)
>and the probability of finding the nucleus within this radius is actually 
>only about 29%.  The radius that contains the nucleus half the time is 
>about 1.3 times wider, or:
>r_half = 0.1731*sqrt(B)
>
>That is, for B=25, the atomic nucleus is within 0.87 A of its average 
>position 50% of the time (a volume of 2.7 A^3).  Whereas for B=50, it is 
>within 1.22 A 50% of the time (7.7 A^3).  Note that although B=50 is 
>twice as big as B=25, the half-occupancy radius 0.87 A is not half as big 
>as 1.22 A, nor are the volumes 2.7 and 7.7 A^3 related by a factor of two.
>
>Why is this important for comparing two structures?   Since the B factor 
>is non-linear with disorder, it is important to have a common reference 
>point when comparing them.  If the low-B structure has two atoms with 
>B=10 and B=15 with average overall B=12, that might seem to be 
>"significant" (almost a factor of two in the half-occupancy volume) but 
>if the other structure has an average B factor of 80, then suddenly 78 vs 
>83 doesn't seem all that different (only a 10% change).  Basically, a 
>difference that would be "significant" in a high-resolution structure is 
>"washed out" by the overall crystallographic B factor of the 
>low-resolution structure in this case.
>
>Whether or not a 10% difference is "significant" depends on how accurate 
>you think your B factors are.  If you "kick" your coordinates (aka using 
>"noise" in PDBSET) and re-refine, how much do the final B factors change?
>
>-James Holton
>MAD Scientist
>
>
>On 2/25/2013 12:08 PM, Yarrow Madrona wrote:
>Hello,
>
>Does anyone know a good method to compare B-factors between structures? I
>would like to compare mutants to a wild-type structure.
>
>For example, structure2 has a higher B-factor for residue X but how can I
>show that this is significant if the average B-factor is also higher?
>Thank you for your help.
>
>
>
>
>
>--
>*******************************************
>Jacob Pearson Keller, PhD
>Postdoctoral Associate
>HHMI Janelia Farms Research Campus
>email: [log in to unmask]<mailto:[log in to unmask]>
>*******************************************
>
>......................
>Jürgen Bosch
>Johns Hopkins University
>Bloomberg School of Public Health
>Department of Biochemistry & Molecular Biology
>Johns Hopkins Malaria Research Institute
>615 North Wolfe Street, W8708
>Baltimore, MD 21205
>Office: +1-410-614-4742
>Lab:      +1-410-614-4894
>Fax:      +1-410-955-2926
>http://lupo.jhsph.edu
>
>
>
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 11:24:14 -0800
>From:    Filip Van Petegem <[log in to unmask]>
>Subject: Re: How to compare B-factors between structures?
>
>Good point;  I've tested this (n=1) in the past with a high-resolution
>dataset (synchrotron data) and low-resolution dataset (in-house) of
>crystals of the same protein grown in the same drop.  Same space group,
>same unit cell. B-factors for the low-resolution dataset were higher.
> After dividing every individual B-factor by the average B-factor of each,
>the normalized-B-factor-versus-residue plot was identical for both
>structures.  Adding or subtracting a constant value didn't do that.
>
>As I pointed out, this is only n=1, but comparing the high-and
>low-resolution structures of the same condition should give the answer as
>to which B-factor normalization is the most appropriate.
>
>Filip Van Petegem
>
>
>
>
>On Mon, Mar 4, 2013 at 11:16 AM, Jacob Keller <
>[log in to unmask]> wrote:
>
>> You only entertain addition+subtraction--why not use
>> multiplication/division to normalize the b-factors?
>>
>> JPK
>>
>>
>> On Mon, Mar 4, 2013 at 2:04 PM, James Holton <[log in to unmask]> wrote:
>>
>>> Formally, the "best" way to compare B factors in two structures with
>>> different average B is to add a constant to all the B factors in the 
>>>low-B
>>> structure until the average B factor is the same in both structures.  
>>>Then
>>> you can compare "apples to apples" as it were.  The "extra B" being 
>>>added
>>> is equivalent to "blurring" the more well-ordered map to make it match 
>>>the
>>> less-ordered one. Subtracting a B factor from the less-ordered 
>>>structure is
>>> "sharpening", and the reason why you shouldn't do that here is because
>>> you'd be assuming that a sharpened map has just as much structural
>>> information as the better diffracting crystal, and that's obviously not 
>>>true
>>> (not as many spots).   In reality, your comparison will always be 
>>>limited
>>> by the worst-resolution data you have.
>>>
>>> Another reason to add rather than subtract a B factor is because B
>>> factors are not really "linear" with anything sensible.  Yes, B=50 is 
>>>"more
>>> disordered" than B=25, but is it "twice as disordered"? That depends on
>>> what you mean by "disorder", but no matter how you look at it, the 
>>>answer
>>> is generally "no".
>>>
>>> One way to define the "degree of disorder" is the volume swept out by 
>>>the
>>> atom's nucleus as it "vibrates" (or otherwise varies from cell to 
>>>cell).
>>>  This is NOT proportional to the B-factor, but rather the 3/2 power of 
>>>the
>>> B factor.   Yes, 3/2 power.  The value of "B", is proportional to the
>>> SQUARE of the width of the probability distribution of the nucleus, so 
>>>to
>>> get the volume of space swept out by it you have to take the square 
>>>root to
>>> get something proportional to the width and then you take the 3rd 
>>>power to
>>> get something proportional to the volume.
>>>
>>> And then, of course, if you want to talk about the electron cloud (which
>>> is what x-rays "see") and not the nuclear position (which you can only 
>>>see
>>> if you are a neutron person), then you have to "add" a B factor of 
>>>about 8
>>> to every atom to account for the intrinsic width of the electron cloud.
>>>  Formally, the B factor is "convoluted" with the intrinsic atomic form
>>> factor, but a "native" B factor of 8 is pretty close for most atoms.
>>>
>>> For those of you who are interested in something more exact than
>>> "proportional" the equation for the nuclear probability distribution
>>> generated by a given B factor is:
>>> kernel_B(r) = (4*pi/B)^1.5*exp(-4*pi^2/B*r^**2)
>>> where "r" is the distance from the "average position" (aka the x-y-z
>>> coordinates in the PDB file).  Note that the width of this 
>>>distribution of
>>> atomic positions is not really an "error bar", it is a "range".  
>>>There's a
>>> difference between an atom actually being located in a variety of 
>>>places vs
>>> not knowing the centroid of all these locations.  Remember, you're
>>> averaging over trillions of unit cells.  If you collect a different 
>>>dataset
>>> from a similar crystal and re-refine the structure the final x-y-z
>>> coordinate assigned to the atom will not change all that much.
>>>
>>>   The full-width at half-maximum (FWHM) of this kernel_B distribution 
>>>is:
>>>  fwhm = 0.1325*sqrt(B)
>>> and the probability of finding the nucleus within this radius is 
>>>actually
>>> only about 29%.  The radius that contains the nucleus half the time is
>>> about 1.3 times wider, or:
>>> r_half = 0.1731*sqrt(B)
>>>
>>> That is, for B=25, the atomic nucleus is within 0.87 A of its average
>>> position 50% of the time (a volume of 2.7 A^3).  Whereas for B=50, it 
>>>is
>>> within 1.22 A 50% of the time (7.7 A^3).  Note that although B=50 is 
>>>twice
>>> as big as B=25, the half-occupancy radius 0.87 A is not half as big as 
>>>1.22
>>> A, nor are the volumes 2.7 and 7.7 A^3 related by a factor of two.
>>>
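The constants 0.1325 and 0.1731 and the worked numbers above can be checked numerically with a short Python/numpy script (just a sanity check, nothing more); prob_within() integrates the kernel_B distribution out to a given radius.

    # Sanity check of the kernel_B numbers quoted above.
    # kernel_B(r) = (4*pi/B)**1.5 * exp(-4*pi**2/B * r**2)
    import numpy as np

    def prob_within(radius, B, n=20000):
        """Probability that the nucleus lies within 'radius' of its mean
        position: numerical integral of 4*pi*r^2 * kernel_B(r)."""
        r = np.linspace(0.0, radius, n)
        kernel = (4*np.pi/B)**1.5 * np.exp(-4*np.pi**2/B * r**2)
        return np.trapz(4*np.pi*r**2 * kernel, r)

    for B in (25.0, 50.0):
        r_hwhm = 0.1325*np.sqrt(B)   # radius at half-maximum of kernel_B
        r_half = 0.1731*np.sqrt(B)   # radius holding the nucleus half the time
        vol = 4.0/3.0*np.pi*r_half**3
        print("B=%g: P(r_hwhm)=%.2f  r_half=%.2f A  P(r_half)=%.2f  vol=%.1f A^3"
              % (B, prob_within(r_hwhm, B), r_half, prob_within(r_half, B), vol))
    # For both B values P(r_hwhm) comes out near 0.29 and P(r_half) near 0.50;
    # B=25 gives r_half near 0.87 A (~2.7 A^3), B=50 near 1.22 A (~7.7 A^3).
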
>>> Why is this important for comparing two structures?   Since the B 
>>>factor
>>> is non-linear with disorder, it is important to have a common reference
>>> point when comparing them.  If the low-B structure has two atoms with 
>>>B=10
>>> and B=15 with average overall B=12, that might seem to be "significant"
>>> (almost a factor of two in the half-occupancy volume) but if the other
>>> structure has an average B factor of 80, then suddenly 78 vs 83 doesn't
>>> seem all that different (only a 10% change).  Basically, a difference 
>>>that
>>> would be "significant" in a high-resolution structure is "washed out" 
>>>by
>>> the overall crystallographic B factor of the low-resolution structure 
>>>in
>>> this case.
>>>
>>> Whether or not a 10% difference is "significant" depends on how 
>>>accurate
>>> you think your B factors are.  If you "kick" your coordinates (aka 
>>>using
>>> "noise" in PDBSET) and re-refine, how much do the final B factors 
>>>change?
>>>
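For the coordinate "kick" itself, PDBSET does the job inside CCP4; the Python sketch below merely illustrates the idea by shifting each atom by a random amount (the file names and the 0.2 A maximum shift are placeholders), after which you would re-refine and see how the B factors respond.

    # Rough stand-in for PDBSET's coordinate "noise": shift every atom by a
    # random amount of up to max_shift Angstrom along each axis.
    import random

    def kick_pdb(pdb_in, pdb_out, max_shift=0.2):
        with open(pdb_in) as fin, open(pdb_out, "w") as fout:
            for line in fin:
                if line.startswith(("ATOM  ", "HETATM")):
                    x, y, z = (float(line[c:c+8]) for c in (30, 38, 46))
                    x += random.uniform(-max_shift, max_shift)
                    y += random.uniform(-max_shift, max_shift)
                    z += random.uniform(-max_shift, max_shift)
                    line = line[:30] + "%8.3f%8.3f%8.3f" % (x, y, z) + line[54:]
                fout.write(line)

    kick_pdb("model_in.pdb", "model_kicked.pdb", max_shift=0.2)  # placeholder names
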
>>> -James Holton
>>> MAD Scientist
>>>
>>>
>>> On 2/25/2013 12:08 PM, Yarrow Madrona wrote:
>>>
>>>> Hello,
>>>>
>>>> Does anyone know a good method to compare B-factors between 
>>>>structures? I
>>>> would like to compare mutants to a wild-type structure.
>>>>
>>>> For example, structure2 has a higher B-factor for residue X but how 
>>>>can I
>>>> show that this is significant if the average B-factor is also higher?
>>>> Thank you for your help.
>>>>
>>>>
>>>>
>>
>>
>> --
>> *******************************************
>> Jacob Pearson Keller, PhD
>> Postdoctoral Associate
>> HHMI Janelia Farms Research Campus
>> email: [log in to unmask]
>> *******************************************
>>
>
>
>
>-- 
>Filip Van Petegem, PhD
>Associate Professor
>The University of British Columbia
>Dept. of Biochemistry and Molecular Biology
>2350 Health Sciences Mall - Rm 2.356
>Vancouver, V6T 1Z3
>
>phone: +1 604 827 4267
>email: [log in to unmask]
>http://crg.ubc.ca/VanPetegem/
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 11:26:13 -0800
>From:    James Holton <[log in to unmask]>
>Subject: Re: How to compare B-factors between structures?
>
>
>No, you can only add and subtract B values because that is 
>mathematically equivalent to multiplication in reciprocal space (which 
>is equivalent to convolution in real space):
>
>exp(-B1*s^2) * exp(-B2*s^2) = exp(-(B1+B2)*s^2)
>
>Multiplying and dividing B values is mathematically equivalent to 
>applying fractional power-law or fractional root functions in reciprocal 
>space (and I don't even want to think about what that does in real space).
>
>exp(-B1*B2*s^2) = ???
>
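The identity, and the contrast with multiplying B values, can be verified numerically in a few lines of Python/numpy (B1, B2, k and the s grid below are arbitrary):

    # Adding B values multiplies the reciprocal-space Gaussians (convolution
    # in real space); scaling a B value by k raises the Gaussian to the k-th
    # power, i.e. a fractional power for k < 1.
    import numpy as np

    s = np.linspace(0.0, 0.5, 6)     # arbitrary reciprocal-space points
    B1, B2, k = 30.0, 20.0, 0.5

    assert np.allclose(np.exp(-B1*s**2) * np.exp(-B2*s**2),
                       np.exp(-(B1 + B2)*s**2))
    assert np.allclose(np.exp(-(k*B1)*s**2),
                       np.exp(-B1*s**2)**k)
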
>-James Holton
>MAD Scientist
>
>
>On 3/4/2013 11:19 AM, Bosch, Juergen wrote:
>> Yep, I agree: calculate the average B per structure and divide each B 
>> by this value, then multiply it by any value that is reasonable so you 
>> can visualize color differences :-)
>> Jürgen
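A sketch of that normalization in Python, assuming the per-atom B values are already in a list; the factor of 50 is an arbitrary rescaling chosen only so the colour ramp in a viewer looks sensible.

    # "Divide by the average B, then rescale" so two structures share a
    # common colour ramp.  The scale of 50 is arbitrary.
    def normalize_b(b_values, scale=50.0):
        avg = sum(b_values) / len(b_values)
        return [scale * b / avg for b in b_values]

    print(normalize_b([10.0, 15.0]))   # -> [40.0, 60.0]
    print(normalize_b([78.0, 83.0]))   # -> roughly [48.4, 51.6]
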
>>
>> On Mar 4, 2013, at 2:16 PM, Jacob Keller wrote:
>>
>>> You only entertain addition+subtraction--why not use 
>>> multiplication/division to normalize the b-factors?
>>>
>>> JPK
>>>
>>> On Mon, Mar 4, 2013 at 2:04 PM, James Holton <[log in to unmask] 
>>> <mailto:[log in to unmask]>> wrote:
>>>
>>>     Formally, the "best" way to compare B factors in two structures
>>>     with different average B is to add a constant to all the B
>>>     factors in the low-B structure until the average B factor is the
>>>     same in both structures.  Then you can compare "apples to apples"
>>>     as it were.  The "extra B" being added is equivalent to
>>>     "blurring" the more well-ordered map to make it match the
>>>     less-ordered one. Subtracting a B factor from the less-ordered
>>>     structure is "sharpening", and the reason why you shouldn't do
>>>     that here is because you'd be assuming that a sharpened map has
>>>     just as much structural information as the better diffracting
>>>     crystal, and that's obviously not true (not as many spots).   In
>>>     reality, your comparison will always be limited by the
>>>     worst-resolution data you have.
>>>
>>>     Another reason to add rather than subtract a B factor is because
>>>     B factors are not really "linear" with anything sensible.  Yes,
>>>     B=50 is "more disordered" than B=25, but is it "twice as
>>>     disordered"? That depends on what you mean by "disorder", but no
>>>     matter how you look at it, the answer is generally "no".
>>>
>>>     One way to define the "degree of disorder" is the volume swept
>>>     out by the atom's nucleus as it "vibrates" (or otherwise varies
>>>     from cell to cell).  This is NOT proportional to the B-factor,
>>>     but rather the 3/2 power of the B factor.   Yes, 3/2 power.  The
>>>     value of "B", is proportional to the SQUARE of the width of the
>>>     probability distribution of the nucleus, so to get the volume of
>>>     space swept out by it you have to take the square root to get
>>>     something proportional to the width and then you take the 3rd
>>>     power to get something proportional to the volume.
>>>
>>>     And then, of course, if you want to talk about the electron cloud
>>>     (which is what x-rays "see") and not the nuclear position (which
>>>     you can only see if you are a neutron person), then you have to
>>>     "add" a B factor of about 8 to every atom to account for the
>>>     intrinsic width of the electron cloud.  Formally, the B factor is
>>>     "convoluted" with the intrinsic atomic form factor, but a
>>>     "native" B factor of 8 is pretty close for most atoms.
>>>
>>>     For those of you who are interested in something more exact than
>>>     "proportional" the equation for the nuclear probability
>>>     distribution generated by a given B factor is:
>>>     kernel_B(r) = (4*pi/B)^1.5*exp(-4*pi^2/B*r^2)
>>>     where "r" is the distance from the "average position" (aka the
>>>     x-y-z coordinates in the PDB file).  Note that the width of this
>>>     distribution of atomic positions is not really an "error bar", it
>>>     is a "range".  There's a difference between an atom actually
>>>     being located in a variety of places vs not knowing the centroid
>>>     of all these locations.  Remember, you're averaging over
>>>     trillions of unit cells.  If you collect a different dataset from
>>>     a similar crystal and re-refine the structure the final x-y-z
>>>     coordinate assigned to the atom will not change all that much.
>>>
>>>       The full-width at half-maximum (FWHM) of this kernel_B
>>>     distribution is:
>>>      fwhm = 0.1325*sqrt(B)
>>>     and the probability of finding the nucleus within this radius is
>>>     actually only about 29%.  The radius that contains the nucleus
>>>     half the time is about 1.3 times wider, or:
>>>     r_half = 0.1731*sqrt(B)
>>>
>>>     That is, for B=25, the atomic nucleus is within 0.87 A of its
>>>     average position 50% of the time (a volume of 2.7 A^3).  Whereas
>>>     for B=50, it is within 1.22 A 50% of the time (7.7 A^3).  Note
>>>     that although B=50 is twice as big as B=25, the half-occupancy
>>>     radius 0.87 A is not half as big as 1.22 A, nor are the volumes
>>>     2.7 and 7.7 A^3 related by a factor of two.
>>>
>>>     Why is this important for comparing two structures? Since the B
>>>     factor is non-linear with disorder, it is important to have a
>>>     common reference point when comparing them.  If the low-B
>>>     structure has two atoms with B=10 and B=15 with average overall
>>>     B=12, that might seem to be "significant" (almost a factor of two
>>>     in the half-occupancy volume) but if the other structure has an
>>>     average B factor of 80, then suddenly 78 vs 83 doesn't seem all
>>>     that different (only a 10% change).  Basically, a difference that
>>>     would be "significant" in a high-resolution structure is "washed
>>>     out" by the overall crystallographic B factor of the
>>>     low-resolution structure in this case.
>>>
>>>     Whether or not a 10% difference is "significant" depends on how
>>>     accurate you think your B factors are.  If you "kick" your
>>>     coordinates (aka using "noise" in PDBSET) and re-refine, how much
>>>     do the final B factors change?
>>>
>>>     -James Holton
>>>     MAD Scientist
>>>
>>>
>>>     On 2/25/2013 12:08 PM, Yarrow Madrona wrote:
>>>
>>>         Hello,
>>>
>>>         Does anyone know a good method to compare B-factors between
>>>         structures? I
>>>         would like to compare mutants to a wild-type structure.
>>>
>>>         For example, structure2 has a higher B-factor for residue X
>>>         but how can I
>>>         show that this is significant if the average B-factor is also
>>>         higher?
>>>         Thank you for your help.
>>>
>>>
>>>
>>>
>>>
>>> -- 
>>> *******************************************
>>> Jacob Pearson Keller, PhD
>>> Postdoctoral Associate
>>> HHMI Janelia Farms Research Campus
>>> email: [log in to unmask] <mailto:[log in to unmask]>
>>> *******************************************
>>
>> ......................
>> Jürgen Bosch
>> Johns Hopkins University
>> Bloomberg School of Public Health
>> Department of Biochemistry & Molecular Biology
>> Johns Hopkins Malaria Research Institute
>> 615 North Wolfe Street, W8708
>> Baltimore, MD 21205
>> Office: +1-410-614-4742
>> Lab:      +1-410-614-4894
>> Fax:      +1-410-955-2926
>> http://lupo.jhsph.edu
>>
>>
>>
>>
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 14:31:24 -0600
>From:    John Fisher <[log in to unmask]>
>Subject: Re: How to compare B-factors between structures?
>
>Seriously? 
>I believe this specific forum has become quicksand rather than a useful 
>tool. Normalization based simply on ratios of one structure to the other 
>should do the job, correct? Or have I simply lost my mind here?
>J
>
>John Fisher, M.D./PhD
>St. Jude Children's Research Hospital
>Department of Oncology
>Department of Structural Biology
>W: 901-595-6193
>C: 901-409-5699
>
>On Mar 4, 2013, at 1:26 PM, James Holton <[log in to unmask]> wrote:
>
>> 
>> No, you can only add and subtract B values because that is 
>>mathematically equivalent to multiplication in reciprocal space (which 
>>is equivalent to convolution in real space):
>> 
>> exp(-B1*s^2) * exp(-B2*s^2) = exp(-(B1+B2)*s^2)
>> 
>> Multiplying and dividing B values is mathematically equivalent to 
>>applying fractional power-law or fractional root functions in reciprocal 
>>space (and I don't even want to think about what that does in real 
>>space).
>> 
>> exp(-B1*B2*s^2) = ???
>> 
>> -James Holton
>> MAD Scientist
>> 
>> 
>> On 3/4/2013 11:19 AM, Bosch, Juergen wrote:
>>> Yep, I agree: calculate the average B per structure and divide each B 
>>>by this value, then multiply it by any value that is reasonable so you 
>>>can visualize color differences :-)
>>> Jürgen
>>> 
>>> On Mar 4, 2013, at 2:16 PM, Jacob Keller wrote:
>>> 
>>>> You only entertain addition+subtraction--why not use 
>>>>multiplication/division to normalize the b-factors?
>>>> 
>>>> JPK
>>>> 
>>>> On Mon, Mar 4, 2013 at 2:04 PM, James Holton <[log in to unmask]> wrote:
>>>>> Formally, the "best" way to compare B factors in two structures with 
>>>>>different average B is to add a constant to all the B factors in the 
>>>>>low-B structure until the average B factor is the same in both 
>>>>>structures.  Then you can compare "apples to 
>>>>>apples" as it were.  The "extra B" being added is equivalent to 
>>>>>"blurring" the more well-ordered map to make it match the 
>>>>>less-ordered one. Subtracting a B factor from the less-ordered 
>>>>>structure is "sharpening", and the reason why you shouldn't do that 
>>>>>here is because you'd be assuming that a sharpened map has just as 
>>>>>much structural information as the better diffracting crystal, and 
>>>>>that's obviously not true (not as many spots).   In reality, your 
>>>>>comparison will always be limited by the worst-resolution data you 
>>>>>have.
>>>>> 
>>>>> Another reason to add rather than subtract a B factor is because B 
>>>>>factors are not really "linear" with anything sensible.  Yes, B=50 is 
>>>>>"more disordered" than B=25, but is it "twice as disordered"? That 
>>>>>depends on what you mean by "disorder", but no matter how you look at 
>>>>>it, the answer is generally "no".
>>>>> 
>>>>> One way to define the "degree of disorder" is the volume swept out 
>>>>>by the atom's nucleus as it "vibrates" (or otherwise varies from cell 
>>>>>to cell).  This is NOT proportional to the B-factor, but rather the 
>>>>>3/2 power of the B factor.   Yes, 3/2 power.  The value of "B" is 
>>>>>proportional to the SQUARE of the width of the probability 
>>>>>distribution of the nucleus, so to get the volume of space swept out 
>>>>>by it you have to take the square root to get something proportional 
>>>>>to the width and then you take the 3rd power to get something 
>>>>>proportional to the volume.
>>>>> 
>>>>> And then, of course, if you want to talk about the electron cloud 
>>>>>(which is what x-rays "see") and not the nuclear position (which you 
>>>>>can only see if you are a neutron person), then you have to "add" a B 
>>>>>factor of about 8 to every atom to account for the intrinsic width of 
>>>>>the electron cloud.  Formally, the B factor is "convoluted" with the 
>>>>>intrinsic atomic form factor, but a "native" B factor of 8 is pretty 
>>>>>close for most atoms.
>>>>> 
>>>>> For those of you who are interested in something more exact than 
>>>>>"proportional" the equation for the nuclear probability distribution 
>>>>>generated by a given B factor is:
>>>>> kernel_B(r) = (4*pi/B)^1.5*exp(-4*pi^2/B*r^2)
>>>>> where "r" is the distance from the "average position" (aka the x-y-z 
>>>>>coordinates in the PDB file).  Note that the width of this 
>>>>>distribution of atomic positions is not really an "error bar", it is 
>>>>>a "range".  There's a difference between an atom actually being 
>>>>>located in a variety of places vs not knowing the centroid of all 
>>>>>these locations.  Remember, you're averaging over trillions of unit 
>>>>>cells.  If you collect a different dataset from a similar crystal and 
>>>>>re-refine the structure the final x-y-z coordinate assigned to the 
>>>>>atom will not change all that much.
>>>>> 
>>>>>   The full-width at half-maximum (FWHM) of this kernel_B 
>>>>>distribution is:
>>>>>  fwhm = 0.1325*sqrt(B)
>>>>> and the probability of finding the nucleus within this radius is 
>>>>>actually only about 29%.  The radius that contains the nucleus half 
>>>>>the time is about 1.3 times wider, or:
>>>>> r_half = 0.1731*sqrt(B)
>>>>> 
>>>>> That is, for B=25, the atomic nucleus is within 0.87 A of its 
>>>>>average position 50% of the time (a volume of 2.7 A^3).  Whereas for 
>>>>>B=50, it is within 1.22 A 50% of the time (7.7 A^3).  Note that 
>>>>>although B=50 is twice as big as B=25, the half-occupancy radius 0.87 
>>>>>A is not half as big as 1.22 A, nor are the volumes 2.7 and 7.7 A^3 
>>>>>related by a factor of two.
>>>>> 
>>>>> Why is this important for comparing two structures?   Since the B 
>>>>>factor is non-linear with disorder, it is important to have a common 
>>>>>reference point when comparing them.  If the low-B structure has two 
>>>>>atoms with B=10 and B=15 with average overall B=12, that might seem 
>>>>>to be "significant" (almost a factor of two in the half-occupancy 
>>>>>volume) but if the other structure has an average B factor of 80, 
>>>>>then suddenly 78 vs 83 doesn't seem all that different (only a 10% 
>>>>>change).  Basically, a difference that would be "significant" in a 
>>>>>high-resolution structure is "washed out" by the overall 
>>>>>crystallographic B factor of the low-resolution structure in this 
>>>>>case.
>>>>> 
>>>>> Whether or not a 10% difference is "significant" depends on how 
>>>>>accurate you think your B factors are.  If you "kick" your 
>>>>>coordinates (aka using "noise" in PDBSET) and 
>>>>>re-refine, how much do the final B factors change?
>>>>> 
>>>>> -James Holton
>>>>> MAD Scientist
>>>>> 
>>>>> 
>>>>> On 2/25/2013 12:08 PM, Yarrow Madrona wrote:
>>>>>> Hello,
>>>>>> 
>>>>>> Does anyone know a good method to compare B-factors between 
>>>>>>structures? I
>>>>>> would like to compare mutants to a wild-type structure.
>>>>>> 
>>>>>> For example, structure2 has a higher B-factor for residue X but how 
>>>>>>can I
>>>>>> show that this is significant if the average B-factor is also 
>>>>>>higher?
>>>>>> Thank you for your help.
>>>> 
>>>> 
>>>> 
>>>> -- 
>>>> *******************************************
>>>> Jacob Pearson Keller, PhD
>>>> Postdoctoral Associate
>>>> HHMI Janelia Farms Research Campus
>>>> email: [log in to unmask]
>>>> *******************************************
>>> 
>>> ......................
>>> Jürgen Bosch
>>> Johns Hopkins University
>>> Bloomberg School of Public Health
>>> Department of Biochemistry & Molecular Biology
>>> Johns Hopkins Malaria Research Institute
>>> 615 North Wolfe Street, W8708
>>> Baltimore, MD 21205
>>> Office: +1-410-614-4742
>>> Lab:      +1-410-614-4894
>>> Fax:      +1-410-955-2926
>>> http://lupo.jhsph.edu
>> 
>
>------------------------------
>
>Date:    Mon, 4 Mar 2013 16:13:03 -0600
>From:    "Ruslan Sanishvili (Nukri)" <[log in to unmask]>
>Subject: Reminder: CCP4 summer school at APS, USA
>
>Dear Colleagues,
>
>This is a reminder that the deadline for applications for the 6th annual 
>CCP4 Summer School "From data collection to structure refinement and 
>beyond" is April 5, 2013. The school will take place from June 18 through 
>June 26, 2013 at the Advanced Photon Source (APS) near Chicago.
>
>There is no registration fee for the school. The students will be 
>responsible for their own travel and lodging expenses. These and other 
>details (the program, the list of speakers, the application process, 
>accommodations, site access, contacts etc) can be found at the workshop 
>website at http://www.ccp4.ac.uk/schools/APS-2013/index.php
>
>The school will include data collection, processing, structure solution, 
>model building, refinement, validation, automation of many steps etc. 
>Participants are encouraged to bring their own crystals, raw data or 
>processed data for hands-on problem solving under the guidance of 
>software developers and other experts.
>We hope to see you in June.
>
>Garib, Ronan and Nukri
>
>Ruslan Sanishvili (Nukri)
>Macromolecular Crystallographer
>GM/CA@APS
>X-ray Science Division, ANL
>9700 S. Cass Ave.
>Argonne, IL 60439
>
>Tel: (630)252-0665
>Fax: (630)252-0667
>[log in to unmask]
>
>------------------------------
>
>End of CCP4BB Digest - 3 Mar 2013 to 4 Mar 2013 (#2013-65)
>**********************************************************
