I would like to raise some other issues now that Ajees et al. have stirred up all sorts of
discussions. I hope I haven't opened Pandora's box.
From what I have seen around here, very often there seems to be little time allowed or allocated
to actually learn, even a bit beyond the surface, some of the crystallography or what the
crystallographic software is doing during the structure solution process.
Many of the postdocs and students here are under incredible pressure to get the structure
DONE asap. For some of them, it is their first time solving a crystal structure. Yes, the same
heapful of reasons: because it's "hot", it's competitive, there's a grant deadline, PI tenure pressure, etc.
Learning takes a backseat, and this is total rubbish and very scary, in my biased personal opinion.
Although I think it is the person's responsibility to take the time and initiative to learn, I also
see that the pressure often is insurmountable. Often, the PI and/or assigned "structure solver" in
the lab pretty much takes charge at some early stage of structure determination and solves the
structure with much less contribution from the scientist in training (student/postdoc). All that
slog to clone, purify, crystallize, and optimize diffraction, only to find that someone else will come
along, process the data and "finish up" the structure for you. Such 'training' (or lack thereof) is
a recipe for generating 'bad' structures in future and part of the reason for this endless thread.
I think it is NOT as common for someone else to, say, run all the Western blots for you, maintain
your tissue cell lines for you, or do your protein preps for you. Is it because it is much easier to
load someone else's crystallographic data onto one's machine and solve the structure (since this
does not demand the same kind of physical labor and effort and is also a lot of fun) that this
happens? I understand when the PI or "structure solver" does the above as part of a team effort and
allows the person in question to learn. But often, I see that person left somewhat overwhelmed
and clueless in the end.
I bring this issue to the forum since I do not know if this phenomenon is ubiquitous. If this
practice is a rampant weed, can we as a crystallographic community put measures in place to stanch
it?
How about ALL journals explicitly listing who did what during the crystallographic analysis? I
suspect that what I describe is not merely anecdotal. Is there a practical solution?
Raji
------
Date: Thu, 23 Aug 2007 16:17:23 -0700
Reply-To: Dale Tronrud <[log in to unmask]>
Sender: CCP4 bulletin board <[log in to unmask]>
From: Dale Tronrud <[log in to unmask]>
Subject: Re: The importance of USING our validation tools
In-Reply-To: <[log in to unmask]>
Content-Type: text/plain; charset=ISO-8859-1; format=flowed
In the cases you list, it is clearly recognized that the fault lies with the investigator and not
with the method. In most of the cases where serious problems have been identified in published models,
the authors have stonewalled by saying that the method failed them.
"The methods of crystallography are so weak that we could not detect (for years) that our program
was swapping F+ and F-."
"The scattering of X-rays by bulk solvent is a contentious topic."
"We should have pointed out that the B factors of the peptide are higher than those of the protein."
It appears that the problems occurred because these authors were not following established
procedures in this field. They are, as near as I can tell, somehow immune from the consequences of
their errors. Usually the paper isn't retracted even when the model is clearly wrong. They can dump
blame on the technique and escape personal responsibility. This is what upsets so many of us.
It would be so refreshing to read in one of these responses "We were under a great deal of pressure
to get our results out before our competitors and cut corners that we shouldn't have, and that
choice resulted in our failure to detect the obvious errors in our model."
If we did see papers retracted, if we did see nonrenewal of grants, if we did see people get fired,
if we did see prison time (when the line between carelessness and fraud is crossed), then we could
be comforted that there is practical incentive to perform quality work.
Dale Tronrud