Congratulations to George!

Citations are an interesting sociopolitical phenomenon, particularly
when it comes to methods. Most people cite computer programs that they
used, but certainly not all of them, and almost never cite "laboratory
tricks". Case in point: one would think that T.-Y. Teng's J. Appl.
Cryst. (1990) paper describing a (then "new") method for mounting a
crystal into a "loop" and plunge-cooling it in liquid nitrogen would be
the most heavily cited work in all of protein crystallography today.
Yet Google Scholar tells me it has only been cited 311 times (in 20
years). This is in stark contrast to "the most heavily-cited work in
all of science": O. H. Lowry et al. JBC (1951), which describes the
"Lowry assay" of measuring protein concentration and has been cited
about 200,000 times. However, I think if you count up all the
citations to misspellings of Laemmli's name, the U. K. Laemmli, Nature
(1970) paper might have more citations than Lowry.

Even more interesting (I think) is that if you read Laemmli's paper you
will find that the description of the method we now refer to as a
"Laemmli gel" is entirely crammed into a single paragraph at the bottom
of one figure caption. The rest of the paper is about phage head
proteins. The funny thing is, a lot of young biochemists seem to think
that Laemmli invented electrophoresis (actually, that was A. Tiselius,
Trans. Faraday Soc. (1937), who was awarded the 1948 Nobel for that, and
a few other things like ion exchange, reverse phase and affinity
chromatography). Tiselius is one of my favorite scientists because he
transformed his field so much that no one can remember his name.
Admittedly, this does not have to be the case: a series of papers in
1905 by a fellow named "A. Einstein" (mostly in Ann. Phys.) transformed
physics and biophysics alike. People remember his name, but ironically
almost never cite his papers. The paper explaining Brownian motion and
diffusion coefficients has only been cited ~1500 times in more than a
century. Still, this is more citations than his paper on something
called "Special Relativity", published the same year.

So, what do you have to do to get people to cite your methods paper?
Near as I can tell, the method you describe must be highly useful (but
not too useful) and also very difficult to comprehend. I don't mean
that the paper should be poorly written, but the sad truth is that
people don't generally cite "methods" that they think are "obvious"
(inasmuch as they understand how it works, and think everyone else does
too). People also don't cite methods that they think they could have
come up with themselves, and especially not those they see as a "common"
commercial product (like mini-prep kits). Generally, something from
"outside the field" must be part of the method for it to be "citation
worthy", so for biologists this can be chemistry (copper binds to ALL
proteins? Really?), physics, or especially mathematics. Computer
programs are particularly well-suited for this, but it can't be a
computer program that does something "transparent" like sequence
alignment or collecting diffraction data (ahem...). In these cases, the
user knows (or thinks they know) exactly what the program is doing, and
assumes that their audience will too, so why cite it?

On the other hand, one must also be very careful not to produce an
algorithm that is "too useful" and rapidly becomes incorporated into
everyday life. An excellent example of this is L. Ten Eyck's work on
something called "FFT" (L. F. Ten Eyck, Acta A 1973). I think he wins
the prize for the largest "unfairness ratio", which I define as: (papers
that used the method)/citations. Perhaps it is important to give your
program a memorable name. However, as long as you have some fancy math
in there (like "direct methods" or "likelihood"), or at least pretty
graphics AND the program can do something that no other program can
(such as solve a structure that was "hard" enough to end up in a
high-impact journal ... like Acta A), then you've definitely got a
"citation classic" in the making.

BTW, I hope everyone understands that in no way do I mean to belittle
the efforts of those who write heavily-cited computer programs. Quite
the contrary. I think they are simply fortunate to have an "unfairness
ratio" close to 1.

-James Holton
MAD Scientist

Paul Emsley wrote:
>> Well, good luck to all the methods-folk who are up for tenure, here
>> is your chance guys and girls ... it will not last long!!!
>
>
> Indeed.
>
> http://community.thomsonreuters.com/t5/Citation-Impact-Center/What-does-it-mean-to-be-2-in-Impact/ba-p/11386
>
>
>
> p.s. "methods-folk who are up for tenure"? - haha...