I hope I do not abuse this list by saying that our journal Research
Evaluation has been publishing peer-reviewed papers for years which bang
on about the dangers of bibliometrics and measuring research quality -
and it is in the ISI system, so it must be good! All papers over 24
months old are open access at
www.ingentaconnect.com/content/beech/rev and it is edited by the
previously mentioned Tony van Raan.
Bill Page
_________________________
William Page
Beech Tree Publishing
- Science and Public Policy
- Research Evaluation
10 Watford Close, Guildford, Surrey GU1 2EP, UK
Email [log in to unmask]
Telephone +44 1483 824871 Fax +44 1483 567497
Website [being redesigned] www.scipol.co.uk [including links to journal
full texts on Ingenta Connect]
-----Original Message-----
From: An informal open list set up by the UK Serials Group
[mailto:[log in to unmask]] On Behalf Of Gerry Mckiernan
Sent: 13 June 2008 04:20
To: [log in to unmask]
Subject: [LIS-E-JOURNALS] _Citation Statistics_: A Report From The
International Mathematical Union
Friends/
Those Damn Statistics: Can't Live Without Them, Can't Live With Them
/Gerry
Citation Statistics
A report from the International Mathematical Union (IMU) in cooperation
with the International Council of Industrial and Applied Mathematics
(ICIAM) and the Institute of Mathematical Statistics (IMS)
Executive Summary
This is a report about the use and misuse of citation data in the
assessment of scientific research. The idea that research assessment
must be done using "simple and objective" methods is increasingly
prevalent today. The "simple and objective" methods are broadly
interpreted as bibliometrics, that is, citation data and the statistics
derived from them. There is a belief that citation statistics are
inherently more accurate because they substitute simple numbers for
complex judgments, and hence overcome the possible subjectivity of peer
review. But this belief is unfounded.
[snip]
Using citation data to assess research ultimately means using
citation-based statistics to rank things: journals, papers, people,
programs, and disciplines.
disciplines. The statistical tools used to rank these things are often
misunderstood and misused.
[snip]
The validity of statistics such as the impact factor and h‐index is
neither well understood nor well studied. The connection of these
statistics with research quality is sometimes established on the basis
of "experience." The justification for relying on them is that they are
"readily available." The few studies of these statistics that were done
focused narrowly on showing a correlation with some other measure of
quality rather than on determining how one can best derive useful
information from citation data.
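For readers less familiar with the h-index criticised above, a minimal sketch of how it is conventionally computed may be useful. This is not taken from the IMU report; the function name and sample data are illustrative only. The h-index is the largest h such that an author has h papers with at least h citations each:

```python
def h_index(citations):
    """Largest h such that there are h papers with >= h citations each."""
    counts = sorted(citations, reverse=True)
    h = 0
    # Walk down the sorted list; position i qualifies while count >= rank.
    for i, c in enumerate(counts, start=1):
        if c >= i:
            h = i
        else:
            break
    return h

print(h_index([10, 8, 5, 4, 3]))  # 4: four papers have at least 4 citations
print(h_index([25, 8, 5, 3, 3]))  # 3: only three papers have at least 3
```

Note how a single highly cited paper (25 citations above) barely moves the index, which is one illustration of the report's point that such summary numbers discard most of the information in the citation record.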
[more]
Links to the full text are available at
[ http://scholarship20.blogspot.com/2008/06/citation-statistics-report-from.html ]
OR
[ http://tinyurl.com/3ldmts ]
Regards,
/Gerry
Gerry McKiernan
Associate Professor
Science and Technology Librarian
Iowa State University Library
Ames IA 50011
[log in to unmask]
"There is Nothing More Powerful Than An Idea Whose Time Has Come" - Victor Hugo
[ http://www.blogger.com/profile/09093368136660604490 ]
Iowa: Where the Tall Corn Flows and the (North)West Wind Blows
[ http://alternativeenergyblogs.blogspot.com/ ]
____________________________________________________________
lis-e-journals is changing ... to lis-e-resources from the 1st July 2008.
Find out more about UKSG at: http://www.uksg.org/ or join the UKSG facebook network at: http://www.facebook.com/group.php?gid=21810140156