JiscMail: Email discussion lists for the UK Education and Research communities

JISC-REPOSITORIES Archives, September 2007
JISC-REPOSITORIES@JISCMAIL.AC.UK

Subject:

British Academy Report on Peer Review and Metrics

From:

Stevan Harnad <[log in to unmask]>

Reply-To:

Stevan Harnad <[log in to unmask]>

Date:

Tue, 4 Sep 2007 13:38:55 +0100

Content-Type:

MULTIPART/MIXED

Parts/Attachments:

TEXT/PLAIN (178 lines)

> A pall of gloom lies over the vital system of peer review. 
> But the British Academy has some bright ideas.
> The Guardian, Jessica Shepherd reports, Tuesday September 4, 2007
> http://education.guardian.co.uk/higher/research/story/0,,2161680,00.html

Jessica Shepherd's report on peer review seems to be a good one. The
only thing it lacks is some conclusions (which journalists are often
reluctant to take the responsibility of drawing):

(1) Yes, peer review, like all human judgment, is fallible, and
susceptible to error and abuse.

(2) But, in point of fact, peer review just means the assessment of
research by qualified experts. (In the case of research proposals, it is
assessment for fundability, and in the case of research reports, it is
assessment for publishability.)

(3) Funding and publishing without any assessment is not a solution:

     (3a) Everything cannot be funded (there aren't enough funds), and even
     funded projects first need some expert advice in their design.

     (3b) And everything *does* get published, eventually, but there is
     a hierarchy of journal peer-review quality standards, serving as an
     essential guide for users as to what they can take the risk of
     trying to read, use and build upon. (There is not enough time to
     read everything, and it's too risky to try to build on anything
     that merely claims to have been found.) And even accepted papers
     first need some expert advice in their revision.

(4) So far, nothing as good as or better than peer review (i.e.,
qualified experts vetting the work of their fellow-experts) has been
found, tested and demonstrated. So peer review remains the only straw
afloat, if the alternative is not to be tossing a coin for funding,
and publishing everything on a par.

(5) Peer review *can* be improved. The weak link is always the
editor (or board of editors), who chooses the reviewers and to whom
the reviewers and authors are answerable; and the funding officer(s)
or committee who choose the reviewers for proposals and decide how
to act on the basis of the reviews. There are many possibilities for
experimenting with ways to make this meta-review component more accurate,
equitable, answerable, and efficient, especially now that we are 
in the online era:
http://users.ecs.soton.ac.uk/harnad/Temp/peerev.pdf

(6) Metrics are not a substitute for peer review, they are a *supplement*
to it.

     In the UK's Dual Support System of (i) prospective funding of
     individual competitive proposals (RCUK) and (ii) retrospective
     top-sliced funding of entire university departments, based on
     their recent past research performance (RAE), metrics can help
     inform and guide funding officers, committees, editors, boards
     and reviewers. And in the case of the RAE in particular, they can
     shoulder a lot of the former peer-review burden: The RAE, being a
     retrospective rather than a prospective exercise, can benefit from
     the prior publication peer review that the journals have already done
     for the submissions, rank the outcomes with metrics, and then only
     add expert judgment afterward, as a way of checking and fine-tuning
     the metric rankings. Funders and universities explicitly recognizing
     peer review performance as a metric would be a very good idea,
     both for the reviewers and the researchers being reviewed.

     Harnad, S. (2007) Open Access Scientometrics and the UK Research
     Assessment Exercise. In Proceedings of 11th Annual Meeting of the
     International Society for Scientometrics and Informetrics 11(1), pp.
     27-33, Madrid, Spain. Torres-Salinas, D. and Moed, H. F., Eds.
     http://eprints.ecs.soton.ac.uk/13804/

     Brody, T., Carr, L., Gingras, Y., Hajjem, C., Harnad, S. and
     Swan, A.  (2007) Incentivizing the Open Access Research Web:
     Publication-Archiving, Data-Archiving and Scientometrics. CTWatch
     Quarterly 3(3). http://eprints.ecs.soton.ac.uk/14418/

     Shadbolt, N., Brody, T., Carr, L. and Harnad, S. (2006) The Open
     Research Web: A Preview of the Optimal and the Inevitable, in Jacobs,
     N., Eds. Open Access: Key Strategic, Technical and Economic Aspects,
     chapter 21. Chandos. http://eprints.ecs.soton.ac.uk/12453/

Some more generic references on peer review follow below.

Stevan Harnad
American Scientist Open Access Forum
http://amsci-forum.amsci.org/archives/American-Scientist-Open-Access-Forum.html

Chaire de recherche du Canada           Professor of Cognitive Science
Institut des sciences cognitives        Electronics & Computer Science
Universite du Quebec a Montreal         University of Southampton
Montreal, Quebec                        Highfield, Southampton
Canada H3C 3P8                          SO17 1BJ United Kingdom
http://www.crsc.uqam.ca/                http://users.ecs.soton.ac.uk/harnad/

Harnad, S. (ed.) (1982) Peer commentary on peer review: A case study in
scientific quality control, New York: Cambridge University Press.

Harnad, Stevan (1985) Rational disagreement in peer review. Science,
Technology and Human Values, 10 p.55-62.
http://cogprints.org/2128/

Harnad, S. (1986) Policing the Paper Chase. (Review of S. Lock, A
difficult balance: Peer review in biomedical publication.) Nature 322:
24 - 5.

Harnad, S. (1996) Implementing Peer Review on the Net: Scientific
Quality Control in Scholarly Electronic Journals. In: Peek, R. & Newby,
G. (Eds.) Scholarly Publishing: The Electronic Frontier. Cambridge MA:
MIT Press. Pp 103-118. http://cogprints.org/1692/

Harnad, S. (1997) Learned Inquiry and the Net: The Role of Peer Review,
Peer Commentary and Copyright. Learned Publishing 11(4) 283-292. 
http://cogprints.org/1694/

Harnad, S. (1998/2000/2004) The invisible hand of peer review. Nature
[online] (5 Nov. 1998), Exploit Interactive 5 (2000): and in Shatz, B.
(2004) (ed.) Peer Review: A Critical Inquiry. Rowman & Littlefield. Pp.
235-242. http://cogprints.org/1646/

Peer Review Reform Hypothesis-Testing (started 1999)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#480

A Note of Caution About "Reforming the System" (2001)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#1170

Self-Selected Vetting vs. Peer Review: Supplement or
Substitute? (2002)
http://www.ecs.soton.ac.uk/~harnad/Hypermail/Amsci/subject.html#2341

----------------------------------------

>
> No fewer than three academic journals dismissed the economist George Akerlof's paper The Market for Lemons as "trivial" and "too generic" when it was submitted in the late 1960s. Almost four decades later it was regarded as a seminal text and its author thought worthy of the Nobel prize for economics.
>
> Peer review, when an academic submits a scholarly work to the scrutiny of other experts in the field for publication in a journal or for a grant, for example, has always been an imperfect science. But lately it has had more, and fiercer, critics.
>
> They say peer review is biased against innovation and originality. They argue that it costs too much - more than £196m a year was the estimate by Research Councils UK last year. And they say it takes up too much time now that more academics than ever are submitting papers and fewer claim they can afford the time to "peer review" them.
>
> Today a report published by the British Academy - an academic club of 800 scholars elected for distinction in the humanities and social sciences - speaks up for peer review. The professors quote Joan Sieber, a psychologist at California State University, who has said: "One suspects that peer review is a bit like democracy - a bad system, but the best one possible."
>
> Albert Weale, professor of government at the University of Essex and chair of the committee responsible for the report, describes peer review as "the essential backbone to knowledge and the crucial mechanism in maintaining its quality".
>
> Robert Bennett, professor of geography at Cambridge University, says it is "an essential, if imperfect, practice for the humanities and social sciences".
>
> The report's writers snap back at those who attack peer review. They back up their ripostes with the comments of journal editors, research councils, charities and funders, academics and postdoctoral students. To those who say peer review is biased against innovation and that journal editors "play safe" and are "friendly to their own work", the academy's response is that universities and research councils are awarding more grants for risky, avant-garde research projects than ever.
>
> The report admits that "there may be scope for the government to consider ways in which it can encourage endowments ... within universities to support small grants for innovative, high-risk research".
>
> But it warns: "It is important not to commit the fallacy of assuming that, because high quality will be innovative, the innovative is necessarily high quality ... other criteria include: accuracy, validity, replicability, reliability, substantively significant, authoritative and so on."
>
> Banality gets acceptance
>
> Marian Hobson, professor of French at Queen Mary, University of London, says: "If a journal editor gets everything right all the time, they are probably aiming for the middle, banally all-right work, which will be out of date in the blink of an eyelid. Really excellent work may sometimes take a while to be accepted."
>
> To those who lambast peer reviewing for being too time consuming and costly, the professors have the following suggestion: give far more recognition to the unpaid, altruistic labour of those who do it and the system will be under less strain.
>
> Hobson says: "If done properly, [peer review] entails bibliographical searches, checking of statements, repeated visits to the university library, not just to Google. Yet this kind of activity counts for nix, nothing, zilch in the research assessment exercise [in which every active researcher in every university in the UK is assessed by panels of other academics to receive grants for their research]."
>
> The academy stops short of demanding peer reviewers be paid. It realises this would be impossible for all but the most wealthy journal publishers. Instead, the report recommends that the importance of peer reviewing should be better reflected in the research assessment exercise. "Those responsible for the management of universities and research institutes need to ensure that they ... encourage and reward peer review activity," it says.
>
> This might stop some high calibre academics, already overburdened with work, from being put off peer reviewing, the professors say. It might also attract junior lecturers and even postdoctoral students. More reviewers would mean the system was under less strain. The strain is partly triggered by an increase of up to 62% in the number of academic papers submitted in some fields in the past five years.
>
> Here lies another problem, says the report. "As we conducted our review, we were struck ... by the extent to which there is little attention to training in peer review," it says. "Training is important, not just in itself, but because of the privileged position that peer reviewers enjoy.
>
> "By virtue of reading a paper, reviewers can acquire access to original data sets, new empirical results or innovative conceptual work. In the business world, these would count as commercial secrets. In the academic world, the ethos is that reviewers are part of the gatekeeping system, the ultimate rationale of which is the fast and efficient dissemination of research findings.
>
> "The integrity of the peer review system is therefore of great importance. One of the ways in which that integrity is maintained is through its dependence upon professional and unselfish motivations, and this in turn suggests the importance of training in the professional and ethical conventions of the practice."
>
> The academy's report ends with a warning to the government: plans to overhaul the way research is assessed after next year will change peer review for the worse, especially in the humanities.
>
> Metrics-based approach
>
> After 2008, the quality of research - and hence the amount of funding that universities receive from the government - will be judged largely on the basis of statistics such as grant income and contracts. It is accepted in the sector that this "metrics-based" approach will work better for science and engineering than for arts and humanities research, which does not receive much income and where books take longer to have an impact.
>
> Hobson says: "Metrics is helpful in giving a kind of overview measured in terms of items. A bit like a waistline measurement. It doesn't give much of an idea of whether they are slim or fat, unless they are at the extreme ends of the spectrum.
>
> "Wittgenstein at his death had one book and one article published. Another book was on the way, but unfinished. Heaven knows what would have happened to him in today's academia."
>
