LIS-BIBLIOMETRICS Archives
LIS-BIBLIOMETRICS@JISCMAIL.AC.UK



Subject:

Re: Consultation on criteria for fair & responsible university rankings

From:

"Charles Oppenheim (vpr)" <[log in to unmask]>

Reply-To:

A bibliometrics discussion list for the Library and Research Community <[log in to unmask]>

Date:

Wed, 22 May 2019 12:34:16 +0000

Content-Type:

multipart/related

Parts/Attachments:

text/plain (1 line), image001.png (1 line)

I've already responded to Justin on this excellent document.





Charles





Professor Charles Oppenheim

________________________________

From: A bibliometrics discussion list for the Library and Research Community <[log in to unmask]> on behalf of Elizabeth Gadd <[log in to unmask]>

Sent: 22 May 2019 13:32

To: [log in to unmask]

Subject: Consultation on criteria for fair & responsible university rankings



Dear Lis-Bibliometrics colleagues,



I’m delighted to announce that the Research Evaluation Working Group has now developed a draft set of criteria for fair and responsible university rankings and would welcome your feedback.



This work is the first stage of the Rankings Sub-Group’s efforts to develop a means of rating university rankers to highlight their current deficiencies and mitigate their negative impact on university behaviours.



The criteria are available from the INORMS REWG web page at https://inorms.net/activities/research-evaluation-working-group/ and are also included below. We welcome discussion on this list, or directly to the Rankings Sub-Group leader, Justin Shearer, at [log in to unmask]<mailto:[log in to unmask]>



Best regards

Lizzie






Research Evaluation Working Group



What makes a fair and responsible university ranking?

Draft criteria for comment



Introduction

The International Network of Research Management Societies (INORMS) established a two-year Research Evaluation Working Group (REWG) in 2018. It consists of representatives from a range of global member research management societies, all seeking to work towards better, fairer and more meaningful research evaluation. One of the group’s two areas of focus is the burgeoning influence of university rankings on the behaviours of universities, despite often poor methodological approaches and practices. The purpose of this work package is to consider what we, as an international group of research managers, think the characteristics of a fair and responsible university ranking should look like. The idea is then to ‘turn the tables’ on the rankings and rate them against our agreed criteria. We are now seeking feedback on our draft list of characteristics, particularly around:



1) Do the characteristics, as written, make sense to you?

2) Are there any characteristics you think are missing?

3) Which characteristics do you consider priority, and which non-priority?



Please note that at this stage, we are not considering how these characteristics might be assessed, only whether they are desirable. The references in brackets lead to texts that inspired these principles; they are not direct quotations.



The consultation is open until Monday 10 June and feedback can be emailed to either [log in to unmask]<mailto:[log in to unmask]> (if you are a member) or directly to the Rankings Sub-Group leader, Justin Shearer, on [log in to unmask]<mailto:[log in to unmask]>



We look forward to hearing from you!

Lizzie Gadd, INORMS REWG Chair

_______________________________________________________________________________________________

1. General approach



  *   Profiles not rankings. Accepts that higher education and research organisations are complex, multi-faceted entities and provides a facility by which their range of strengths can be displayed. (BP)

  *   Measure against mission. Accepts that different universities have different missions and provides a facility by which universities can be assessed against their own goals. (LM, BP, Blank, Shen)

  *   One thing at a time. Does not combine indicators to create a composite metric. (YG1, CWTS)

  *   Provides context. Provides a link out to further qualitative and contextual information about the university being ranked. (LM)

  *   Damage limitation activity. Recognises and proactively seeks to limit the systemic effects of rankings. (LM, Adam)

  *   No unfair advantage. Makes every effort to ensure the approach taken does not discriminate against organisations by size, disciplinary mix, language, wealth, age and geography.



2. Governance



  *   Transparent aims. States clearly the purpose of the ranking and its target groups. (BP)

  *   Engage with the ranked.  Has a clear mechanism for engaging with both the academic faculty at ranked institutions and their senior managers, for example, through an independent international academic advisory board. (BP, Bilder et al)

  *   Self-improving. Regularly applies measures of quality assurance to its ranking processes. (BP)

  *   No commercial gain. Does not seek to exploit the ranking for financial gain by, for example, offering consultancy services or selling the underlying data.

  *   Manage any conflicts of interest. Where conflicts of interest may arise, makes every effort to manage them.







3. Methodology



  *   Transparent methodology. Publishes full details of the ranking methodology, including detailed descriptions of the data sources used, so that, given the data, a third party could replicate the results. (CWTS, DORA, BP)

  *   Open and transparent data availability. Makes all data on which the ranking is based available openly so that those being evaluated can verify the data and analysis. (LM, DORA, BP)

  *   Rigorous methodology. Data collection and analysis methods should pass tests of scientific rigour, including sample size, representation, normalisation, handling of outliers, etc. (BP)

  *   No sloppy surveys. Uses opinion surveys sparingly, if at all, and ensures that, where they are used, the methodology is sound and unbiased.

  *   Open to correction. Data and indicators should be made available in such a way that errors and faults can be corrected. Any adjustments made to the original data and indicators should be clearly indicated. (BP)

  *   Deals with gaming. Has a published statement about what constitutes inappropriate manipulation of data submitted for ranking and what measures will be taken to combat this. (DORA)

  *   Defines “university”. When using multiple data sources to take measurements, uses a consistent definition of a university across the different data sources (e.g., for universities with multiple campuses or teaching hospitals). (CWTS)

  *   Outcomes over inputs. Measures of performance should be weighted towards outcomes rather than inputs. (BP)







4. Indicators



  *   Validity. Indicators have a clear relationship with the characteristic they claim to measure.  For example, teaching quality should not be indicated by staff-student ratios. (BP, YG2)

  *   Sensitivity. Indicators are sensitive to the nature of the characteristic they claim to measure. (YG1)

  *   Monotonicity. Does not use indicators where the ‘best’ score varies according to the mission of the institution; there should be a monotonic relationship between the variable being measured and how desirable its value is. For example, with staff-student ratios, neither 1:1 nor 1:1000 is desirable, and the ‘best’ ratio will depend on the ambitions of the university. (YG1)

  *   Size-independence. Indicators should not favour universities purely on the basis of their size, whether large or small. (CWTS)

  *   Field-normalised. Indicators should normalise for disciplinary differences in the variable being measured. (LM)

  *   Geographical equality. Indicators should not introduce regional disadvantages, for example through the use of bibliographic databases that do not have global representation. (LM, Maheu & Lacroix)

  *   Honest about uncertainty.  The statistical uncertainty of the data being presented should be clearly indicated using error bars, confidence intervals or other techniques. (LM)





5. Usability



  *   Easy to use. The data presented is clearly labelled and easy to access, interpret, export and use.



  *   Tailored to different audiences. The ranking provides different windows onto the data that may be relevant to different audiences, for example by providing an opportunity for students to focus on teaching elements.

  *   Minimises workload for the ranked. The ranking requires minimal input from the organisations being ranked, so that smaller and less wealthy institutions are not disadvantaged.











References







LM = Leiden Manifesto, http://www.leidenmanifesto.org/ (2016)

YG1 = Gingras, Yves, Bibliometrics and Research Evaluation: Uses and Abuses, Cambridge, Mass.: MIT Press (2014)

BP = Berlin Principles on Ranking of Higher Education Institutions, https://www.che.de/downloads/Berlin_Principles_IREG_534.pdf (2006)

CWTS = CWTS Ten Principles for the Responsible Use of University Rankings, http://www.leidenranking.com/information/responsibleuse (2017)

DORA = Declaration on Research Assessment, https://sfdora.org/read/ (2013)

Bilder, Lin & Neylon, Principles for Open Scholarly Infrastructures, https://figshare.com/articles/Principles_for_Open_Scholarly_Infrastructures_v1/1314859 (2015)

Adam, Edmund, New ranking to look at universities’ contributions to sustainable development, https://www.universityaffairs.ca/opinion/in-my-opinion/new-ranking-to-look-at-universities-contributions-to-sustainable-development/ (February 2019)

Blank, Kim, University rankings: The Emperor has at least some clothes, https://www.universityaffairs.ca/opinion/in-my-opinion/university-rankings-emperor-least-clothes/ (November 2016)

YG2 = Gingras, Yves, Academic rankings: The university’s new clothes?, https://www.universityaffairs.ca/opinion/in-my-opinion/academic-rankings-universitys-new-clothes/ (November 2016)

Shen, Anqi, Consultations are underway for a ‘made-in-Canada’ Athena SWAN program, https://www.universityaffairs.ca/news/news-article/consultations-are-underway-for-a-made-in-canada-athena-swan-program/ (October 2018)

Maheu, Louis & Robert Lacroix, The university rankings roller coaster, https://www.universityaffairs.ca/opinion/in-my-opinion/university-rankings-roller-coaster/ (February 2015)

JW = Wilsdon, James, Deliver us from rankers, https://wonkhe.com/blogs/deliver-us-from-rankers/ (2019)







Dr Elizabeth Gadd FHEA MCLIP

Research Policy Manager (Publications)

Loughborough University

Loughborough, Leics, UK, LE11 3TU



Chair, INORMS Research Evaluation Working Group<https://inorms.net/activities/research-evaluation-working-group/>

Chair, Lis-Bibliometrics<https://www.jiscmail.ac.uk/cgi-bin/webadmin?A0=LIS-BIBLIOMETRICS>

Champion, ARMA Research Evaluation SIG<https://arma.ac.uk/special-interest-groups/>



Working hours: M: 8.30-5/ Tu: 8.30-3/ W: 8.30-3/ F: 8.30-3



Phone: +44 (0)1509228594

Twitter: @lizziegadd

Skype: lizziegadd

Web: https://about.me/elizabeth.gadd

ORCID: https://orcid.org/0000-0003-4509-7785

ImpactStory: https://profiles.impactstory.org/u/0000-0003-4509-7785





________________________________



To unsubscribe from the LIS-BIBLIOMETRICS list, click the following link:

https://www.jiscmail.ac.uk/cgi-bin/webadmin?SUBED1=LIS-BIBLIOMETRICS&A=1



________________________________




