I've already responded to Justin on this excellent document
Charles
Professor Charles Oppenheim
________________________________
From: A bibliometrics discussion list for the Library and Research Community <[log in to unmask]> on behalf of Elizabeth Gadd <[log in to unmask]>
Sent: 22 May 2019 13:32
To: [log in to unmask]
Subject: Consultation on criteria for fair & responsible university rankings
Dear Lis-Bibliometrics colleagues,
I’m delighted to announce that the Research Evaluation Working Group has now developed a draft set of criteria for fair and responsible university rankings and would welcome your feedback.
This work is the first stage of the Rankings Sub-Group’s efforts to develop a means of rating university rankers to highlight their current deficiencies and mitigate their negative impact on university behaviours.
The criteria are available from the INORMS REWG web page at https://inorms.net/activities/research-evaluation-working-group/ and are also included below. We welcome discussion on this list or directly to the Rankings Sub-Group leader, Justin Shearer, on [log in to unmask].
Best regards
Lizzie
Research Evaluation Working Group
What makes a fair and responsible university ranking?
Draft criteria for comment
Introduction
The International Network of Research Management Societies (INORMS) established a two-year Research Evaluation Working Group (REWG) in 2018. It consists of representatives from a range of global member research management societies, all seeking to work towards better, fairer and more meaningful research evaluation. One of the group’s two areas of focus is the burgeoning influence of university rankings on the behaviours of universities, despite their often poor methodological approaches and practices. The purpose of this work package is to consider what we, as an international group of research managers, think the characteristics of a fair and responsible university ranking should be. The idea is then to ‘turn the tables’ on the rankings and rate them against our agreed criteria. We are now seeking feedback on our draft list of characteristics, particularly on:
1) Do the characteristics, as written, make sense to you?
2) Are there any characteristics you think are missing?
3) Which characteristics do you think should be priorities, and which not?
Please note that at this stage we are not considering how these characteristics might be assessed, only whether they are desirable. The references in brackets lead to texts that inspired these principles; they are not direct quotations.
The consultation is open until Monday 10 June. Feedback can be emailed either to [log in to unmask] (if you are a member) or directly to the Rankings Sub-Group leader, Justin Shearer, on [log in to unmask].
We look forward to hearing from you!
Lizzie Gadd, INORMS REWG Chair
_______________________________________________________________________________________________
1. General approach
* Profiles not rankings. Accepts that higher education and research organisations are complex, multi-faceted entities and provides a facility by which their range of strengths can be displayed. (BP)
* Measure against mission. Accepts that different universities have different missions and provides a facility by which universities can be assessed against their own goals. (LM, BP, Blank, Shen)
* One thing at a time. Does not combine indicators to create a composite metric. (YG1, CWTS)
* Provides context. Provides a link out to further qualitative and contextual information about the university being ranked (LM).
* Damage limitation activity. Recognises and proactively seeks to limit the systemic effects of rankings. (LM, Adam)
* No unfair advantage. Makes every effort to ensure the approach taken does not discriminate against organisations by size, disciplinary mix, language, wealth, age and geography.
2. Governance
* Transparent aims. States clearly the purpose of the ranking and its target groups. (BP)
* Engage with the ranked. Has a clear mechanism for engaging with both the academic faculty at ranked institutions and their senior managers, for example, through an independent international academic advisory board. (BP, Bilder et al)
* Self-improving. Regularly applies quality assurance measures to its ranking processes. (BP)
* No commercial gain. Does not seek to exploit its ranking for financial gain by, for example, offering consultancy services or selling the underlying data.
* Manage any conflicts of interest. Where conflicts of interest may arise, makes every effort to manage them.
3. Methodology
* Transparent methodology. Publishes full details of the ranking methodology, including detailed descriptions of the data sources used, so that, given the data, a third party could replicate the results. (CWTS, DORA, BP)
* Open and transparent data availability. Makes all data on which the ranking is based available openly so that those being evaluated can verify the data and analysis. (LM, DORA, BP)
* Rigorous methodology. Data collection and analysis methods should pass tests of scientific rigour, including sample size, representation, normalisation, handling of outliers, etc. (BP)
* No sloppy surveys. Uses opinion surveys sparingly, if at all, and ensures that, where they are used, the methodology is sound and unbiased.
* Open to correction. Data and indicators should be made available in such a way that errors and faults can be corrected. Any adjustments made to the original data and indicators should be clearly indicated. (BP)
* Deals with gaming. Has a published statement about what constitutes inappropriate manipulation of data submitted for ranking and what measures will be taken to combat this. (DORA)
* Defines “University”. When using multiple data sources to take measurements, uses a consistent definition of a university across the different data sources (e.g., for universities with multiple campuses or teaching hospitals). (CWTS)
* Outcomes over inputs. Measures of performance should be weighted towards outcomes rather than inputs. (BP)
4. Indicators
* Validity. Indicators have a clear relationship with the characteristic they claim to measure. For example, teaching quality should not be indicated by staff-student ratios. (BP, YG2)
* Sensitivity. Indicators are sensitive to the nature of the characteristic they claim to measure. (YG1)
* Monotonicity. Uses only indicators with a monotonic relationship between the variable being measured and the value assigned to it; does not use indicators where the ‘best’ score will vary according to the mission of an institution. For example, with staff-student ratios neither a ratio of 1:1 nor of 1:1000 is desirable, and the ‘best’ ratio will depend on the ambitions of the university. (YG1)
* Size-independence. Indicators should not favour universities purely on the basis of their size – either large or small. (CWTS)
* Field-normalised. Indicators should normalise for disciplinary differences in the variable being measured. (LM)
* Geographical equality. Indicators should not introduce regional disadvantages, for example, through the use of bibliographic databases that do not have global coverage. (LM, Maheu & Lacroix)
* Honest about uncertainty. The statistical uncertainty of the data being presented should be clearly indicated using error bars, confidence intervals or other techniques. (LM)
5. Usability
* Easy to use. The data presented is clearly labelled, easy to access, interpret, export and use.
* Tailored to different audiences. The ranking provides different windows onto the data that may be relevant to different audiences, for example, by providing an opportunity to focus on teaching elements for prospective students.
* Minimise workload for the ranked. The ranking requires minimal input from the organisations being ranked, so that smaller and less wealthy institutions are not disadvantaged.
References
LM = Leiden Manifesto http://www.leidenmanifesto.org/ (2015)
YG1 = Gingras, Yves. (2016). Bibliometrics and Research Evaluation: Uses and Abuses. Cambridge, Mass.: MIT Press.
BP = Berlin Principles on Ranking of HE Institutions https://www.che.de/downloads/Berlin_Principles_IREG_534.pdf (2006)
CWTS = CWTS 10 Principles for Responsible Use of University Rankings http://www.leidenranking.com/information/responsibleuse (2017)
DORA = Declaration on Research Assessment https://sfdora.org/read/ (2013)
Bilder, Lin & Neylon, Principles for Open Scholarly Infrastructures https://figshare.com/articles/Principles_for_Open_Scholarly_Infrastructures_v1/1314859 (2015)
Adam, Edmund, New ranking to look at universities’ contributions to sustainable development https://www.universityaffairs.ca/opinion/in-my-opinion/new-ranking-to-look-at-universities-contributions-to-sustainable-development/ (February 2019)
Blank, Kim, University rankings: The Emperor has at least some clothes https://www.universityaffairs.ca/opinion/in-my-opinion/university-rankings-emperor-least-clothes/ (November 2016)
YG2 = Gingras, Yves, Academic rankings: The university’s new clothes? https://www.universityaffairs.ca/opinion/in-my-opinion/academic-rankings-universitys-new-clothes/ (November 2016)
Shen, Anqi, Consultations are underway for a ‘made-in-Canada’ Athena SWAN program https://www.universityaffairs.ca/news/news-article/consultations-are-underway-for-a-made-in-canada-athena-swan-program/ (October 2018)
Maheu, Louis & Lacroix, Robert, The university rankings roller coaster https://www.universityaffairs.ca/opinion/in-my-opinion/university-rankings-roller-coaster/ (February 2015)
JW = Wilsdon, James. (2019). Deliver us from rankers. https://wonkhe.com/blogs/deliver-us-from-rankers/
Dr Elizabeth Gadd FHEA MCLIP
Research Policy Manager (Publications)
Loughborough University
Loughborough, Leics, UK, LE11 3TU
Chair, INORMS Research Evaluation Working Group<https://inorms.net/activities/research-evaluation-working-group/>
Chair, Lis-Bibliometrics<https://www.jiscmail.ac.uk/cgi-bin/webadmin?A0=LIS-BIBLIOMETRICS>
Champion, ARMA Research Evaluation SIG<https://arma.ac.uk/special-interest-groups/>
Working hours: M: 8.30-5/ Tu: 8.30-3/ W: 8.30-3/ F: 8.30-3
Phone: +44 (0)1509228594
Twitter: @lizziegadd
Skype: lizziegadd
Web: https://about.me/elizabeth.gadd
ORCID: https://orcid.org/0000-0003-4509-7785
ImpactStory: https://profiles.impactstory.org/u/0000-0003-4509-7785
________________________________
To unsubscribe from the LIS-BIBLIOMETRICS list, click the following link:
https://www.jiscmail.ac.uk/cgi-bin/webadmin?SUBED1=LIS-BIBLIOMETRICS&A=1
________________________________