LIS-BIBLIOMETRICS Archives

LIS-BIBLIOMETRICS@JISCMAIL.AC.UK
LIS-BIBLIOMETRICS May 2019

Subject: Re: Consultation on criteria for fair & responsible university rankings
From: "Isidro F. Aguillo" <[log in to unmask]>
Reply-To: A bibliometrics discussion list for the Library and Research Community <[log in to unmask]>
Date: Sat, 25 May 2019 09:58:45 +0200
Content-Type: text/plain
Parts/Attachments: text/plain (427 lines)

Thanks for the prompt reply. My comments follow:

Elizabeth Gadd <[log in to unmask]> wrote:

> Many thanks for your very useful engagement Isidro. I have taken a
> number of things from your comments around:
>
>
> 1. Where we need to explain better what our criteria mean;

My major concern is not about further explanation, but about the
model you intend to adopt as a standard. One of the keys to
understanding the success of the rankings is simplicity. I strongly
support some of the criteria you propose, but not under the current
understanding of what a ranking is. To give an example: the ARWU
global ranking is a ranking of universities; the ARWU subject
rankings are a completely different product.

> 2. Where your particular ranking feels it already meets our criteria;

I think it meets most of them, excluding mainly those that require
more complexity and multi-ranking.

> 3. Some assumptions you appear to be making ('Big is beautiful'
> and 'Web is global');

Yes, both are personal positions. (1) I prefer to rank countries by
GDP rather than GDP per capita, and the same applies to
universities. (2) Regarding discrimination for any reason, the Web is
probably the most neutral proxy.


> 4. Some challenges we will face when seeking to measure against
> these criteria ('Measure against mission' but which mission?)

All the missions. Again, you are asking for a different model. I
strongly support the development of a large, reliable university
information system with rich, updated profiles, but that is not a
ranking.

> You make the point that 'we are scientists' and should therefore be
> trusted to rank fairly, but the use of rankings data by
> bibliometrics scholars that you highlight does rather challenge that
> point. Most rankings have serious validity problems, and yet are
> being used as a proxy for university 'excellence', even by scholars
> who should know better.

This should be the focus of the debate. It is not about setting up
standards for rankings 2.0, but about expelling the use of certain
rankings from scientific publication.


> I'm aware our work will not be universally popular, but some
> rankings have become very powerful on very little methodological
> foundation.

As I just mentioned, the priority is then not to promote new rules
that can be "trickily" adopted, but to expel them.

> I believe universities have to reflect this back to them, and this is
> an opportunity for them to do so.


I agree with the final aim, not with the proposed path to achieve it.

All best,

>
> All best
> Lizzie
>
>
> From: A bibliometrics discussion list for the Library and Research
> Community <[log in to unmask]> On Behalf Of Isidro F.
> Aguillo
> Sent: 24 May 2019 15:03
> To: [log in to unmask]
> Subject: Re: Consultation on criteria for fair & responsible
> university rankings
>
> Dear colleagues,
>
>
> In recent years, the number of attacks on the rankings by the
> bibliometric community has increased a lot. That is a bit
> surprising, as there are now hundreds of bibliometric papers using
> data from what are usually described as "prestigious" rankings.
>
>
> I think that part of that response is due to certain very
> questionable practices of some ranking editors. Therefore, the
> preparation of this series of criteria is an important step forward,
> but I think they have important biases due to the inclusion of
> personal interests and the ignorance of some other rankings that I
> believe are treated unfairly.
>
>
> Since I believe in transparency, I would like to accept the
> challenge of this group and as editor of the Ranking Web
> (webometrics) of Universities that I have published since 2004, I
> have allowed myself to answer briefly each of these criteria with my
> position regarding each one of them.
>
>
> In the Excel file, after our input, you can add your own comments.
> Or make the comments openly on the list.
>
>
> Your turn,
>
> On 22/05/2019 at 14:32, Elizabeth Gadd wrote:
> Dear Lis-Bibliometrics colleagues,
>
> I'm delighted to announce that the Research Evaluation Working Group
> have now developed a draft set of criteria for fair and responsible
> university rankings and would welcome your feedback.
>
> This work is the first stage of the Rankings Sub-Group's efforts to
> develop a means of rating university rankers to highlight their
> current deficiencies and mitigate their negative impact on
> university behaviours.
>
> The criteria are available from the INORMS REWG web page at
> https://inorms.net/activities/research-evaluation-working-group/ and
> also included in this email below. We welcome discussion on this
> list or directly to the Rankings Sub-Group leader, Justin Shearer,
> on [log in to unmask]<mailto:[log in to unmask]>
>
> Best regards
> Lizzie
>
>
> [cid:image001.png@01D5124C.3A81D7A0]
> Research Evaluation Working Group
>
> What makes a fair and responsible university ranking?
> Draft criteria for comment
>
> Introduction
> The International Network of Research Management Societies (INORMS)
> established a two-year Research Evaluation Working Group (REWG) in
> 2018. It consists of representatives from a range of global member
> research management societies all seeking to work towards better,
> fairer and more meaningful research evaluation. One of the group's
> two areas of focus is the burgeoning influence of University
> Rankings on the behaviours of universities despite often poor
> methodological approaches and practices. The purpose of this
> work-package is to consider what we, as an international group of
> research managers, think the characteristics of a fair and
> responsible University Ranking should look like. The idea is to then
> 'turn the tables' on the rankings and rate them against our agreed
> criteria. We are now seeking feedback on our draft list of
> characteristics, particularly around:
>
> 1) Do the characteristics, as written, make sense to you?
> 2) Are there any characteristics you think are missing?
> 3) Which characteristics do you think are priorities, and which are not?
>
> Please note that at this stage, we are not considering how these
> characteristics might be assessed, only whether they are desirable.
> The references in brackets lead to texts that inspired these
> principles; they are not direct quotations.
>
> The consultation is open until Monday 10 June and feedback can be
> emailed to either
> [log in to unmask]<mailto:[log in to unmask]> (if
> you are a member) or directly to the Rankings Sub-Group leader,
> Justin Shearer, on
> [log in to unmask]<mailto:[log in to unmask]>
>
> We look forward to hearing from you!
> Lizzie Gadd, INORMS REWG Chair
> _______________________________________________________________________________________________
> 1. General approach
>
> * Profiles not rankings. Accepts that higher education and
> research organisations are complex, multi-faceted entities and
> provides a facility by which their range of strengths can be
> displayed. (BP)
> * Measure against mission. Accepts that different universities
> have different missions and provides a facility by which
> universities can be assessed against their own goals. (LM, BP,
> Blank, Shen)
> * One thing at a time. Does not combine indicators to create a
> composite metric. (YG1, CWTS)
> * Provides context. Provides a link out to further qualitative
> and contextual information about the university being ranked (LM).
> * Damage limitation activity. Recognises and proactively seeks
> to limit the systemic effects of rankings. (LM, Adam)
> * No unfair advantage. Makes every effort to ensure the approach
> taken does not discriminate against organisations by size,
> disciplinary mix, language, wealth, age and geography.
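[As an aside on the "One thing at a time" criterion: the objection to composite metrics can be illustrated with a minimal sketch. All university names, indicator scores, and weights below are invented for illustration; no real ranking's methodology is implied.]

```python
# Illustration: the rank order produced by a composite metric depends
# entirely on the (arbitrary) weights chosen for each indicator.

def composite_rank(scores, weights):
    """Rank institutions by a weighted sum of normalised indicator scores."""
    totals = {
        name: sum(w * s for w, s in zip(weights, vals))
        for name, vals in scores.items()
    }
    return sorted(totals, key=totals.get, reverse=True)

# Hypothetical scores (0-100) on two indicators: research output, teaching.
scores = {
    "Univ A": (90, 40),
    "Univ B": (50, 85),
}

print(composite_rank(scores, (0.7, 0.3)))  # research-heavy: ['Univ A', 'Univ B']
print(composite_rank(scores, (0.3, 0.7)))  # teaching-heavy: ['Univ B', 'Univ A']
```

The same two institutions swap places purely because of the weighting choice, which is why reporting each indicator separately is arguably more honest.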
>
> 2. Governance
>
> * Transparent aims. States clearly the purpose of the ranking
> and its target groups. (BP)
> * Engage with the ranked. Has a clear mechanism for engaging
> with both the academic faculty at ranked institutions and their
> senior managers, for example, through an independent international
> academic advisory board. (BP, Bilder et al)
> * Self-improving. Regularly applies measures of quality
> assurance to its ranking processes. (BP)
> * No commercial gain. Does not seek to exploit its ranking for
> financial gain by, for example, offering consultancy services or
> selling the underlying data.
> * Manages any conflicts of interest. Where conflicts of
> interest may arise, makes every effort to manage them.
>
>
>
> 3. Methodology
>
> * Transparent methodology. Publishes full details of their
> ranking methodology, including detailed descriptions of the data
> sources being used, so that given the data a third party could
> replicate the results. (CWTS, DORA, BP)
> * Open and transparent data availability. Makes all data on
> which the ranking is based available openly so that those being
> evaluated can verify the data and analysis. (LM, DORA, BP)
> * Rigorous methodology. Data collection and analysis methods
> should pass tests of scientific rigour, including sample size,
> representation, normalisation, handling of outliers, etc. (BP)
> * No sloppy surveys. Uses opinion surveys sparingly, if at all,
> and ensures that, where they are used, the methodology is sound
> and unbiased.
> * Open to correction. Data and indicators should be made
> available in a way that allows errors and faults to be corrected.
> Any adjustments made to the original data and indicators should
> be clearly indicated. (BP)
> * Deals with gaming. Has a published statement about what
> constitutes inappropriate manipulation of data submitted for ranking
> and what measures will be taken to combat this. (DORA)
> * Defines "University". When using multiple data sources to take
> measurements, uses a consistent definition of university across the
> different data sources. (E.g., universities with multiple campuses
> or teaching hospitals) (CWTS)
> * Outcomes over inputs. Measures of performance should be
> weighted towards outcomes rather than inputs. (BP)
>
>
>
> 4. Indicators
>
> * Validity. Indicators have a clear relationship with the
> characteristic they claim to measure. For example, teaching quality
> should not be indicated by staff-student ratios. (BP, YG2)
> * Sensitivity. Indicators are sensitive to the nature of the
> characteristic they claim to measure. (YG1)
> * Monotonicity. Uses only indicators with a monotonic
> relationship between the variable being measured and the value it
> has; does not use indicators where the 'best' score varies
> according to the mission of an institution. For example, with
> staff-student ratios, neither a ratio of 1:1 nor 1:1000 is
> desirable, and the 'best' ratio will depend on the ambitions of
> the university. (YG1)
> * Size-independence. Indicators should not favour universities
> purely based on their size, whether large or small. (CWTS)
> * Field-normalised. Indicators should normalise for disciplinary
> differences in the variable being measured. (LM)
> * Geographical equality. Indicators should not introduce
> regional disadvantages. For example, the use of bibliographic
> databases that do not have global representation. (LM, Maheu &
> Lacroix).
> * Honest about uncertainty. The statistical uncertainty of the
> data being presented should be clearly indicated using error bars,
> confidence intervals or other techniques. (LM)
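[As an aside on the "Honest about uncertainty" criterion: one standard way to make the uncertainty behind a score explicit is a percentile bootstrap confidence interval. This is a generic sketch, not any ranker's actual method; the per-paper citation counts are invented.]

```python
# Sketch: a percentile bootstrap confidence interval for a university's
# mean citation score, making the statistical uncertainty explicit.
import random

def bootstrap_ci(values, n_resamples=2000, alpha=0.05, seed=42):
    """Percentile bootstrap CI for the mean of `values`."""
    rng = random.Random(seed)  # fixed seed for reproducibility
    means = sorted(
        sum(rng.choices(values, k=len(values))) / len(values)
        for _ in range(n_resamples)
    )
    lo = means[int((alpha / 2) * n_resamples)]
    hi = means[int((1 - alpha / 2) * n_resamples) - 1]
    return lo, hi

citations = [0, 2, 3, 5, 8, 12, 15, 21, 34, 60]  # invented per-paper counts
low, high = bootstrap_ci(citations)
print(f"mean = {sum(citations) / len(citations):.1f}, 95% CI = [{low:.1f}, {high:.1f}]")
```

With a sample this small and skewed, the interval is wide, which is exactly the information a rank position without error bars hides.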
>
>
> 5. Usability
>
> * Easy to use. The data presented is clearly labelled, easy to
> access, interpret, export and use.
>
> * Tailored to different audiences. The ranking provides
> different windows onto the data that may be relevant to different
> audiences. For example, by providing an opportunity to focus in on
> teaching elements for students.
> * Minimise workload for the ranked. The ranking requires minimal
> input from organisations being ranked so smaller and less wealthy
> institutions are not disadvantaged.
>
>
>
>
>
> References
>
>
>
> LM = Leiden Manifesto, http://www.leidenmanifesto.org/ (2016)

> YG1 = Gingras, Yves (2014). Bibliometrics and Research Evaluation:
> Uses and Abuses. Cambridge, Mass.: MIT Press.

> BP = Berlin Principles on Ranking of HE Institutions,
> https://www.che.de/downloads/Berlin_Principles_IREG_534.pdf (2006)

> CWTS = CWTS 10 Principles for Responsible Use of University Rankings,
> http://www.leidenranking.com/information/responsibleuse (2017)

> DORA = Declaration on Research Assessment, https://sfdora.org/read/ (2013)

> Bilder, Lin & Neylon, Principles for Open Scholarly Infrastructures,
> https://figshare.com/articles/Principles_for_Open_Scholarly_Infrastructures_v1/1314859 (2015)

> Adam, Edmund, New ranking to look at universities' contributions to
> sustainable development,
> https://www.universityaffairs.ca/opinion/in-my-opinion/new-ranking-to-look-at-universities-contributions-to-sustainable-development/ (February 2019)

> Blank, Kim, University rankings: The Emperor has at least some clothes,
> https://www.universityaffairs.ca/opinion/in-my-opinion/university-rankings-emperor-least-clothes/ (November 2016)

> YG2 = Gingras, Yves, Academic rankings: The university's new clothes?,
> https://www.universityaffairs.ca/opinion/in-my-opinion/academic-rankings-universitys-new-clothes/ (November 2016)

> Shen, Anqi, Consultations are underway for a 'made-in-Canada' Athena
> SWAN program,
> https://www.universityaffairs.ca/news/news-article/consultations-are-underway-for-a-made-in-canada-athena-swan-program/ (October 2018)

> Maheu, Louis & Robert Lacroix, The university rankings roller coaster,
> https://www.universityaffairs.ca/opinion/in-my-opinion/university-rankings-roller-coaster/ (February 2015)

> JW = Wilsdon, James (2019). Deliver us from rankers.
> https://wonkhe.com/blogs/deliver-us-from-rankers/
>
>
>
> Dr Elizabeth Gadd FHEA MCLIP
> Research Policy Manager (Publications)
> Loughborough University
> Loughborough, Leics, UK, LE11 3TU
>
> Chair, INORMS Research Evaluation Working
> Group<https://inorms.net/activities/research-evaluation-working-group/>
> Chair,
> Lis-Bibliometrics<https://www.jiscmail.ac.uk/cgi-bin/webadmin?A0=LIS-BIBLIOMETRICS>
> Champion, ARMA Research Evaluation
> SIG<https://arma.ac.uk/special-interest-groups/>
>
> Working hours: M: 8.30-5/ Tu: 8.30-3/ W: 8.30-3/ F: 8.30-3
>
> Phone: +44 (0)1509228594
> Twitter: @lizziegadd
> Skype: lizziegadd
> Web: https://about.me/elizabeth.gadd
> ORCID: https://orcid.org/0000-0003-4509-7785
> ImpactStory: https://profiles.impactstory.org/u/0000-0003-4509-7785
>
>
> ________________________________
>
> To unsubscribe from the LIS-BIBLIOMETRICS list, click the following link:
> https://www.jiscmail.ac.uk/cgi-bin/webadmin?SUBED1=LIS-BIBLIOMETRICS&A=1
>
>
>
> --
>
> **************************************************************
>
> Isidro F. Aguillo
>
> Dr. Honoris Causa Universitas Indonesia
>
> Dr. Honoris Causa National Research Nuclear University Moscow
>
> Editor Rankings Web
>
> Cybermetrics Lab - Scimago Group, IPP-CSIC
>
> Madrid. SPAIN
>
>
>
> [log in to unmask]<mailto:[log in to unmask]>
>
> ORCID 0000-0001-8927-4873
>
> ResearcherID: A-7280-2008
>
> Scholar Citations SaCSbeoAAAAJ
>
> Twitter @isidroaguillo
>
> Rankings webometrics.info
>
> ***************************************************************
>
>


--
Isidro F. Aguillo
Editor Rankings Web
Dr. Honoris Causa Universitas Indonesia
Dr. Honoris Causa National Nuclear Research University Moscow
Cybermetrics Lab - Scimago Group, IPP.
Spanish National Research Council CSIC
Albasanz, 26-28, Despacho 3E14; 28037 Madrid. SPAIN
isidro.aguillo@csic.es
www.webometrics.info

