International Rankings of Academic Institutions—Abstract


Review | April 20th 2021
Written by: Dr. Eliran Zered | Approved by: Yuval Vurgan

Abstract

This document was written for the Knesset Education, Culture and Sports Committee following a discussion held in June 2020 on the place of Israeli universities in international rankings. The document will present a general background on the subject and will describe five major international rankings: the Academic Ranking of World Universities (ARWU), also known as the Shanghai Ranking; the Times Higher Education World University Rankings; the QS World University Rankings; the CWTS Leiden Ranking; and the SCImago Institutions Rankings. It will also present criticisms that have been expressed regarding the academic rankings, in terms of both methodology and, more fundamentally, their effects on the world of higher education. It will then review the placement of the Israeli universities in the five rankings over the years and present the positions of the Council for Higher Education (CHE) and of some of the universities themselves regarding the international rankings. At the committee's request, we have also included an appendix with economic data on the investment in higher education in Israel as compared to the OECD countries, as background for the discussion on the state of higher education in Israel.

National rankings and comparisons of academic institutions were already common in the early twentieth century, but the first ranking to compare institutions at the international level, the Chinese Shanghai Ranking, initially appeared in 2003. Other rankings appeared later, chief among them—alongside the Shanghai Ranking—the British Times ranking and the QS ranking, which first appeared in 2004. Each ranking has its own approach and research methods for examining the qualities of institutions of higher education around the world.
The rankings examined in this document generally emphasize the research aspect of universities. The Leiden Ranking is based almost exclusively on bibliometric indicators, the Shanghai Ranking emphasizes various aspects of research excellence, and research accounts for at least half of the score in the Times and SCImago rankings. The Times ranking also emphasizes the quality of teaching, and the SCImago ranking also accounts for popularity and networking. The QS rankings place their main emphasis on reputation surveys among academics and employers.

Research & Information Center | www.knesset.gov.il/mmm

Criticisms of the rankings versus the benefits attributed to them

As the rankings gained in popularity, influence, and use by students, universities, investors, and educational policy professionals, criticisms began to surface questioning the validity of the rankings and their ability to serve as a tool for decision-making. The criticisms can be divided into two types:

Methodological criticisms—Many doubts have been raised regarding the validity of the various parameters that the rankings examine; this is especially true of the subjective components and surveys, but it also applies to the seemingly objective parameters related to scientific achievement. There are even questions regarding how to measure bibliometric data on the number of publications and citations, such as whether it is possible to compare fields with different levels and methods of publication (articles, conferences, books) and whether these data reflect all of an institution's work or even its academic quality.

A fundamental criticism of the rankings' influence on the higher education system—This criticism states, among other things, that the rankings impose a uniform set of parameters on institutions of higher education around the world, thereby reducing the institutions' varied, complex, and context-dependent aims to a superficial matter.
It has also been suggested that there is very little mobility at the top of the rankings and that the rankings highlight 1% of the institutions of higher education in the world. According to this criticism, it is generally the same wealthy, large, veteran, English-speaking institutions that lead the lists, with the rankings reinforcing this reality by increasing the prestige of these institutions and thereby allowing them to raise more money and attract the most accomplished students and staff. The other institutions are forced to adjust to the same standards and shift resources to meet them at the expense of other goals, even though their chances of actually competing with the leading institutions are extremely low.

Despite their harsh critiques, even the critics note the importance of the rankings in contributing to a system of transparency and accountability for institutions of higher education, which have traditionally enjoyed prestige, autonomy, and great freedom. They believe the rankings are sparking an important discussion about the quality of higher education and the proper ways of measuring and evaluating it. Some argue that the rankings create healthy competition that motivates institutions and governments to encourage research, invest in it, and develop new methods of measuring and evaluating quality. Researchers note that because the rankings are here to stay, ways must be found to improve them, to increase awareness of the need to use them with caution, and to deal with the consequences they might have for higher education. It has also been suggested that the criticisms have led to improvements in the rankings themselves, in their methods of measurement, and in their level of transparency.
Effects of the use of rankings

Studies cite evidence of the impact of the international rankings on systems of higher education around the world and on decision-making by institutions and states. In some countries, institutions of higher education have been merged and strategic plans launched to place institutions among the leaders of the rankings. Various countries use the rankings to determine recognition of degrees and institutions, grant scholarships, conduct academic collaborations, and even set immigration criteria. In the United States, several states use the rankings to set performance thresholds for the universities in their jurisdictions and to determine pay grades. Several countries have approached the ranking companies for advice on improving the rankings of the institutions of higher education within their borders. There is also evidence of attempts by institutions to improve their ranking by manipulating the results, for example in their reports on students and faculty.

Rankings of Israeli institutions

In Israel, the issue of the international rankings of the country's academic institutions is periodically placed on the public agenda. In a discussion of the topic held in June 2020 in the Knesset Education, Culture and Sports Committee, the results of the QS ranking were presented, showing that the Israeli universities had dropped several slots. The representatives of the universities criticized the credibility of the ranking, both on methodological grounds and because it is produced by a commercial company that engages in consulting and in academic placement of students and faculty. Representatives of the CHE presented a similar critique and noted that the Israeli universities had achieved impressive results given their limited budgets.
A review of the positions of the Israeli universities in the five aforementioned rankings suggests that the Hebrew University, Tel Aviv University, the Technion, and the Weizmann Institute are the country's leading institutions. The general trend emerging from most of the rankings for almost all the Israeli universities is a gradual decline in recent years. Some of the universities have seen sharper drops in their ranking (the Technion in the Times rankings and Ben-Gurion University in the QS rankings), and some have seen their ranking rise (the Technion and the Weizmann Institute in the Shanghai rankings).

In order to examine Israel's position relative to other countries in terms of the place of its universities in the rankings, the document presents data on the number of universities in each country that ranked among the top 200 institutions in the Shanghai and Times rankings. This comparison should be treated with extreme caution because there are significant differences between countries in the number of universities; the countries leading the rankings generally have dozens or hundreds of institutions of higher education. The Shanghai rankings place four Israeli institutions among the top 200, and Israel ranks alongside Belgium in places 12–13 among countries. The Times rankings include only one Israeli university (Tel Aviv University) among the top 200 institutions, which places Israel 26th among all countries and 21st among OECD countries. Because of the differences in scale between countries, we also present data on the countries' standing relative to their population size. By this measure, Israel ranks fourth among countries in the Shanghai rankings.

Policy in Israel and comments from the institutions

The CHE noted in its response to the inquiry from
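The population-normalized comparison mentioned above (the number of a country's universities in the top 200 relative to its population) can be sketched in a few lines. This is only an illustration of the arithmetic: the country names and figures below are hypothetical placeholders, not the document's data.

```python
# Sketch of a per-capita normalization: divide each country's count of
# top-200 universities by its population (in millions), then rank countries
# by the resulting rate. All numbers here are invented for illustration.

countries = {
    # country: (universities in top 200, population in millions)
    "Country A": (4, 9.0),
    "Country B": (40, 330.0),
    "Country C": (2, 5.5),
}

# Top-200 universities per million inhabitants.
per_capita = {
    name: top200 / pop_millions
    for name, (top200, pop_millions) in countries.items()
}

# Countries ordered from highest to lowest per-capita rate.
ranking = sorted(per_capita, key=per_capita.get, reverse=True)
print(ranking)  # → ['Country A', 'Country C', 'Country B']
```

As the example shows, a small country with a handful of top-200 institutions can outrank a much larger country with many more of them, which is precisely why the document reports this measure alongside the raw counts.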