Johann Mouton (CREST and SciSTIP) [email protected]

“Rankings are hypnotic and become an end in themselves without regard to exactly what they measure, whether they are solidly grounded or whether their use has constructive effects. The desire for rank ordering overrules all else” (Simon Marginson 2007).

Rankings are pervasive

• Tied for 1st among research universities (with UC Berkeley and Penn State) as the top producer of U.S. Fulbright Scholars, 2012-13
• 1st in the world in the category of research impact in life and earth sciences (Centre for Science and Technology Studies, 2015)
• 4th internationally and 1st in the U.S. in the 2014 UI GreenMetric World University Ranking
• 6th among universities with the most students hired by top companies in and around Silicon Valley (Business Insider 2015)
• 9th among U.S. institutions granting undergraduate degrees to students of color for 2013-14 (Diverse Issues in Higher Education)
• 9th among U.S. universities in the number of international scholars, 2013-14 (“Open Doors Report on International Educational Exchange”)
• 10th among female-led institutions in the world’s top universities for 2015 (The World University Rankings)
• 16th among U.S. universities based on our contributions to society (2014)
• 24th among public research universities for six-year graduation rates (81.3 percent) and 46th for four-year rates (51.3 percent) (Chronicle of Higher Education College Completion 2013)
• 36th among the nation's research universities and 55th among the world's research universities (2014 Academic Ranking of World Universities by the Center for World-Class Universities of Shanghai Jiao Tong University)
• Tied 37th among 500 best global universities in 51 countries (U.S. News and World Report's 2015 Best Global Universities)
• 44th overall among the world's universities (The World University Rankings 2013-14 by Times Higher Education)
• Between 51 and 60 in The World Reputation Rankings 2014 (Times Higher Education)
• 57th overall among the world's universities (Center for World University Rankings 2013)

University rankings or league tables have been around for a long time. The first nationwide university ranking was published in the United States in 1983 by US News and World Report. However, classifications and specialised university rankings with a narrower focus had already been compiled in the US since 1870 (Salmi & Saroyan, 2007). But it is especially over the past 10 – 15 years, with the advent of growing trends in globalisation and internationalisation and with the new demands of accountability fuelled by New Public Management (NPM) and related developments, that we have seen an explosion of university ranking systems. In a recent EU report the authors write:

‘Despite their many shortcomings, biases and flaws, rankings enjoy a high level of acceptance among stakeholders and the wider public because of their simplicity and consumer-type information’ (AUBR Expert Group 2009). Thus, university rankings are not going to disappear; indeed, the number of rankings is expected to increase, although they will become more specialised (Marginson, 2011).

Themes

1. Caveats and limitations of rankings
2. SU in the world rankings
3. Understanding the deep structure of rankings
4. A South African rankings system?
5. Using research performance measures for strategy and planning

Caveats and limitations – A select group

The most popular global league tables (ARWU, THE-QS and THE-Thomson Reuters, US News and World Report Ranking (USNWR), HEEACT, Reitor and others) concern only the world’s top universities. First, these league tables include roughly 1% to 3% of universities (200 – 500 institutions) out of approximately 17,000 universities in the world. Secondly, it is important to note that the methodologies used to produce global league tables simply cannot deliver stable results for more than 700 – 1,200 universities in global league tables, and for only around 300 universities in subject-area rankings.

Caveats and limitations – subjective and limited

Composite scores always contain subjective elements. In all cases where a composite score is calculated, ranking providers assign weights to each indicator in the overall score. This means that the ranking provider’s subjective judgement determines which indicators are more important. In other words, the composite score reflects the ranking provider’s concept of quality. These considerations demonstrate why rankings that produce league tables cannot, in principle, be ‘objective’. Overall, global university rankings reflect university research performance far more accurately than teaching. The bibliometric indicators used for measuring research performance in most rankings also have their biases and flaws, but they are at least direct measurements. Existing indicators of teaching are all proxies, and their link to the quality of teaching is indirect at best.
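To illustrate the point, the sketch below applies two different, equally defensible weighting schemes to the same set of indicator scores; the rank order changes with the weights. All names, scores and weights are invented and do not correspond to any actual ranking provider's method.

```python
# Minimal illustration: the same indicator scores ranked under two different
# (equally defensible) weighting schemes. All numbers are invented.

universities = {
    # name: {indicator: normalised score on a 0-100 scale}
    "Univ A": {"research": 90, "teaching": 60, "international": 70},
    "Univ B": {"research": 70, "teaching": 85, "international": 65},
    "Univ C": {"research": 80, "teaching": 75, "international": 55},
}

weighting_schemes = {
    "research-heavy": {"research": 0.6, "teaching": 0.3, "international": 0.1},
    "teaching-heavy": {"research": 0.3, "teaching": 0.6, "international": 0.1},
}

def composite(scores, weights):
    """Weighted sum of indicator scores -- the 'composite score'."""
    return sum(scores[ind] * w for ind, w in weights.items())

for scheme_name, weights in weighting_schemes.items():
    ranked = sorted(universities,
                    key=lambda u: composite(universities[u], weights),
                    reverse=True)
    print(scheme_name, "->", ranked)
```

Running this gives A > C > B under the research-heavy weights and B > C > A under the teaching-heavy weights: the "league table" is an artefact of the weighting choice.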

Caveats and limitations – natural science and medicine bias

The natural sciences and medicine bias in most rankings has been apparent since the first ARWU ranking was published in 2003. Natural sciences and medicine are favoured by all rankings based on bibliometric indicators – the 21 ISI ‘broad subject areas’ are mainly sub-areas of natural sciences and medicine, while the social sciences are underrepresented and the humanities are simply ignored. At the same time, different areas have different publication and citation cultures. There are more publications and more citations per publication in the natural sciences and especially in medicine, in particular because the main citation databases – WoS and Scopus – have little coverage of books.

Caveats and limitations – Peer review bias

The term ‘peer review’ itself is ambiguous, as it is used to denote quite different processes in quality assurance (QA) and in rankings. In QA of both research and teaching, ‘peer review’ refers to assessment by (usually visiting) peers, which involves rigorous procedures. By contrast, in rankings, ‘peer review’ exercises are usually no more than reputation surveys. In the THE-QS ranking, even though a large number of academics were approached, only some 5% actually answered. Secondly, at least in the case of the THE-QS-based ranking, the ‘peers’ are not in fact nominating the universities they consider excellent – they are restricted to pre-prepared lists from which many universities, and even whole countries, have been omitted. Thirdly, there is evidence that the opinion of ‘peers’ can be influenced by the reputation that an institution has already built up (AUBR, 2010).

Caveats and limitations – Language and regional bias

It has been noted since the publication of the first world rankings that global rankings favour universities from English-language nations because non-English-language work is both published and cited less. A recent study by the Leiden Ranking team has shown that the citation impact of publications of French and German universities in French or German, respectively, was smaller than the citation impact of publications of the same universities published in English (van Raan et al., 2010).

Benefits of rankings?

Universities are often either flattered or ashamed depending on their current position in the league table or the change of position since the previous year. There are forces both inside and outside the university encouraging it to make every effort to improve its position in the rankings or simply be included in the league tables at all costs. So what are the (perceived) benefits of such rankings?

• Rankings can inform a student’s choice of institution.
• Rankings can promote a culture of transparency.
• Rankings strengthen competition among universities and often bring about policy change, as universities strive to improve their standing in the league tables.
• Rankings provide simple and easily readable information and are therefore beginning to be used as a basis for funding allocations to universities, as well as for developing national or regional higher education policies.

An overview of SU in selected World Ranking Systems (2014)

University   ARWU/Shanghai (out of 500)   CWTS Leiden (out of 750)   THE (out of 401)   QS (out of 800)
UCT          201-300                      238                        124                141
WITS         201-300                      575                        251-275            318
SU           401-500                      481                        276                390
UKZN         401-500                      643                        351-400            501-550
UP           –                            683                        –                  471-480
Rhodes       –                            –                          –                  601-650
UJ           –                            –                          –                  601-650

Academic Ranking of World Universities (Shanghai)

             2014                    2013                    2012
University   World      National     World      National     World      National
UCT          201-300    1-2          201-300    1            201-300    1
WITS         201-300    1-2          301-400    2            301-400    2
SU           401-500    3-4          –          –            –          –
UKZN         401-500    3-4          401-500    3            401-500    3

ARWU criteria, indicators and weights:
• Quality of Education: Alumni of an institution winning Nobel Prizes and Fields Medals (Alumni) – 10%
• Quality of Faculty: Staff of an institution winning Nobel Prizes and Fields Medals (Award) – 20%
• Quality of Faculty: Highly cited researchers in 21 broad subject categories (HiCi) – 20%
• Research Output: Papers published in Nature and Science* (N&S) – 20%
• Research Output: Papers indexed in Science Citation Index-Expanded and Social Science Citation Index (PUB) – 20%
• Per Capita Performance: Per capita academic performance of an institution (PCP) – 10%
• Total: 100%

* For institutions specialized in humanities and social sciences, such as the London School of Economics, N&S is not considered, and the weight of N&S is relocated to other indicators.
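ARWU publishes its scores on a 0–100 scale, with the top-scoring institution on each indicator set to 100 and the weighted indicators then summed. The sketch below reproduces that style of aggregation with the published weights; the raw indicator values are invented and the scaling is deliberately simplified, so this is an illustration rather than the actual ARWU calculation.

```python
# Sketch of an ARWU-style aggregation (illustrative only; raw values are invented).
# Each indicator is scaled so that the best-scoring institution gets 100,
# then the indicators are combined using the published weights.

ARWU_WEIGHTS = {"Alumni": 0.10, "Award": 0.20, "HiCi": 0.20,
                "N&S": 0.20, "PUB": 0.20, "PCP": 0.10}

raw = {  # invented raw indicator values for two hypothetical universities
    "Univ X": {"Alumni": 2, "Award": 1, "HiCi": 10, "N&S": 40, "PUB": 3000, "PCP": 25},
    "Univ Y": {"Alumni": 0, "Award": 0, "HiCi": 4,  "N&S": 15, "PUB": 4500, "PCP": 30},
}

def scaled(values):
    """Scale each indicator so the best institution on that indicator scores 100."""
    best = {ind: max(v[ind] for v in values.values()) for ind in ARWU_WEIGHTS}
    return {u: {ind: (100 * v[ind] / best[ind]) if best[ind] else 0.0
                for ind in ARWU_WEIGHTS}
            for u, v in values.items()}

def total(scores):
    """Weighted sum of the scaled indicator scores."""
    return sum(scores[ind] * w for ind, w in ARWU_WEIGHTS.items())

for u, s in scaled(raw).items():
    print(f"{u}: {total(s):.1f}")
```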

CWTS Leiden Ranking

             2014                2013                2011/12
University   World    Region     World    Region     World    Region
UCT          238      1          217      1          333      1
SU           481      2          395      2          432      2
WITS         575      3          415      3          445      3
UKZN         643      5          466      4          –        –
UP           683      7          470      5          489      4

The CWTS Leiden Ranking 2014 is based on publications in Thomson Reuters' Web of Science database in the period 2009 – 2012. Within the Web of Science database, only publications in international scientific journals are included, and only publications of the document types ‘article’ and ‘review’ are considered. The Leiden Ranking offers the following indicators: impact indicators; collaboration indicators; core journals; size-dependent vs size-independent indicators; counting method; and stability intervals.
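To make two of these notions concrete, the sketch below computes a simplified size-dependent impact indicator, P(top 10%), and its size-independent counterpart, PP(top 10%), under both full and fractional counting. The publication records are invented and the calculation is far cruder than the actual CWTS methodology.

```python
# Simplified sketch of size-dependent vs size-independent impact indicators,
# under full vs fractional counting. Publication records are invented.

pubs = [
    # (universities on the paper, is the paper in the top 10% most cited of its field?)
    (["SU"], True),
    (["SU", "UCT"], False),
    (["SU", "UCT", "Oxford"], True),
    (["UCT"], True),
    (["UCT"], False),
]

def indicators(univ, counting="fractional"):
    p = p_top10 = 0.0
    for authors, top10 in pubs:
        if univ not in authors:
            continue
        # fractional counting shares credit for co-authored papers
        credit = 1.0 / len(authors) if counting == "fractional" else 1.0
        p += credit
        if top10:
            p_top10 += credit
    # P(top 10%) is size-dependent: larger universities score higher simply by
    # publishing more. PP(top 10%) divides by output and is size-independent.
    return {"P": p, "P_top10": p_top10, "PP_top10": p_top10 / p if p else 0.0}

for u in ("SU", "UCT"):
    print(u, indicators(u, "fractional"), indicators(u, "full"))
```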

Times Higher Education World University Rankings

             2014-2015    2013-2014    2012-2013    2011-2012
UCT          124          126          113          103
WITS         251-275      226-250      226-250      251-275
SU           276          301-350      251-275      251-275
UKZN         351-400      –            –            –

Overall indicators, individual indicators and percentage weightings:
• Industry income – innovation: Research income from industry (per academic staff) – 2.5%
• International diversity: Ratio of international to domestic staff – 3%; Ratio of international to domestic students – 2%
• Teaching – the learning environment: Reputational survey (teaching) – 15%; PhDs awarded per academic – 6%; Undergraduates admitted per academic – 4.5%; Income per academic – 2.25%; PhDs / undergraduate degrees awarded – 2.25%
• Research – volume, income and reputation: Reputational survey (research) – 19.5%; Research income (scaled) – 5.25%; Papers per research and academic staff – 4.5%; Public research income / total research income – 0.75%
• Citations – research influence: Citation impact (normalised average citations per paper) – 32.5%

QS World University Rankings

             2014                2013                2012
University   World    Region     World    Region     World    Region
UCT          141      1          145      1          154      1
WITS         318      2          313      2          363      2
SU           390      4          387      4          401-450  4
UP           471-480  5          471-480  5          501-550  6
UKZN         501-550  6          501-550  6          551-600  8
Rhodes       601-650  8          551-600  7          –        –
UJ           601-650  9          601-650  9          –        –

The rankings compare the top 800 universities across four broad areas of interest to prospective students: research, teaching, employability and international outlook. These four key areas are assessed using six indicators, each of which is given a different percentage weighting (see below). Four of the indicators are based on ‘hard’ data, and the remaining two on major global surveys – one of academics and one of employers – each the largest of its kind. The six indicators are:
• Academic reputation – 40% (based on a global survey of academics)
• Employer reputation – 10% (based on a global survey of graduate employers)
• Faculty/student ratio – 20% (an indication of commitment to teaching)
• Citations per faculty – 20% (an indication of research impact)
• International student ratio – 5% (measuring international diversity of the student community)
• International staff ratio – 5% (measuring international diversity of the academic faculty)

Why UCT is consistently ranked higher than SU

Universities – despite common missions – are still very different in many respects. This is undoubtedly true globally, but it also holds for universities within the same science or higher education system. Within the South African university system we find deeply embedded structural differences that are the direct result of decades of historical, socio-political and geographical differences.

We have coined the term “the shape of knowledge production” to refer to these differences. One way to operationally define the “shape of knowledge production” at a specific university is to look at the distribution of its papers or article equivalents by journal list or index.
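A minimal sketch of this operationalisation, assuming one record per article tagged with the index in which its journal appears (the counts below are invented; in practice they come from a university's accredited publication records):

```python
# Sketch: the "shape of knowledge production" as the percentage distribution of
# a university's article output across journal indexes. Counts are invented.

from collections import Counter

# one record per article: the index in which its journal appears
articles = (["WoS (foreign)"] * 120 + ["WoS (SA)"] * 25 +
            ["IBSS"] * 10 + ["SA local list"] * 45)

def shape(records):
    """Return the percentage share of output per journal index."""
    counts = Counter(records)
    total = sum(counts.values())
    return {index: round(100 * n / total, 1) for index, n in counts.items()}

print(shape(articles))
# -> {'WoS (foreign)': 60.0, 'WoS (SA)': 12.5, 'IBSS': 5.0, 'SA local list': 22.5}
```

Plotting these shares per year, as in the charts that follow, makes the differences in "shape" between universities immediately visible.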

UCT Output (Full Papers) by Journal Index and Year (2006 – 2011)
[Stacked bar chart: UCT full-paper output per year, broken down by journal index: Foreign ISI, SA (ISI), Local SA and IBSS.]

UP Output (Fractional Counts) by Journal Index and Year (2005 – 2010)
[Stacked bar chart: UP fractional article counts per year, broken down by journal index: IBSS, SA (Local), WoS (Foreign) and WoS (SA).]

UNISA Output (Full Papers) by Journal Index and Year (2006 – 2011)
[Stacked bar chart: UNISA full-paper output per year, broken down by: Foreign journals not in WoS, SA journals in WoS, Foreign journals in WoS and SA journals not in WoS.]

The shape of knowledge production and university rankings

Given the importance/weight accorded to publications in the Web of Science (and also Scopus), it is obvious that universities with a very high presence in the WoS have a distinct “advantage” in rankings. Given the “interaction” effects with other indicators (such as between scientific field and collaboration, and between collaboration and citation visibility), this advantage becomes even more pronounced.

So it should be obvious how universities should respond: simply increase your research articles in these indexes…

Are there SA rankings systems?

Although there is no “official” South African university rankings system, the annual research reports of the DHET have, since 2005, produced an unofficial ranking of SA universities. We present two versions of these rankings below. In addition, SciSTIP is currently involved in the development of a comprehensive, appropriate and customised ranking system for SA universities.

DHET Rankings?

After the revision of the Research Funding Framework in 2003, the DHET subsequently recognised six categories of research/knowledge production for subsidy purposes. These are:
1. Articles in journals recognised by the department
2. Books published by publishers recognised by the department
3. Book chapters published by publishers recognised by the department
4. Papers published in peer-reviewed conference proceedings
5. Research master's students graduated
6. PhD students graduated

On the basis of these categories, at least two different rankings of SA universities are possible: the first ranks universities in terms of per capita research output (the first four categories); the second in terms of weighted per capita knowledge output (all six categories).
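A minimal sketch of how these two measures can be computed from the six categories. The unit weights and institutional totals below are illustrative assumptions, not the official DHET parameters (in the funding framework a doctoral graduate typically attracts a higher weight than a research master's or a single publication unit).

```python
# Sketch: per capita and weighted per capita output from the DHET subsidy
# categories. Unit weights and institutional totals are illustrative
# assumptions, not official figures.

WEIGHTS = {"articles": 1.0, "books": 1.0, "chapters": 1.0,
           "proceedings": 1.0, "research_masters": 1.0, "phd": 3.0}

RESEARCH_CATEGORIES = ("articles", "books", "chapters", "proceedings")  # first four only

univ = {"articles": 850, "books": 20, "chapters": 120, "proceedings": 90,
        "research_masters": 300, "phd": 150, "permanent_academics": 900}

def per_capita_research(u):
    """Research output (first four categories) per permanent academic."""
    return sum(u[c] for c in RESEARCH_CATEGORIES) / u["permanent_academics"]

def weighted_per_capita(u):
    """Weighted knowledge output (all six categories) per permanent academic."""
    return sum(u[c] * WEIGHTS[c] for c in WEIGHTS) / u["permanent_academics"]

print(f"per capita research output: {per_capita_research(univ):.2f}")
print(f"weighted per capita output: {weighted_per_capita(univ):.2f}")
```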

Ranking of SA universities in terms of average per capita research output (2008 – 2013)
[Bar chart: average per capita research output per university, ranging from 1.27 at the top-ranked institution down to 0.07.]

Weighted output per capita (2008 – 2013)
[Bar chart: average weighted per capita knowledge output per university, ranging from 2.54 down to 0.07; the underlying values appear in the table below.]

Ranking of SA universities i.t.o. weighted per capita knowledge output (2008 – 2013)

Rank  University                                2008   2009   2010   2011   2012   2013   Average
1                                               2.13   2.30   2.37   2.38   3.06   2.97   2.54
2                                               2.01   2.30   2.21   2.24   2.38   2.57   2.29
3     Rhodes University                         1.75   1.75   1.92   2.17   2.31   2.49   2.07
4     University of the Witwatersrand           1.53   1.75   1.68   1.99   1.94   2.32   1.87
5     University of Pretoria                    1.35   1.40   1.43   2.03   2.14   2.40   1.79
6     University of KwaZulu-Natal               1.20   1.40   1.48   1.49   1.78   2.08   1.57
7     University of the Western Cape            0.93   1.11   1.31   1.48   1.51   1.75   1.35
8     North-West University                     1.17   1.19   1.22   1.21   1.41   1.69   1.32
9     University of the Free State              1.06   1.28   1.31   1.39   1.26   1.27   1.26
10    University of Johannesburg                1.04   1.05   1.13   1.42   1.47   1.43   1.26
11    Nelson Mandela Metropolitan University    0.95   0.89   1.14   1.37   1.42   1.39   1.19
12    University of Fort Hare                   0.49   0.84   1.26   1.49   1.53   1.38   1.17
13    University of South Africa                0.72   0.67   0.70   0.84   1.05   1.19   0.86
14    University of Zululand                    0.53   0.69   0.69   0.64   0.67   0.53   0.63
15    Tshwane University of Technology          0.34   0.41   0.44   0.55   0.58   0.57   0.48
16    University of Venda                       0.32   0.29   0.41   0.61   0.55   0.57   0.46
17    University of Limpopo                     0.28   0.34   0.34   0.38   0.54   0.54   0.40
18    Cape Peninsula University of Technology   0.25   0.35   0.36   0.37   0.46   0.43   0.37
19    Central University of Technology          0.30   0.31   0.24   0.34   0.34   0.49   0.34
20    Vaal University of Technology             0.14   0.19   0.21   0.32   0.36   0.35   0.26
21    Durban University of Technology           0.12   0.18   0.20   0.30   0.21   0.40   0.24
22    Walter Sisulu University                  0.04   0.05   0.09   0.11   0.13   0.12   0.09
23    Mangosuthu University of Technology       0.01   0.03   0.05   0.13   0.09   0.09   0.07

Nr of PhD graduates per Doctorate Staff Member (2008 - 2013)

[Bar chart: number of PhD graduates per permanent staff member holding a doctorate, by university, ranging from 0.36 down to 0.00.]

CHET/CREST project on additional indicators

Indicator                                                                  Avg (11)  UCT   Stell  Wits  Rhodes  UWC
1: % of programmes in general academic & professional fields              3.7       4.0   4.0    4.0   3.9     4.0
2: % of total head count enrolments in SET                                3.4       4.0   4.0    4.0   2.8     3.7
3: Masters + doctoral enrolments as % of total heads                      3.5       4.0   4.0    4.0   4.0     3.6
4: Doctoral enrolments as % of total heads                                3.4       4.0   4.0    4.0   4.0     4.0
5: % of professors + associate professors + senior lecturers              3.6       4.0   4.0    4.0   4.0     4.0
6: % of permanent academic staff with doctorates                          3.5       4.0   4.0    4.0   4.0     4.0
7: SET ratio of FTE students to FTE academics                             3.6       4.0   3.2    4.0   4.0     3.6
8: Humanities + social science ratio of FTE students to FTE academics     3.5       4.0   3.7    4.0   4.0     3.3
9: Average pass rate in SET undergraduate courses                         4.0       4.0   4.0    4.0   4.0     4.0
10: Average pass rate in other undergraduate courses                      3.9       4.0   4.0    4.0   4.0     3.9
11: Total graduates as % of total head count enrolments                   3.6       4.0   4.0    3.6   4.0     3.2
12: Ratio of % SET graduates to % SET enrolments                          3.6       3.6   3.4    3.4   3.7     4.0
13: Masters graduates as % of masters head count enrolments               3.5       4.0   4.0    3.3   4.0     3.6
14: Doctoral graduates as % of doctoral head count enrolments             3.5       3.8   4.0    3.4   4.0     3.9
15: Ratio of research publications to permanent academic staff            3.3       4.0   4.0    4.0   4.0     2.7
16: Ratio of doctoral graduates to permanent academic staff               3.4       4.0   4.0    4.0   4.0     4.0

PERFORMANCE EVALUATIONS: RADAR GRAPHS

The radar graphs which follow reflect the extent to which an individual university complies with the performance targets linked to the mandates for traditional universities. Preliminary points to note about the graphs are these:

• the outer 4.0 grid line represents the target related to each datum point;

• the dotted line in the graph represents the average for the eleven traditional universities in relation to each input and output target;

• the solid line represents the individual university’s average score for the years 2011-2013.

University of Cape Town: averages for 2011 – 2013

[Radar graph: UCT's average scores for 2011 – 2013 on the 16 input and output indicators listed above, plotted against the average for the 11 traditional universities.]

UCT 2013: head count enrolment 26 115; permanent academics 1 093; doctoral graduates 205; research article units 1 315

Stellenbosch University: averages for 2011 – 2013

[Radar graph: Stellenbosch University's average scores for 2011 – 2013 on the 16 input and output indicators, plotted against the average for the 11 traditional universities.]

Stellenbosch 2013: head count enrolment 27 326; permanent academics 1 006; doctoral graduates 225; research article units 1 245

Research strategy and planning issues

1. How can we expand the overall research productivity of the university?
2. How can we increase our international visibility (and hence citation impact)?
3. What are realistic research performance targets (which are field-sensitive)? (See the sketch below.)
4. Which “sub-groups” should be targeted in incentivizing research output?
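On question 3, one way to make targets field-sensitive is to benchmark each unit against a field baseline rather than against a single institution-wide norm. The sketch below is illustrative only; the field baselines and departmental outputs are invented, and this is not a prescribed institutional method.

```python
# Sketch: field-sensitive performance benchmarking. Baselines and outputs are
# invented; the point is only that each unit is compared with its own field
# baseline rather than with a single institution-wide norm.

field_baseline = {  # assumed per capita output baselines per field
    "clinical medicine": 1.6,
    "sociology": 0.8,
    "history": 0.5,
}

department_output = {  # per capita output of three hypothetical departments
    "clinical medicine": 1.4,
    "sociology": 0.9,
    "history": 0.45,
}

for field, baseline in field_baseline.items():
    ratio = department_output[field] / baseline
    status = "above" if ratio >= 1 else "below"
    print(f"{field}: {ratio:.2f} of the field baseline ({status} the field norm)")
```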

Comparing universities on the distribution of productive scientists (Lotka’s Law)

[Four panels: distributions of papers per author for UKZN, UP, UWC and UCT, illustrating the highly skewed (Lotka-type) distribution of productivity at each university.]
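Lotka's Law describes the highly skewed distribution of scientific productivity: the number of authors producing n papers falls off roughly in proportion to 1/n², so a small group of prolific authors accounts for a large share of output. The sketch below computes such a distribution from an invented list of papers-per-author counts; comparing these distributions across universities is what the panels above do.

```python
# Sketch: distribution of productive authors (Lotka's Law). The papers-per-author
# counts are invented; in practice they come from a university's publication records.

from collections import Counter

papers_per_author = [1]*120 + [2]*30 + [3]*14 + [4]*8 + [5]*5 + [8]*2 + [15]*1

def productivity_distribution(counts):
    """Percentage of authors producing exactly n papers, for each n observed."""
    freq = Counter(counts)
    total_authors = len(counts)
    return {n: round(100 * a / total_authors, 1) for n, a in sorted(freq.items())}

print(productivity_distribution(papers_per_author))
# Under Lotka's Law the share of authors with n papers is roughly proportional
# to 1/n^2, i.e. authors with 2 papers number about a quarter of those with 1.
```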

Comparing the human capital base of UKZN and UCT

Publication   Total UKZN art.   Number of UKZN authors   Total UCT art.   Number of UCT authors
year          equivalents       responsible for articles equivalents      responsible for articles
1993          604.00            608                      718.17           896
1994          678.73            584                      758.23           952
1995          682.00            661                      719.04           937
1996          723.01            621                      635.18           920
1997          751.41            708                      646.35           943
1998          635.29            653                      640.29           900
1999          598.66            702                      707.80           988
2000          607.31            688                      660.30           919
2001          660.94            716                      643.61           978
2002          639.15            745                      669.69           1012
2003          736.79            725                      548.08           844
2004          716.35            717                      727.39           1188
2005          948.64            726                      754.26           1167
2006          1083.71           N/A                      816.28           1193
2007          876.48            N/A                      934.24           1294
2008          977.45            1125                     925.13           1365
2009          1108.20           1105                     1040.44          1584
2010          1146.51           1228                     1050.42          1649
2011          1250.37           1401                     1090.07          1768
2012          1424.22           1453                     –                –
2013          1627.21           1670                     –                –

Notes: UKZN = headcount in 2013 (1376); UCT = headcount in 2011 (1055).

UKZN Trends in citation impact by collaboration type (1993 – 2012)

Comparison of top five SA universities in terms of collaboration type, 1990 – 2012

A comparison of collaboration type by university shows that UCT recorded the highest proportion of internationally co-authored papers (42%) over the period 1990 – 2012. Stellenbosch University (37%), Wits (36%) and UKZN (35%) have similar profiles, while UP has the lowest proportion of internationally co-authored papers (31%).

Comparison of average individual research productivity by Faculty/School

Faculty             UP papers/author   UP art. units/author   NWU papers/author   NWU art. units/author   UWC papers/author   UWC art. units/author
ARTS                2.66               1.73                   2.67                2.38                    3.62                2.78
CHS                 3.26               1.05                   1.57                1.12                    3.79                1.33
EDUCATION           2.21               1.16                   2.57                1.89                    3.06                1.77
EMS                 2.99               1.56                   2.06                1.50                    3.60                2.16
LAW                 4.42               3.69                   2.88                2.77                    3.83                3.07
NATURAL SCIENCES    4.70               1.39                   2.34                1.20                    4.64                1.14

Gender and research output at UWC

Over the past eight years, 38% of article units at UWC were produced by female authors and 62% by male authors; men still dominate the production of journal articles at UWC. This should be read against the fact that the average ratio of male to female instructional staff, according to the HEMIS system, is nearly 50:50. The graph below shows that the average ratio of 62:38 (male to female article output) has not shifted significantly over this period.

[Stacked bar chart: UWC article units by gender and year, 2006 – 2013 (female vs male).]

UWC: article units by Gender and Faculty (2006 – 2013)

In conclusion

Although rankings have some value in research performance assessment (and hence in research management in general), our view is that their value and utility are limited. Our approach, rather, is to develop a range of research (and teaching) performance measures that are:
• Aligned with national and institutional policy imperatives (such as growth, efficiency, internationalisation and transformation);
• Sensitive to the specific mission and history of the university (and hence to its organisational structures and its responsiveness to its own audiences);
• Sensitive to field and disciplinary differences (which must translate into differential strategies, plans and targets at the sub-organisational level).
In the final analysis, research performance assessment frameworks and systems must not only monitor and evaluate but also inform and improve institutional research strategies and initiatives.