
Intelligent metrics: The need for normalization of counts.

Inform your strategy decisions and evaluate the impact of research using data from: Essential Science Indicators, Journal Citation Reports, and InCites B&A.

Bob Green, Solution Specialist, April 2019

Agenda

1 Web of Science – what better place to start

2 Citation Counts

3 Journal Impact Factors

4 Author Performance

5 Putting it all together

6 The Future of Metrics

Absolute citation counts given without context are usually meaningless. To truly understand the impact of papers, journals, or institutions, counts have to be normalized and a wider view needs to be taken.

2 The Web of Science Core Collection

The Web of Science Core Collection is a trusted, high-quality collection of journals and conference proceedings.

The Heart of the Web of Science Platform

Curated by a professional and publisher-neutral expert team of in-house Web of Science editors.

3 Web of Science is neutral and objective

Our in-house editors have no conflicts of interest.

Clarivate is not a commercial publisher; our editorial policies ensure we are publisher-neutral. Our editorial team works full-time on evaluations and collection management and has done so for many decades, day in and day out, with no other conflicts of interest or professional commitments.

No conflict of interest:

• Fact: Constant evaluation of journals, some resulting in de-selection. Policy: Editors cannot sit on journal advisory boards.
• Fact: Staff includes fluency in 12 languages. Policy: Editors are full-time employees of Clarivate.
• Fact: Consistent rejection of predatory journals. Policy: Editors may not publish papers.
• Fact: Staff have over 150 years of experience. Policy: Staff may not edit journals.

4 Citation Counts

Putting them into context.

5 Citation Counts

Example fields shown: Molecular Biology & Genetics, Chemistry, Engineering, Plant & Animal Science

6 Variations in Citation Counts

7 Variations in Citation Counts

8 Variations in Citation Counts

Essential Science Indicators (ESI) provides Citation Thresholds for each Publication Year and Research Field combination over the past 10 years. One of these is for the Top 1%.
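As a rough illustration of how such thresholds are applied, the sketch below checks a paper's citation count against a small lookup table of Top 1% thresholds. The field names, year, and threshold values here are invented for the example and are not actual ESI figures.

```python
# Illustrative sketch only: the threshold values below are invented, not real ESI data.
TOP_1_PERCENT_THRESHOLDS = {
    # (research field, publication year): minimum citations needed for the Top 1%
    ("Molecular Biology & Genetics", 2015): 450,
    ("Chemistry", 2015): 180,
    ("Engineering", 2015): 90,
    ("Plant & Animal Science", 2015): 110,
}

def is_top_1_percent(field: str, year: int, times_cited: int) -> bool:
    """Return True if the paper meets or exceeds the Top 1% citation threshold
    for its field/year combination (hypothetical values above)."""
    return times_cited >= TOP_1_PERCENT_THRESHOLDS[(field, year)]

# The same citation count can clear the bar in one field but not in another.
print(is_top_1_percent("Engineering", 2015, 120))                   # True
print(is_top_1_percent("Molecular Biology & Genetics", 2015, 120))  # False
```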

9 Citation Counts – Context from Essential Science Indicators (ESI)

Example fields shown: Molecular Biology & Genetics, Chemistry, Engineering, Plant & Animal Science

10 Journal Impact Factors

Putting them into context.

11 Journal Impact Factors

12 Variations in Journal Impact Factors

Journal Citation Reports (JCR) shows how the Journal Impact Factor varies considerably across the different research categories.

JCR also shows that the number of journals in each category varies.

13 Journal Impact Factors – Context from Journal Citation Reports (JCR)

14 Journal Impact Factors – Context from Journal Citation Reports (JCR)

15 Author Performance

Putting them into context.

16 Citation Impact

The total number of citations divided by the total number of publications in a set.

• Also known as ‘Average Citation Rate’ or ‘Citations per Publication’

Examples        Total Publications   Total Citations   Citation Impact
Researcher A    1                    50                50
Researcher B    10                   200               20

Researcher A's Citation Impact (50) is higher than Researcher B's (20), even though Researcher B has published more documents and received more citations overall.

The Citation Impact does not account for differences between fields.
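A minimal sketch of the calculation in Python, using the totals from the table above; the per-paper citation split for Researcher B is invented purely so that the totals add up.

```python
def citation_impact(citation_counts):
    """Citation Impact = total citations / total publications for a set of papers."""
    if not citation_counts:
        return 0.0
    return sum(citation_counts) / len(citation_counts)

# Totals from the slide: Researcher A has 1 paper with 50 citations;
# Researcher B has 10 papers totalling 200 citations (the split is illustrative).
researcher_a = [50]
researcher_b = [40, 35, 30, 25, 20, 18, 14, 10, 6, 2]  # sums to 200

print(citation_impact(researcher_a))  # 50.0
print(citation_impact(researcher_b))  # 20.0
```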

17 The H-index

A researcher has an h-index of h if he or she has at least h publications that have each received at least h citations.

• Introduced by physicist J. Hirsch in 2005

Example         Total Publications   Total Citations   Citation Impact   h-index
Researcher A    1                    50                50                1
Researcher B    10                   200               20                10
Researcher C    10                   200               20                5

+ combines productivity (documents) and impact (citations)
+ can be applied to any level of aggregation
+ encourages large amounts of impactful research work

- highly time-dependent measure
- ignores the researcher’s age
- does not account for field differences
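A small sketch of how the h-index can be computed from per-paper citation counts. The per-paper distributions below are invented so that they reproduce the totals, Citation Impacts, and h-indexes in the example table.

```python
def h_index(citation_counts):
    """Largest h such that the author has h papers with at least h citations each."""
    h = 0
    for rank, citations in enumerate(sorted(citation_counts, reverse=True), start=1):
        if citations >= rank:
            h = rank
        else:
            break
    return h

researcher_a = [50]                                 # 1 paper,   50 citations  -> h = 1
researcher_b = [20] * 10                            # 10 papers, 200 citations -> h = 10
researcher_c = [36, 36, 36, 36, 36, 4, 4, 4, 4, 4]  # 10 papers, 200 citations -> h = 5

print(h_index(researcher_a), h_index(researcher_b), h_index(researcher_c))  # 1 10 5
```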

18 Variations in Citation Averages

Citation Impact is often compared by using averages (citations/document)

Essential Science Indicators (ESI) shows how Citation Impact varies significantly across different disciplines and time periods.

19 Author Performance – Context from InCites Benchmarking & Analytics

InCites B&A shows the variations between basic Citation Impact and contextual metrics like Top % and Category Normalised Citation Impact (CNCI).

20 Author Performance – Context from Publons

An author’s Publons profile can provide valuable context to their research metrics.

21 Articles / Authors / Journals / Institutions / Countries

Putting everything into context.

22 Why Normalization

“Normalization is a prerequisite for reliable benchmarks and hence for evaluative science policy” (Debackere, 2015)

The average number of citations varies significantly across disciplines and journals

NECESSITY: FIELD AND JOURNAL NORMALIZATION

Citations are dynamic; they grow over time and cannot be compared across different time periods. In addition, the “citation maturity” rate differs between fields.

NECESSITY: TIME NORMALIZATION

Different publication types also show different citation behaviour: on average, an article does not receive as many citations as a review.

NECESSITY: DOCUMENT TYPE NORMALIZATION

23 The Category Normalized Citation Impact (CNCI) – at Paper Level

Indicator of the performance of this article, published in 2012 in the Management category. The expected citations are the average number of citations received by an article published in 2012 in the Management category. If the ratio is >1, the article performs above the category average; if <1, below it.

CNCI = Times Cited / Category Expected Citations = 43 / 7.34 = 5.86
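A minimal sketch of the calculation with the figures from this slide; the function name is our own shorthand, not an InCites API call.

```python
def normalized_citation_impact(times_cited, expected_citations):
    """Ratio of actual to expected citations; > 1 means above the baseline average."""
    return times_cited / expected_citations

# From the slide: a 2012 Management-category paper cited 43 times,
# against a category baseline of 7.34 expected citations.
cnci = normalized_citation_impact(times_cited=43, expected_citations=7.34)
print(round(cnci, 2))  # 5.86
```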

24 Journal Normalized Citation Impact (JNCI) – at Paper Level

Indicator of the performance of this 2012 article within the journal Organization Science. The expected citations are the average number of citations received by an article published in 2012 in Organization Science. If the ratio is >1, the article performs above the journal average; if <1, below it.

JNCI = Times Cited / Journal Expected Citations = 43 / 21.88 = 1.97
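The JNCI uses the same actual-to-expected ratio; only the baseline changes. A self-contained check with the figures from this slide:

```python
# Same ratio as CNCI, but the expected citations come from the journal
# (Organization Science, 2012) rather than from the research category.
times_cited = 43
journal_expected_citations = 21.88  # figure from the slide

jnci = times_cited / journal_expected_citations
print(round(jnci, 2))  # 1.97
```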

25 Percentiles – position within a set of publications

A percentile is a value above which a certain proportion of the observations fall. It positions a publication among others of the same document type, publication year, and category, on a scale of 0–100: the smaller the percentile, the higher the citation count.

Percentile in Subject Area = 1.97 (just within the Top 2%)
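A rough sketch of how such a percentile can be computed against a baseline set of papers of the same type, year, and category. The baseline here is simulated and the tie handling is simplified, so this illustrates the idea rather than InCites' exact method.

```python
import random

def citation_percentile(times_cited, baseline_citations):
    """Share of baseline papers cited more often than this one, on a 0-100 scale
    (0 = most cited). Simplified tie handling; InCites' exact rules may differ."""
    more_cited = sum(1 for c in baseline_citations if c > times_cited)
    return 100.0 * more_cited / len(baseline_citations)

# Hypothetical baseline of 10,000 papers drawn from a heavily skewed distribution,
# mimicking the long tail of citation counts.
random.seed(42)
baseline = [int(random.paretovariate(1.5)) for _ in range(10_000)]

# A highly cited paper lands in a small percentile, i.e. near the top of the set.
print(round(citation_percentile(43, baseline), 2))
```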

26 InCites Benchmarking & Analytics

An objective analysis of people, programs and peers

Explore InCites Data: 227 Countries/Regions, 12,508 Institutions, 1,040 Funders, 20,300 Journals, 17 Research Schemas, My Organization (view and analyse your organizational subunits with advanced InCites indicators)

Bibliometric Analysis: Productivity, Impact, Collaboration, ORCID and WoS data imports, upload of custom datasets, article-level indicators (coming this year)

InCites System: Reports (prebuilt reports, customized analytics, personalized dashboards, report sharing), data imports & exports, InCites API (support your institution's repository or CRIS imports/exports)

27 Productivity Indicators

28 Impact Indicators

29 Collaboration and Reputation Indicators

30 Even Normalized indicators need to be used with care

InCites B&A suggests these are the top performing countries when ranked by CNCI.

But when ranked by output the story is a different one.

Adding a sensible threshold usually gives a truer picture.

31 The Future of Metrics

Where might they be heading?

32 ISI: the “Academy” of the Web of Science Group

The Institute for Scientific Information (ISI) has been re-established to extend the work of Dr. Eugene Garfield.

• ISI maintains the corpus of knowledge around research metrics, preserving its independent integrity. Web of Science and related content, products, and services are built upon this key corpus.

• ISI disseminates that knowledge internally through reports and recommendations, as well as externally through events, conferences, and publications.

• ISI carries out research to sustain, extend, and improve that knowledge.

Diagram labels: Data, Strategy, Indicators; Interpretation, Editorial Excellence, Monitor Research, Thought Leadership.

33 Profiles not Metrics

https://clarivate.com/g/profiles-not-metrics/

This white paper was compiled by the Institute for Scientific Information (ISI) to draw attention to the information that is lost when data about researchers and institutions are squeezed into simplified metrics or league tables.

It looks at four common types of analysis that can obscure real research performance when misused.

It goes on to suggest alternative visualizations that can show the wider picture, thereby supporting sound and responsible research management.

34 Profiles not Metrics

Here are two visualizations that might provide a better indication of a researcher’s work than the simple h-index does.

35 Profiles not Metrics

If you are interested in hearing more about this white paper, it is not too late to register for the webinar, which will discuss it in far more detail.

Click to be taken to the registration page.

36 Want more resources, tips and guidance to help you research smarter?

Sign up for our newsletter at webofsciencegroup.com

37 Thank you

Bob Green Solution Specialist

[email protected] clarivate.com/

Web of Science Group retains all intellectual property rights in, and asserts rights of confidentiality over, all parts of its response submitted within this presentation. By submitting this response we authorise you to make and distribute such copies of our proposal within your organisation and to any party contracted directly to solely assist in the evaluation process of our presentation on a confidential basis. Any further use will be strictly subject to agreeing appropriate terms.