The Metric Tide: Literature Review. Supplementary Report I to the Independent Review of the Role of Metrics in Research Assessment

Total pages: 16

File type: PDF; size: 1,020 KB

The Metric Tide: Literature Review
Supplementary Report I to the Independent Review of the Role of Metrics in Research Assessment and Management
July 2015

Authors: Paul Wouters*, Mike Thelwall**, Kayvan Kousha**, Ludo Waltman*, Sarah de Rijcke*, Alex Rushforth* and Thomas Franssen*
* Centre for Science and Technology Studies (CWTS), Leiden University, Netherlands.
** Statistical Cybermetrics Research Group, University of Wolverhampton, UK.

Cite as: Wouters, P. et al. (2015). The Metric Tide: Literature Review (Supplementary Report I to the Independent Review of the Role of Metrics in Research Assessment and Management). HEFCE. DOI: 10.13140/RG.2.1.5066.3520

© HEFCE 2015, except where indicated. Cover image © JL-Pfeifer / Shutterstock.com. The parts of this work that are © HEFCE are available under the Open Government Licence 2.0: www.nationalarchives.gov.uk/doc/open-government-licence/version/2

ISBN: 1902369280
DOI: 10.13140/RG.2.1.5066.3520
Twitter: #HEFCEmetrics

Contents

Executive Summary
  Bibliometrics and the use of indicators
    Bibliographic databases
    Basic citation impact indicators
    Exclusion of specific types of publications and citations
    Normalisation of citation impact indicators
    Credit allocation in the case of multi-author publications
    Indicators of the citation impact of journals
    Main research strands on indicator effects
  Peer review and bibliometrics
  Alternative metrics for research evaluation

1. Bibliometrics and the use of citation indicators in research assessment
  1.1. Bibliographic databases
    1.1.1. Comparing WoS and Scopus
    1.1.2. Comparing GS with WoS and Scopus
    1.1.3. Social sciences and humanities
    1.1.4. Conference proceedings
  1.2. Citation impact indicators
    1.2.1. Basic citation impact indicators
    1.2.2. Exclusion of specific types of publications and citations
    1.2.3. Normalisation of citation impact indicators
    1.2.4. Credit allocation in the case of multi-author publications
    1.2.5. Indicators of the citation impact of journals
  1.3. Evaluation practices and effects of indicator use
    1.3.1. Introduction: main research strands
    1.3.2. Known effects
  1.4. Bibliometrics and the use of indicators summary
    1.4.1. Bibliographic databases
    1.4.2. Basic citation impact indicators
    1.4.3. Exclusion of specific types of publications and citations
    1.4.4. Normalisation of citation impact indicators
    1.4.5. Credit allocation in the case of multi-author publications
    1.4.6. Indicators of the citation impact of journals
    1.4.7. Main research strands on indicator effects
    1.4.8. Strategic behaviour and goal displacement
    1.4.9. Effects on interdisciplinarity
    1.4.10. Task reduction
    1.4.11. Effects on institutions
    1.4.12. Effects on knowledge production
2. Peer review and bibliometrics
  2.1. Forms of peer review
  2.2. Correlating bibliometrics with peer review
    2.2.1. Journal peer review
    2.2.2. Grant peer review
    2.2.3. Correlating bibliometrics and peer judgment of research groups
    2.2.4. Bibliometrics and national research evaluation exercises
    2.2.5. Predicting the outcomes of the UK RAE and REF by bibliometrics
  2.3. Informed peer review
  2.4. Peer review and bibliometrics conclusions and summary
3. Alternative metrics for research evaluation
  3.1. Web-based open access scholarly databases
    3.1.1. Limitations of traditional citation databases: A brief overview
    3.1.2. GS
    3.1.3. Patents and Google Patents
    3.1.4. Usage indicators from scholarly databases
  3.2. Citations and links from the general web
    3.2.1. Link analysis
    3.2.2. Web and URL citations
  3.3. Citations from specific parts of the general web
    3.3.1. Online presentations
    3.3.2. Online course syllabi for educational impact
    3.3.3. Science blogs
    3.3.4. Other sources of online impact
  3.4. Citations, links, downloads and recommendations from social websites
    3.4.1. Faculty of 1000 web recommendations
    3.4.2. Mendeley and other online reference managers
    3.4.3. Twitter and microblog citations
    3.4.4. Facebook and Google+ citations
    3.4.5. Academic social network sites: Usage and follower counts
  3.5. Indicators for book impact assessment
Recommended publications
  • A Comprehensive Framework to Reinforce Evidence Synthesis Features in Cloud-Based Systematic Review Tools
    Applied Sciences (article): A Comprehensive Framework to Reinforce Evidence Synthesis Features in Cloud-Based Systematic Review Tools
    Tatiana Person 1,*, Iván Ruiz-Rube 1, José Miguel Mota 1, Manuel Jesús Cobo 1, Alexey Tselykh 2 and Juan Manuel Dodero 1
    1 Department of Informatics Engineering, University of Cadiz, 11519 Puerto Real, Spain; [email protected] (I.R.-R.); [email protected] (J.M.M.); [email protected] (M.J.C.); [email protected] (J.M.D.)
    2 Department of Information and Analytical Security Systems, Institute of Computer Technologies and Information Security, Southern Federal University, 347922 Taganrog, Russia; [email protected]
    * Correspondence: [email protected]
    Citation: Person, T.; Ruiz-Rube, I.; Mota, J.M.; Cobo, M.J.; Tselykh, A.; Dodero, J.M.
    Abstract: Systematic reviews are powerful methods used to determine the state-of-the-art in a given field from existing studies and literature. They are critical but time-consuming in research and decision making for various disciplines. When conducting a review, a large volume of data is usually generated from relevant studies. Computer-based tools are often used to manage such data and to support the systematic review process. This paper describes a comprehensive analysis to gather the required features of a systematic review tool, in order to support the complete evidence synthesis process. We propose a framework, elaborated by consulting experts in different knowledge areas, to evaluate significant features and thus reinforce existing tool capabilities. The framework will be used to enhance the currently available functionality of CloudSERA, a cloud-based systematic review tool focused on Computer Science, to implement evidence-based systematic review processes in […]
  • A Measured Approach
    DRAFT. Chapter Six: A Measured Approach. Evaluating Altmetrics as a Library Service. Hui Zhang and Korey Jackson.
    The emergence of altmetrics has drawn the attention of academic libraries as a new and effective approach for capturing the types of impact often ignored by traditional citation-based metrics. At Oregon State University Libraries, before investing in a full-scale implementation of altmetrics services, we conducted an assessment survey of faculty and other researchers in order to determine whether such products were as valuable to researchers as traditional bibliometrics (including h-index and journal impact factor). Based on the survey results, this chapter seeks to understand best practices for the introduction and implementation of altmetrics services. The results reveal a mixture of both enthusiasm and suspicion toward altmetrics as an impact measure. Ultimately, we find that, while academic libraries are in the best position to act as intermediary providers of altmetrics services, there is much need for refinement of altmetrics evaluation techniques and a more robust set of best practices for institution-wide implementation.
    INTRODUCTION
    The term altmetrics has already become familiar to many librarians, and as more libraries and journal publishers add altmetrics data to their scholarly content, the term is finding currency among researchers as well. But the actual practice of altmetrics—which we define as the "tracking of multichannel, online use and conversation around a discrete piece of scholarship"—has yet to gain wide acceptance among scholars, in part because scholarly and administrative motivations for adoption have not yet appeared.
  • A Mixed Methods Bibliometric Study
    The Qualitative Report, Volume 24, Number 12, Article 2 (12-2-2019)
    Collaboration Patterns as a Function of Research Experience Among Mixed Researchers: A Mixed Methods Bibliometric Study
    Melanie S. Wachsmann, Anthony J. Onwuegbuzie, Susan Hoisington, Vanessa Gonzales and Rachael Wilcox (Sam Houston State University); see next page for additional authors.
    Follow this and additional works at: https://nsuworks.nova.edu/tqr
    Part of the Education Commons; the Quantitative, Qualitative, Comparative, and Historical Methodologies Commons; and the Social Statistics Commons.
    Recommended APA Citation: Wachsmann, M. S., Onwuegbuzie, A. J., Hoisington, S., Gonzales, V., Wilcox, R., Valle, R., & Aleisa, M. (2019). Collaboration Patterns as a Function of Research Experience Among Mixed Researchers: A Mixed Methods Bibliometric Study. The Qualitative Report, 24(12), 2954-2979. https://doi.org/10.46743/2160-3715/2019.3852
    This article is brought to you for free and open access by The Qualitative Report at NSUWorks. It has been accepted for inclusion in The Qualitative Report by an authorized administrator of NSUWorks. For more information, please contact [email protected].
    Abstract: Onwuegbuzie et al. (2018) documented that the degree of collaboration is higher for mixed researchers than for qualitative and quantitative researchers. The present investigation examined the (a) link between the research experience of lead authors and their propensity to collaborate (Quantitative Phase), and (b) role of research experience in collaborative mixed research studies (Qualitative Phase).
  • The Privilege to Select. Global Research System, European Academic Library Collections, and Decolonisation
    The Privilege to Select: Global Research System, European Academic Library Collections, and Decolonisation
    Schmidt, Nora. Lund University, Faculties of Humanities and Theology, 2020. DOI: 10.5281/zenodo.4011296
    Document version: publisher's PDF, also known as version of record. Total number of authors: 1. Creative Commons licence: CC BY.
    Citation for published version (APA): Schmidt, N. (2020). The Privilege to Select: Global Research System, European Academic Library Collections, and Decolonisation. Lund University, Faculties of Humanities and Theology. https://doi.org/10.5281/zenodo.4011296
    General rights: Unless other specific re-use rights are stated, the following general rights apply. Copyright and moral rights for the publications made accessible in the public portal are retained by the authors and/or other copyright owners, and it is a condition of accessing publications that users recognise and abide by the legal requirements associated with these rights. Users may download and print one copy of any publication from the public portal for the purpose of private study or research; they may not further distribute the material or use it for any profit-making activity or commercial gain; they may freely distribute the URL identifying the publication in the public portal. Read more about Creative Commons licences: https://creativecommons.org/licenses/
    Take-down policy: If you believe that this document breaches copyright, please contact us providing details, and we will remove access to the work immediately and investigate your claim. Lund University, PO Box 117, 221 00 Lund, +46 46-222 00 00.
    To European social sciences and humanities researchers, substantial parts of potentially relevant literature published in the "Global South" are invisible.
  • Starting with Science: Strategies for Introducing Young Children to Inquiry, 1st Edition (PDF/EPUB ebook)
    Starting with Science: Strategies for Introducing Young Children to Inquiry, 1st Edition. Marcia Talhelm Edson. ISBN 9781571108074. PDF, EPUB, EBOOK.
    The presentation of the material is as good as the material, utilizing Star Trek analogies, ancient wisdom and literature and so much more. Using Multivariate Statistics. Michael Gramling examines the impact of policy on practice in early childhood education. Part of a series on. Schauble and colleagues, for example, found that fifth grade students designed better experiments after instruction about the purpose of experimentation. For example, some suggest that learning about NoS enables children to understand the tentative and developmental NoS and science as a human activity, which makes science more interesting for children to learn (Abd-El-Khalick a; Driver et al.). Research on teaching and learning of nature of science. The authors begin with theory in a cultural context as a foundation. What makes professional development effective? Frequently, the term NoS is utilised when considering matters about science. This book is a documentary account of a young intern who worked in the Reggio system in Italy and how she brought this pedagogy home to her school in St. Taking Science to School answers such questions as:. The content of the inquiries in science in the professional development programme was based on the different strands of the primary science curriculum, namely Living Things, Energy and Forces, Materials and Environmental Awareness and Care (DES). Exit interview. Begin to address the necessity of understanding other (usually peer) positions before they can discuss or comment on those positions.
  • The Journal Impact Factor Should Not Be Discarded
    OPINION: Editing, Writing & Publishing. J Korean Med Sci 2017; 32: 180-182. https://doi.org/10.3346/jkms.2017.32.2.180
    The Journal Impact Factor Should Not Be Discarded
    Lutz Bornmann (Division for Science and Innovation Studies, Administrative Headquarters of the Max Planck Society, Munich, Germany) and Alexander I. Pudovkin (A. V. Zhirmunsky Institute of Marine Biology, Far Eastern Branch of Russian Academy of Sciences, Vladivostok, Russian Federation)
    Journal editors and experts in scientometrics are increasingly concerned with the reliability of the Journal Impact Factor (JIF, Clarivate Analytics, formerly the IP & Science business of Thomson Reuters) as a tool for assessing the influence of scholarly journals. A paper by Larivière et al. (1), which was reposited on the bioRxiv portal and commented on in Nature (2), reminded all stakeholders of science communication that the citability of most papers in an indexed journal deviates significantly from its JIF. These authors recommend to display the journal citation distribution instead of the JIF, and the proposal is widely discussed on social networking platforms (3,4). […] calculation. In contrast to the JIF, the new CiteScore metric considers the papers from 3 years (instead of 2 years). As such, the JIF (and also the CiteScore) covers a rather short term of interest toward papers (i.e., interest at the research front) and overlooks long-term implications of publication activity (the so-called sticky knowledge) (7). Focus on the short-term attention of the field-specific community makes sense, since the JIF was initially designed to guide librarians to purchase the most used modern periodicals for their libraries. Accordingly, the JIF cannot and should not be employed for evaluating the […]
    (A minimal code sketch contrasting the 2-year and 3-year citation windows described here appears after this list.)
  • Science Studies Probing the Dynamics of Scientific Knowledge
    Sabine Maasen / Matthias Winterhager (eds.), Science Studies: Probing the Dynamics of Scientific Knowledge. Bielefeld: transcript Verlag, 2001. ISBN 3-933127-64-5.
    This work is licensed under a Creative Commons Attribution-NonCommercial-NoDerivatives 3.0 License.
    Die Deutsche Bibliothek – CIP-Einheitsaufnahme: Science studies: probing the dynamics of scientific knowledge / Sabine Maasen / Matthias Winterhager (ed.). Bielefeld: transcript, 2001.
    © 2001 transcript Verlag, Bielefeld. Cover design: Kordula Röckenhaus, Bielefeld. Typesetting: digitron GmbH, Bielefeld. Printing: Digital Print, Witten.
    To Peter Weingart and, of course, Henry Holorenshaw.
    Contents: Introduction: Science Studies. Probing the Dynamics of Scientific Knowledge (Sabine Maasen and Matthias Winterhager). Eugenics: Looking at the Role of Science Anew. A Statistical Viewpoint on the Testing of Historical Hypotheses: The Case of Eugenics (Diane B. Paul). Humanities: Inquiry Into the Growing Demand for Histories. Making Sense (Wolfgang Prinz). Bibliometrics: Monitoring Emerging Fields. A Bibliometric Methodology for Exploring Interdisciplinary, 'Unorthodox' Fields of Science.
  • Science and Its Significant Other: Representing the Humanities in Bibliometric Scholarship
    Science and its significant other: Representing the humanities in bibliometric scholarship
    Thomas Franssen & Paul Wouters (CWTS, Leiden University, The Netherlands)
    1. Introduction
    Bibliometrics offers a particular representation of science (Wouters, 1999; Nicolaisen, 2007). Through bibliometric methods a bibliometrician will always highlight particular elements of publications, and through these elements operationalize particular representations of science, while obscuring other possible representations from view. Understanding bibliometrics as representation implies that a bibliometric analysis is always performative; a bibliometric analysis brings a particular representation of science into being that potentially influences the science system itself (e.g. Wyatt et al., 2017). The performative effects of bibliometrics have been studied primarily in relation to individual researchers' behavior and how pervasive representations (and particular indicators) might influence this (De Rijcke et al., 2016). How bibliometrics influences the ways we think about, compare and contrast different scientific domains in general has, however, not been systematically analyzed. The pervasiveness of bibliometric representations of science in the contemporary science system warrants such a study. Moreover, a systematic, historical view of the development of bibliometrics might also offer this scientific community a better understanding of itself as well as the future of the discipline. We are in particular interested in the ways the humanities have been represented throughout the history of bibliometrics, often in comparison to other scientific domains or to a general notion of 'the sciences'. Earlier reviews of bibliometric literature pertaining to the humanities exist (Nederhof, 2006; also part 2.3 in Moed, 2006; Huang & Chang, 2008; Ardanuy, 2013) but have been predominantly methodological in nature. They ask what bibliometric methods are suitable to use for research evaluation in the humanities (and social sciences) but do not engage with the question of representation.
  • OECD Compendium of Bibliometric Science Indicators
    OECD Compendium of Bibliometric Science Indicators
    Note from the Secretariat: This document contains the final version of the OECD Compendium of Bibliometric Science Indicators. The report brings together a new collection of statistics depicting recent trends and the structure of scientific production across OECD countries and other major economies that supports indicators contained in the 2015 OECD Science, Technology and Industry Scoreboard. This report was prepared in partnership between the OECD Directorate for Science, Technology and Innovation (DSTI) and the SCImago Research Group (CSIC, Spain). It was presented to the Committee for Scientific and Technological Policy (CSTP) and National Experts in Science and Technology Indicators (NESTI) delegates for comment and approval. This paper was approved and declassified by written procedure by the Committee for Scientific and Technological Policy (CSTP) in May 2016 and prepared for publication by the OECD Secretariat.
    Note to Delegations: This document is also available on OLIS under the reference code DSTI/EAS/STP/NESTI(2016)8/FINAL.
    This document and any map included herein are without prejudice to the status of or sovereignty over any territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area. The statistical data for Israel are supplied by and under the responsibility of the relevant Israeli authorities or third party. The use of such data by the OECD is without prejudice to the status of the Golan Heights, East Jerusalem and Israeli settlements in the West Bank under the terms of international law.
    Please cite this publication as: OECD and SCImago Research Group (CSIC) (2016), Compendium of Bibliometric Science Indicators.
  • Investigating the Apparatus: The Social Science Citation Index
    Econ Journal Watch, Volume 1, Number 1, April 2004, pp. 134-165.
    Investigating the Apparatus. The Social Science Citation Index: A Black Box—with an Ideological Bias?
    Daniel B. Klein* with Eric Chiang**
    Abstract, keywords, JEL codes.
    The Social Science Citation Index (SSCI) is a database of scholarly literature. SSCI is used in many important ways. The most conspicuous use of SSCI is showing whose work gets cited in other research. Which other research? The articles published in journals on the SSCI journal list. As of June 2003, SSCI included 1,768 journals. Your citation count is the number of times your work has been cited by articles in SSCI journals (actually, in those journals during the years that the journal was included in the SSCI).1
    * Santa Clara University, Santa Clara, CA. ** Note on authorship: Although two authors are shown above, this paper is written in the first-person singular. Daniel Klein led the investigation and wrote up the paper. He is the "I" here. Eric Chiang is a student at Santa Clara University. He did most of the data collection and some of the institutional investigation. We thank the following ISI employees for their help with understanding SSCI and checking facts: Chris Pasquini (Technical Support Representative), James Testa (Director of Editorial Development), and Rodney Yancey (Manager of Corporate Communications). For comments on an earlier draft, we thank Niclas Berggren, Dan Johansson, Per Hortlund, and Richard Johnsson.
    1 James Testa, Director of Editorial Development at Thomson ISI, writes in an e-mail correspondence to Eric Chiang: "When a journal is dropped it is from that moment forward."
    (A toy illustration of this indexing-coverage rule appears after this list.)
  • The Literature of Bibliometrics, Scientometrics, and Informetrics
    Scientometrics, Vol. 52, No. 2 (2001) 291-314. Jointly published by Akadémiai Kiadó, Budapest, and Kluwer Academic Publishers, Dordrecht.
    The literature of bibliometrics, scientometrics, and informetrics
    William W. Hood and Concepción S. Wilson, School of Information Systems, Technology and Management, The University of New South Wales, Sydney (Australia)
    Since Vassily V. Nalimov coined the term 'scientometrics' in the 1960s, this term has grown in popularity and is used to describe the study of science: growth, structure, interrelationships and productivity. Scientometrics is related to and has overlapping interests with bibliometrics and informetrics. The terms bibliometrics, scientometrics, and informetrics refer to component fields related to the study of the dynamics of disciplines as reflected in the production of their literature. Areas of study range from charting changes in the output of a scholarly field through time and across countries, to the library collection problem of maintaining control of the output, and to the low publication productivity of most researchers. These terms are used to describe similar and overlapping methodologies. The origins and historical survey of the development of each of these terms are presented. Profiles of the usage of each of these terms over time are presented, using an appropriate subject category of databases on the DIALOG information service. Various definitions of each of the terms are provided from an examination of the literature. The size of the overall literature of these fields is determined, and the growth and stabilisation of both the dissertation and non-dissertation literature are shown. A listing of the top journals in the three fields is given, as well as a list of the major reviews and bibliographies that have been published over the years.
  • Drowning by Numbers: on Reading, Writing and Bibliometrics
    Confero, Vol. 1, No. 1 (2013), pp. 19-44. doi:10.3384/confero13v1121207a
    Drowning by numbers: On reading, writing and bibliometrics
    Ylva Hasselberg
    The purpose of this text is manifold. The primary purpose is to look into the effects of marketization of academia on the reading habits of academics, which also demands a problematization of reading and its role in the process of creating new knowledge. The second purpose is to discuss and problematize the citation as a sign of intellectual debt. And the third, but not least important, purpose is to write a text that demands the reader to read in a manner that is necessary to learn, instead of writing it in a manner that is adapted to promoting "citability". And so of course, what I would like more than anything to teach the reader is that the only possible way forward, the only method of reproducing real scholarship in a commodified setting, is to live it yourself. This way of writing a text is my way of living real scholarship. If this does not agree with you – don't bother citing me.
    What I do
    Let me describe my work to you. I am not at all sure whether you are interested in my work, but I suspect that you can gain some satisfaction through comparing my work process with your own. It is always good to be given a point of reference from which you can reflect on yourself, isn't it? I'm a historian. There are many ways of being a historian.
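The Bornmann and Pudovkin entry above hinges on the length of the citation window: the JIF divides the citations a journal receives in a given year to material from the two preceding years by the number of items published in those two years, while the CiteScore variant described there uses a three-year window. The sketch below is a minimal illustration of that arithmetic with invented numbers; the function, the data, and the simplifications (a single citing year, no document-type filtering) are assumptions for exposition, not the databases' actual methodology.

```python
# Minimal sketch of the citation-window difference discussed in the
# Bornmann & Pudovkin entry: a JIF-style ratio over a 2-year window
# versus the 3-year window described for CiteScore. All numbers are
# invented; real JIF/CiteScore values also depend on database coverage
# and document-type rules not modelled here.

def window_impact(citations_received, items_published, year, window=2):
    """Citations received in `year` by items published in the preceding
    `window` years, divided by the number of items published in those
    years. Both arguments map publication year -> count."""
    pub_years = range(year - window, year)
    cites = sum(citations_received.get(y, 0) for y in pub_years)
    items = sum(items_published.get(y, 0) for y in pub_years)
    return cites / items if items else 0.0

# Hypothetical journal: items published per year, and citations each
# cohort received during 2016.
items_published = {2013: 110, 2014: 120, 2015: 130}
citations_in_2016 = {2013: 240, 2014: 310, 2015: 290}

jif_style = window_impact(citations_in_2016, items_published, 2016, window=2)
citescore_style = window_impact(citations_in_2016, items_published, 2016, window=3)
print(f"2-year window: {jif_style:.2f}, 3-year window: {citescore_style:.2f}")
# -> 2-year window: 2.40, 3-year window: 2.33
```

With these toy numbers the same journal scores 2.40 on the two-year window and 2.33 on the three-year one, which is exactly the kind of window-dependence the excerpt points to.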
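The Klein and Chiang entry defines a citation count operationally: only citations appearing in articles published in SSCI-listed journals count, and only for the years in which the citing journal was actually included in the index ("when a journal is dropped it is from that moment forward"). The toy sketch below illustrates that coverage rule; the journal names, years and data layout are invented for illustration and are not the real SSCI schema or API.

```python
# Toy illustration of the coverage rule described in the Klein & Chiang
# entry: a citation counts only if the citing article appeared in a
# journal while that journal was included in the index. Journal names,
# years and the data layout are invented; this is not the real SSCI schema.

# journal -> (first_year_indexed, last_year_indexed or None if still indexed)
coverage = {
    "Journal A": (1990, None),
    "Journal B": (1995, 2001),  # dropped from the index after 2001
}

# each citing record: (citing_journal, year_of_citing_article)
citations_to_my_work = [
    ("Journal A", 2003),
    ("Journal B", 1999),
    ("Journal B", 2004),  # journal already dropped -> not counted
    ("Journal C", 2002),  # journal never indexed -> not counted
]

def indexed_citation_count(citations, coverage, as_of_year=2004):
    """Count citations whose citing journal was indexed in the citing year."""
    count = 0
    for journal, year in citations:
        if journal not in coverage:
            continue
        start, end = coverage[journal]
        if start <= year <= (end if end is not None else as_of_year):
            count += 1
    return count

print(indexed_citation_count(citations_to_my_work, coverage))  # -> 2
```

In this example only two of the four citations survive the filter: one citing journal was never indexed, and another citation appeared after its journal was dropped.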