Positioning Open Access Journals in a LIS Journal Ranking

Jingfeng Xia

Jingfeng Xia is Assistant Professor in the School of Library and Information Science at Indiana University; e-mail: [email protected]. The author is grateful to Professor Debora Shaw for her help. © Jingfeng Xia, Creative Commons Attribution-NonCommercial (http://creativecommons.org/licenses/by-nc-sa/3.0/).

This research uses the h-index to rank the quality of library and information science journals between 2004 and 2008. Selected open access (OA) journals are included in the ranking to assess current OA development in support of scholarly communication. It is found that OA journals have gained momentum supporting high-quality research and publication, and some OA journals have been ranked as high as the best traditional print journals. The findings will help convince scholars to make more contributions to OA journal publications, and also encourage librarians and information professionals to make continuous efforts for library publishing.

According to the Budapest Open Access Initiative, open access (OA) denotes "its free availability on the public internet, permitting any users to read, download, copy, distribute, print, search, or link to the full texts of these articles, crawl them for indexing, pass them as data to software, or use them for any other lawful purpose, without financial, legal, or technical barriers other than those inseparable from gaining access to the internet itself."1 An OA journal, therefore, is one that is freely available online in full text; conversely, a non-OA journal requires a subscription and is not freely available.2 There are also scholarly journals that began as subscription titles and were later converted to OA.3

Academic journal ranking serves as an important criterion for the scholarly community in assessing research quality and for librarians in selecting the best publications for collection development. Because of the complexity of publication behaviors, various approaches have been developed to assist in journal ranking; among them, comparing rates of citation drawn from citation indexes is the most widely practiced and recognized across academic disciplines. ISI's Journal Citation Reports (JCR) is among the most used rankings, one that "offers a systematic, objective means to critically evaluate the world's leading journals, with quantifiable, statistical information based on citation data."4 Yet citation-based journal rankings such as JCR have included few open access journals on their lists, and of these limited OA journals, many either were recently converted to open access or are publicly available only with conditions.

The relative exclusion of OA journals creates two deficiencies for scholarly communication. First, such rankings may not accurately portray the full picture of journal publishing, and so fail to reflect ongoing advancement in scholarship. Second, they may discourage the open access movement by marginalizing the majority of OA journals. In fact, some OA journals have successfully built reputations, attracting high-quality articles and sizable numbers of citations.
This research is an attempt to add selected OA journals to journal quality rankings, using library and information science (LIS) as an example. Detecting the position of OA journals in journal rankings helps scholars recognize the progress of OA publishing and make active contributions to support the OA movement. Such rankings will also encourage librarians and information professionals to improve the existing library publishing enterprise and to make continuous efforts in journal publishing.

Background

In a series of articles, Mukherjee reports his studies on a group of 17 fully open access LIS journals published in the period 2000–2004.5 By calculating the citation rates of research papers in these journals, he uses journal impact factors and other indexes to evaluate their contribution to scholarship. The impact factors and total citation counts are matched against those available in JCR for traditional LIS journals in the same time span, with results indicating that some LIS OA journals have an impact comparable with JCR journals. To be specific, "the actual number of Web citations in the case of some of these for at least seven OA journals was a bit lower than ISI's high-ranked LIS journals, but much higher than median-ranked journals."6 His study is representative of the very few investigations that have tried to position OA journals in scholarly journal rankings. The problem with Mukherjee's research, however, is that it draws a simple comparison between two sets of data from different citation data services: one from ISI's Web of Science (WoS) and the other from Google Scholar (GS). A high degree of correlation between indexes from these two databases, as discovered by Meho and Yang,7 does not necessarily ensure the validity of a direct match between the absolute numbers; in absolute terms, GS returns more citations than both the Scopus and WoS databases.8

An easy solution is to calculate the citation indexes of both OA and non-OA journals from the same data source. WoS citation records do not allow direct comparison because many OA journal articles are not included; GS, however, offers a good dataset with which to work. Many studies have measured the quality of GS searches and the similarities between WoS and GS,9 and the correlation between datasets from the two sources is consistently found to be significant. Additional studies also recognize GS as a reliable citation provider for bibliometric work,10 particularly for the retrieval of OA article citations.11 Furthermore, GS returns more citations from conference papers, books, and dissertations/theses, which show evidence of wider scholarly impact. There are some disadvantages to using GS for citation analyses. For example, GS's coverage is better for academic fields (such as the social sciences, arts, and humanities) in which books and conference papers constitute a large part of formal publication,12 and, because of GS's short history, its database contains fewer citations for older publications.13 However, these limitations do not affect the use of GS for OA journal ranking here, because this analysis is based on recent data, from 2004 to 2008, and stays within a single discipline.

Instead of relying on journal impact factors to compare OA journals, this analysis adopts the h-index for journal ranking. The h-index is an improvement over simple measures that emphasize only the total number of citations or publications, and it works properly for comparing publications in the same field.14 Since it was first proposed in 2005, its benefits for quantifying the impact of research outcomes have been widely recognized.
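The h-index is simple to state and to compute: a journal (or an author) has index h if h of its papers have each received at least h citations. The minimal Python sketch below illustrates the calculation; the function and the sample citation counts are ours, for illustration only, and are not data from this study.

```python
def h_index(citations):
    """Return the largest h such that h papers have at least h citations each."""
    ranked = sorted(citations, reverse=True)  # most-cited papers first
    h = 0
    for rank, cites in enumerate(ranked, start=1):
        if cites >= rank:
            h = rank  # the top `rank` papers all have at least `rank` citations
        else:
            break
    return h

# A journal whose articles drew 10, 8, 5, 4, 3, and 2 citations has h = 4:
# four articles were cited at least four times each.
print(h_index([10, 8, 5, 4, 3, 2]))  # -> 4
```

Because h rises only when both the number of papers and their citation counts rise, a journal cannot score well on volume alone or on a single heavily cited article; this is the balance of quantity and quality discussed below.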
Studies show that, for citation analyses at the article level, "the Spearman rank order correlation between citation ranks and h-index (with self-citations excluded) was 0.9, significant at the 0.01 level";15 and, at the journal level, "the Spearman correlation between the ISI JIF and the h-index—used because both the JIF and h-index have non-normal distributions—is strong and very significant: 0.718 (p<0.000)."16 The h-index is especially known for its robust performance and for combining quantity (number of publications) and quality (citation rate) in a balanced way.17 The fact that the h-index works well for calculating journal citations over a definite period makes it an appropriate measure for the OA journal analysis in this research.

Various studies have tested the h-index and its value in assessing journal impact in particular academic fields. For example, using evidence of convergent and discriminatory validity, Hodge suggests the real utility of the h-index for social work journals and points to "its compatibility with the profession's applied research culture and its ability to be used with essentially all journals in which social workers publish."18 Harzing and van der Wal apply the h-index to assess journal impact in economics and business and find that it offers a more accurate and comprehensive measure of journal impact.19 Given the strong correlation found between a journal's impact factor and its quality,20 we are confident in applying the h-index to OA journal assessment.

The journal list for this research was drawn from four sources:

• JCR's collection of 61 journals, considered by ISI to represent the core journals in LIS, in the "Information Science & Library Science" category of the 2008 edition.
• Nisonger and Davis' list of journals, rated by LIS education deans and ARL library directors, to supplement the journal list in JCR.21
• The Directory of Open Access Journals collection of 116 OA journals in LIS, retrieved at the end of 2010.22
• Mukherjee's list of 17 OA journals, which are freely accessible in full text and represent a group of core OA LIS journals.

JCR's list served as the basic collection of LIS journals; it was re-evaluated and supplemented with relevant journals from the other sources listed above. The following criteria were used for journal selection (a sketch of the filtering logic appears after the list):

• Journals not published in English were removed, because neither the GS nor the WoS database properly handles names with diacritics, which are common in many other languages.
• OA journals first published after 2004 were removed, to give every journal the same time period to accrue citations. Exceptions include journals that were originally published in print and converted to open access after 2004.
• Non-peer-reviewed journals were removed. This decision was based on manual inspection of journal articles that are not research in nature.
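Taken together, these criteria amount to a chain of simple filters over candidate journal records. The Python sketch below expresses that logic; the field names and sample records are hypothetical stand-ins, since the article's actual selection was performed by manual review of the source lists.

```python
# Hypothetical journal records; the real selection was done by hand.
candidates = [
    {"title": "Information Research", "language": "English",
     "first_year": 1995, "peer_reviewed": True, "print_origin": False},
    {"title": "Revista Hipotetica", "language": "Spanish",
     "first_year": 2002, "peer_reviewed": True, "print_origin": False},
]

def keep(journal):
    # Criterion 1: English-language journals only (GS/WoS mishandle diacritics).
    if journal["language"] != "English":
        return False
    # Criterion 2: first published by 2004, unless a print journal
    # that converted to open access after 2004.
    if journal["first_year"] > 2004 and not journal["print_origin"]:
        return False
    # Criterion 3: peer-reviewed journals only.
    return journal["peer_reviewed"]

selected = [j for j in candidates if keep(j)]
print([j["title"] for j in selected])  # -> ['Information Research']
```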