Recommended publications
- Data policies of highly-ranked social science journals
  Mercè Crosas, Julian Gautier, Sebastian Karcher, Dessi Kirilova, Gerard Otalora, Abigail Schwartz
  Abstract: By encouraging and requiring that authors share their data in order to publish articles, scholarly journals have become an important actor in the movement to improve the openness of data and the reproducibility of research. But how many social science journals encourage or mandate that authors share the data supporting their research findings? How does the share of journal data policies vary by discipline? What influences these journals' decisions to adopt such policies and instructions? And what do those policies and instructions look like? We discuss the results of our analysis of the instructions and policies of 291 highly-ranked journals publishing social science research, where we studied the contents of journal data policies and instructions across 14 variables, such as when and how authors are asked to share their data, and what role journal ranking and age play in the existence and quality of data policies and instructions. We also compare our results to the results of other studies that have analyzed the policies of social science journals, although differences in the journals chosen and in how each study defines what constitutes a data policy limit this comparison. We conclude that a little more than half of the journals in our study have data policies. Economics journals are the most likely to have data policies and to mandate sharing, followed by political science/international relations and psychology journals. Finally, we use our findings to make several recommendations: policies should include the terms "data," "dataset" or more specific terms that make it clear what to make available; policies should describe the benefits of data sharing; journals, publishers, and associations need to collaborate more to clarify data policies; and policies should explicitly ask for qualitative data.
- D2.2: Research Data Exchange Solution
  SOMA: Social Observatory for Disinformation and Social Media Analysis (H2020-ICT-2018-2 / ICT-28-2018-CSA, project reference SOMA [825469])
  Deliverable D2.2: Research Data exchange (and transparency) solution with platforms. Work package WP2: Methods and Analysis for disinformation modeling. Type: Report. Dissemination level: Public. Date: 30/08/2019. Status: Final.
  Authors: Lynge Asbjørn Møller and Anja Bechmann, DATALAB, Aarhus University. Contributors: see fact-checking interviews and meetings in appendix 7.2. Reviewers: Noemi Trino and Stefano Guarino, LUISS Datalab, LUISS University.
  Document description: This deliverable compiles the findings and recommended solutions and actions needed to construct a sustainable data exchange model for stakeholders, focusing on a differentiated perspective: one for journalists and the broader community, and one for university-based academic researchers.
  Revision history: v0.1 (28/08/2019) consolidation of first draft, DATALAB, Aarhus University; v0.2 (29/08/2019) review, LUISS Datalab, LUISS University; v0.3 (30/08/2019) proofread, DATALAB, Aarhus University; v1.0 (30/08/2019) final version, DATALAB, Aarhus University.
  Executive summary: This report provides an evaluation of current solutions for data transparency and exchange with social media platforms, an account of the historic obstacles and developments within the subject, and a prioritized list of future scenarios and solutions for data access with social media platforms. The evaluation of current solutions and the historic accounts are based primarily on a systematic review of academic literature on the subject, expanded by an account of the most recent developments and solutions.
A Comprehensive Analysis of the Journal Evaluation System in China
A comprehensive analysis of the journal evaluation system in China Ying HUANG1,2, Ruinan LI1, Lin ZHANG1,2,*, Gunnar SIVERTSEN3 1School of Information Management, Wuhan University, China 2Centre for R&D Monitoring (ECOOM) and Department of MSI, KU Leuven, Belgium 3Nordic Institute for Studies in Innovation, Research and Education, Tøyen, Oslo, Norway Abstract Journal evaluation systems are important in science because they focus on the quality of how new results are critically reviewed and published. They are also important because the prestige and scientific impact of journals is given attention in research assessment and in career and funding systems. Journal evaluation has become increasingly important in China with the expansion of the country’s research and innovation system and its rise as a major contributor to global science. In this paper, we first describe the history and background for journal evaluation in China. Then, we systematically introduce and compare the most influential journal lists and indexing services in China. These are: the Chinese Science Citation Database (CSCD); the journal partition table (JPT); the AMI Comprehensive Evaluation Report (AMI); the Chinese STM Citation Report (CJCR); the “A Guide to the Core Journals of China” (GCJC); the Chinese Social Sciences Citation Index (CSSCI); and the World Academic Journal Clout Index (WAJCI). Some other lists published by government agencies, professional associations, and universities are briefly introduced as well. Thereby, the tradition and landscape of the journal evaluation system in China is comprehensively covered. Methods and practices are closely described and sometimes compared to how other countries assess and rank journals. Keywords: scientific journal; journal evaluation; peer review; bibliometrics 1 1. -
- The Journal Coverage of Web of Science and Scopus: A Comparative Analysis
  Philippe Mongeon and Adèle Paul-Hus, Université de Montréal, École de bibliothéconomie et des sciences de l'information, C.P. 6128, Succ. Centre-Ville, H3C 3J7 Montréal, Qc, Canada
  Abstract: Bibliometric methods are used in multiple fields for a variety of purposes, notably for research evaluation. Most bibliometric analyses have in common their data sources: Thomson Reuters' Web of Science (WoS) and Elsevier's Scopus. This research compares the journal coverage of both databases in terms of fields, countries and languages, using Ulrich's extensive periodical directory as a base for comparison. Results indicate that the use of either WoS or Scopus for research evaluation may introduce biases that favor Natural Sciences and Engineering as well as Biomedical Research to the detriment of Social Sciences and Arts and Humanities. Similarly, English-language journals are overrepresented to the detriment of other languages. While both databases share these biases, their coverage differs substantially. As a consequence, the results of bibliometric analyses may vary depending on the database used.
  Keywords: bibliometrics, citation indexes, Scopus, Web of Science, research evaluation
  Introduction: Bibliometric and scientometric methods have multiple and varied fields of application, ranging from information science, sociology and history of science to research evaluation and science policy (Gingras, 2014). Large-scale bibliometric research was made possible by the creation and development of the Science Citation Index (SCI) in 1963, which is now part of Web of Science (WoS) alongside two other indexes: the Social Science Citation Index (SSCI) and the Arts and Humanities Citation Index (A&HCI) (Wouters, 2006).
- Russian Index of Science Citation: Overview and Review
  Olga Moskaleva (Saint Petersburg State University, Saint Petersburg, Russia), Vladimir Pislyakov (National Research University Higher School of Economics, Moscow, Russia), Ivan Sterligov (National Research University Higher School of Economics, Moscow, Russia), Mark Akoev (Ural Federal University, Ekaterinburg, Russia), Svetlana Shabanova (Scientific Electronic Library, Moscow, Russia)
  Abstract: At the beginning of 2016 a new index was launched on the Web of Science platform: the Russian Science Citation Index (RSCI). The database is free for all Web of Science subscribers except those from the former Soviet Union republics. It includes publications from the 652 best Russian journals and is based on data from the Russian national citation index, the Russian Index of Science Citation (RISC). Although RISC was launched in 2005, there is still little information about it in the English-language scholarly literature. The aim of this paper is to describe the history, current structure and user possibilities of RISC. We also draw attention to novel features of RISC which are crucial to bibliometrics and unavailable in international citation indices.
  Introduction (history and main objectives of RISC): RISC was launched in 2005 as a government-funded project primarily aimed at creating a comprehensive bibliographic/citation database of Russian scholarly publishing for evaluation purposes, based on the Scientific Electronic Library (hereafter eLIBRARY.RU), which started as a full-text database of scholarly literature for grantholders of the Russian Foundation for Basic Research (RFBR).
- Rapid Changes of Serbian Scientific Journals: Their Quality, Visibility and Role in Science Locally and Globally
  Aleksandra Popovic, Sanja Antonic, Stela Filipi Matutinovic (University of Belgrade, University Library "Svetozar Markovic", Bulevar kralja Aleksandra 71, Belgrade, Serbia)
  Abstract: More than 400 scientific journals are published in Serbia, and 357 are covered by the Serbian Citation Index, the national citation index. The visibility of Serbian scientific journals in the Web of Science citation database has risen significantly in the last decade. The results of this analysis show the presence of Serbian journals in Web of Science, JCR, Scopus and the Serbian Citation Index (SCIndeks) over four years (2007 to 2010). The paper discusses different bibliometric parameters in Web of Science, JCR and the Scopus portal SCImago (citations, average citations per year, impact factor, Hirsch index) for Serbian journals. Bibliometric indicators that appear in the national citation index are also analyzed. The number of Serbian journals with an impact factor increased during the observed period. The impact of Serbian publishers rose remarkably in 2010, and Serbia now has two highly ranked journals.
  Keywords: scientific journals, citation analysis, Web of Science, Scopus, national citation index, Serbia
  Introduction: Journals are the main medium of scientific communication. The parameters that characterize them can be used to analyze scientific disciplines, both in separate regions and globally. It is also possible to follow the development of the scientific community itself, since the appearance of journals shows that the scientific community in a particular country has reached the level at which communication through journals is needed.
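  For readers unfamiliar with the bibliometric parameters listed in this abstract, the two-year journal impact factor can be written as follows (a standard textbook definition with notation introduced here, not a formula taken from this paper):

  ```latex
  % Two-year journal impact factor for year y (standard definition).
  % C_y(t): citations received in year y to items published in year t.
  % P_t:    number of citable items published in year t.
  \mathrm{JIF}_{y} \;=\; \frac{C_{y}(y-1) + C_{y}(y-2)}{P_{y-1} + P_{y-2}}
  ```

  For example (with made-up numbers), a journal that published 50 citable items in 2008 and 60 in 2009, and whose 2008 and 2009 items received 220 citations in 2010, would have a 2010 impact factor of 220 / 110 = 2.0.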
- Serbian Citation Index: The sustainability of a business model based on partnership between a non-profit web publisher and journal owners
  Milica Ševkušić, Biljana Kosanović, Pero Šipka
  In: ELPUB 2020, 24th edition of the International Conference on Electronic Publishing, April 2020, Doha, Qatar. DOI: 10.4000/proceedings.elpub.2020.16. HAL Id: hal-02544845, https://hal.archives-ouvertes.fr/hal-02544845, submitted on 16 Apr 2020.
  Introduction: The Open Access (OA) and Open Science (OS) movements have led to global changes in scholarly …
- Social Science & Medicine: Author Information Pack
  ISSN: 0277-9536. Contents: Description; Audience; Impact Factor; Abstracting and Indexing; Editorial Board; Guide for Authors.
  Description: Social Science & Medicine provides an international and interdisciplinary forum for the dissemination of social science research on health. We publish original research articles (both empirical and theoretical), reviews, position papers and commentaries on health issues, to inform current research, policy and practice in all areas of common interest to social scientists, health practitioners, and policy makers. The journal publishes material relevant to any aspect of health from a wide range of social science disciplines (anthropology, economics, epidemiology, geography, policy, psychology, and sociology), and material relevant to the social sciences from any of the professions concerned with physical and mental health, health care, clinical practice, and health policy and organization. We encourage material which is of general interest to an international readership.
  The journal publishes the following types of contribution:
  1) Peer-reviewed original research articles and critical analytical reviews in any area of social science research relevant to health and healthcare. These papers may be up to 9000 words including abstract, tables, figures, references and (printed) appendices as well as the main text; papers below this limit are preferred.
  2) Systematic reviews and literature reviews of up to 15000 words including abstract, tables, figures, references and (printed) appendices as well as the main text.
  3) Peer-reviewed short communications of findings on topical issues or published articles of between 2000 and 4000 words.
- The Latest Thinking About Metrics for Research Impact in the Social Sciences (A SAGE White Paper)
  SAGE Publishing. Based on a workshop convened by SAGE Publishing including representatives from the Alfred P. Sloan Foundation, Altmetric, the Center for Advanced Study in the Behavioral Sciences at Stanford University, Clarivate Analytics, Google, New York University, School Dash, SciTech Strategies, the Social Science Research Council, and the University of Washington.
  www.sagepublishing.com; share your thoughts at socialsciencespace.com/impact or using #socialscienceimpact
  Contents: Introduction; Impact Metrics Workshop (February 2019); Stakeholders: Academic (Individual), Academic (Institutional/Organizational), Societal, Commercial; Academia …
- Social Scientists' Satisfaction with Data Reuse
  Ixchel M. Faniel (OCLC Research), Adam Kriesberg (University of Michigan), Elizabeth Yakel (University of Michigan)
  Note: This is a preprint of an article accepted for publication in the Journal of the Association for Information Science and Technology, copyright © 2015. The work is licensed under a Creative Commons Attribution 4.0 International License (http://creativecommons.org/licenses/by/4.0/).
  Abstract: Much of the recent research on digital data repositories has focused on assessing either the trustworthiness of the repository or quantifying the frequency of data reuse. Satisfaction with the data reuse experience, however, has not been widely studied. Drawing from the information systems and information science literatures, we develop a model to examine the relationship between data quality and data reusers' satisfaction. Based on a survey of 1,480 journal article authors who cited Inter-University Consortium for Political and Social Research (ICPSR) data in papers published from 2008 to 2012, we found that several data quality attributes – completeness, accessibility, ease of operation, and credibility – had significant positive associations with data reusers' satisfaction. There was also a significant positive relationship between documentation quality and data reusers' satisfaction.
  Suggested citation: Faniel, I. M., Kriesberg, A., & Yakel, E. (2015). Social scientists' satisfaction with data reuse. Journal of the Association for Information Science and Technology. doi: 10.1002/asi.23480
  Introduction: In the past decade, digital data repositories in a variety of disciplines have increased in number and scope (Hey, Tansley, & Tolle, 2009).
- Journal Rankings in Sociology: Using the H Index with Google Scholar
  Jerry A. Jacobs, Department of Sociology and Population Studies Center, University of Pennsylvania. Forthcoming in The American Sociologist, 2016.
  Abstract: There is considerable interest in the ranking of journals, given the intense pressure to place articles in the "top" journals. In this article, a new index, h, and a new source of data – Google Scholar – are introduced, and a number of advantages of this methodology for assessing journals are noted. This approach is attractive because it provides a more robust account of the scholarly enterprise than do the standard Journal Citation Reports. Readily available software enables do-it-yourself assessments of journals, including those not otherwise covered, and enables the journal selection process to become a research endeavor that identifies particular articles of interest. While some critics are skeptical about the visibility and impact of sociological research, the evidence presented here indicates that most sociology journals produce a steady stream of papers that garner considerable attention. While the position of individual journals varies across measures, there is a high degree of commonality across these measurement approaches. A clear hierarchy of journals remains no matter what assessment metric is used. Moreover, data over time indicate that the hierarchy of journals is highly stable and self-perpetuating. Yet highly visible articles do appear in journals outside the set of elite journals. In short, the h index provides a more comprehensive picture of the output and noteworthy consequences of sociology journals than do standard impact scores, even though the overall ranking of journals does not markedly change.
  Interest in journal rankings derives from many sources.
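  As a concrete illustration of the h index this paper applies to journals, the short sketch below computes h from a list of per-article citation counts. This is a minimal sketch of the standard definition only; the function name and citation counts are hypothetical, and the paper's actual Google Scholar data and software are not reproduced here.

  ```python
  def h_index(citation_counts):
      """Largest h such that at least h items have h or more citations each."""
      ranked = sorted(citation_counts, reverse=True)  # most-cited articles first
      h = 0
      for rank, cites in enumerate(ranked, start=1):
          if cites >= rank:
              h = rank   # this article still counts toward h
          else:
              break      # once cites < rank, h cannot grow further
      return h

  # Hypothetical citation counts for eleven articles from one journal.
  counts = [52, 40, 33, 18, 12, 9, 7, 5, 3, 1, 0]
  print(h_index(counts))  # 7: seven articles have at least 7 citations each
  ```

  Because h is insensitive both to a handful of extremely highly cited papers and to the long uncited tail, it is often described as more robust than mean-based measures such as the impact factor.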
- Systematic Reviews in the Social Sciences: A Practical Guide
  Mark Petticrew and Helen Roberts
  © 2006 by Mark Petticrew and Helen Roberts. Blackwell Publishing: 350 Main Street, Malden, MA 02148-5020, USA; 9600 Garsington Road, Oxford OX4 2DQ, UK; 550 Swanston Street, Carlton, Victoria 3053, Australia. The right of Mark Petticrew and Helen Roberts to be identified as the Authors of this Work has been asserted in accordance with the UK Copyright, Designs, and Patents Act 1988. All rights reserved. No part of this publication may be reproduced, stored in a retrieval system, or transmitted, in any form or by any means, electronic, mechanical, photocopying, recording or otherwise, except as permitted by the UK Copyright, Designs, and Patents Act 1988, without the prior permission of the publisher. First published 2006 by Blackwell Publishing Ltd.
  Library of Congress Cataloging-in-Publication Data: Petticrew, Mark. Systematic reviews in the social sciences: a practical guide / Mark Petticrew and Helen Roberts.