Adoption of PubMed Commons as a Forum for Post-Publication Peer Review

Total Pages: 16

File Type: PDF, Size: 1020 KB

Poster 26

Adoption of PubMed Commons as a forum for post-publication peer review

Paul Lane, Director of Social Media and Web-Based Information, Envision Pharma Group, Horsham, UK

Abstract

Objective: To assess the level of adoption of PubMed Commons, a pilot system allowing comments to be added to publications listed in the PubMed database by anyone who is an author on a PubMed-indexed publication, since launching in October 2013.

Research design and methods: A search was conducted using the term “has_user_comments[filter]” to capture all items with comments within the PubMed database between the launch and 21 September 2014. Publications receiving comments were then sorted by type of publication (as designated by PubMed filters – clinical trials, reviews and free full text), publication year and journal.

Results: A total of 1607 publications had received comments since the launch of Commons; they were almost all English language (99%) and approximately half were available as free full text (49%). Although publications as early as 1973 received comments, most were from 2013/2014 (58%). Over half were human studies (54%), but those classified as clinical trials and systematic reviews/meta-analyses accounted for only 8% and 9%, respectively. The top five journals for publications with comments were PLoS One (4.1%), New England Journal of Medicine (3.2%), Science (2.2%), Nature (2.0%) and Proceedings of the National Academy of Sciences U S A (1.9%).

Conclusions: Levels of adoption of PubMed Commons have been very poor, and only 565 of the 903,614 publications added to PubMed in 2014 garnered comments. Publications garnering comments often had open-access availability or were published in top-tier journals.

Introduction

• Scientific publication has always been based on a system of review and comment on published research, via pre-publication peer review and post-publication “letters to the editor”.
• Post-publication review now occurs much more rapidly, via social media (eg, Twitter), online scientific communities (eg, Faculty of 1000 or ResearchGate) or commenting forums within the publishing journal (eg, PLoS One or New England Journal of Medicine).
• However, there are inherent problems with this system: anonymity, comments appearing across multiple disparate venues, and no requirement that those commenting are qualified to do so.
• PubMed Commons is a pilot system launched in October 2013 that seeks to address these problems by allowing comments to appear directly beneath abstracts within the PubMed database.
• If adopted in a wholesale fashion, PubMed Commons would allow centralization of comments in the world’s largest searchable database of biomedical literature.
• However, unlike other venues, a comment can only be added by an individual who is an author on at least one PubMed-indexed article.
• This means that comments cannot be added anonymously and that the person commenting should be qualified to do so with a degree of scientific rigor.
• Purpose: To assess the level of adoption and the type of comments added via the PubMed Commons forum since launching in October 2013.

Methods

• A search was conducted using the term “has_user_comments[filter]” to capture all items with comments within the PubMed database between the launch of PubMed Commons and 30 November 2014 (a sketch of such a query appears after this list).
• Publications receiving comments were sorted by type of publication (as designated by PubMed filters – eg, clinical trials, reviews and free full text), publication year, and journal.
• The type of comment was assessed by random sampling of 10% of the publications, using a random number generator (random.org) to pick entries from the chronological listing of all articles with comments.
• Comments on these manuscripts were assigned to several non-exclusive groups (eg, “Link to supplementary data” or “Correction”) based on the content of the comments and the person who posted them (eg, author of the article or non-author); totals for each type of comment were then expressed as a percentage of the overall number of sampled comments (an illustrative tally appears after the Results below).
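The search and sampling steps above can be reproduced against NCBI's public E-utilities. The following is a minimal sketch, not the author's code: it assumes the standard esearch endpoint with JSON output, an arbitrarily chosen retmax of 5000, and Python's random module standing in for the random.org draw described in the Methods. Note that the has_user_comments filter dates from the PubMed Commons pilot and may return few or no records on today's PubMed.

```python
# Sketch only: query PubMed for records with user comments, then draw a ~10% sample.
import json
import random
import urllib.parse
import urllib.request

ESEARCH = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

def pmids_with_comments(retmax=5000):
    """Return (list of PMIDs, total count) for PubMed records carrying user comments."""
    params = urllib.parse.urlencode({
        "db": "pubmed",
        "term": "has_user_comments[filter]",  # filter from the PubMed Commons pilot
        "retmax": retmax,
        "retmode": "json",
    })
    with urllib.request.urlopen(f"{ESEARCH}?{params}") as resp:
        result = json.load(resp)["esearchresult"]
    return result["idlist"], int(result["count"])

pmids, total = pmids_with_comments()
# ~10% random sample of commented publications (the poster used random.org for this step)
sample = random.sample(pmids, k=round(0.10 * len(pmids))) if pmids else []
print(f"{total} commented publications; sampled {len(sample)} for manual coding")
```

The esearch count field gives the denominator directly; the sampled PMIDs would then be coded by hand into comment categories as described in the final Methods bullet.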
Results

• A total of 1771 publications had received comments since the launch of Commons; they were almost all English language (99%) and approximately half were available as free full text (49%; Figure 1).

[Figure 1. Proportion of articles with comments by publication type. FFT, free full text; PMC, PubMed Central.]

• Over half were human studies (57%), but those classified as clinical trials and systematic reviews/meta-analyses accounted for only 8% each.
• Although articles published as early as 1956 received comments, most were from 2013/2014 (58%; Figure 2).

[Figure 2. Number of articles with comments by year.]

• The top six journals publishing articles that elicited comments were PLoS One (4%), New England Journal of Medicine (3%), Proceedings of the National Academy of Sciences (3%), Science (2%), Nature (2%) and Journal of the American Medical Association (1.5%; Figure 3).

[Figure 3. Number of articles with comments by journal.]

• The random sampling of 177 articles yielded 281 individual comments; the highest number of comments received by any single article was 17, but the vast majority of articles had ≤2 comments.
• Only 8.5% of comments were made by an author of the article in question, and these were mostly (66.0% of cases) in response to questions by non-authors (Figure 4A).

[Figure 4A. Proportion of comments by nature of comment.]

• The most common (22.4%) type of comment was a detailed, referenced critique of the research, akin to a letter to the editor. More general, non-referenced comments were the second most common (16.7%), with links to supporting research (8.2%), conflicting research (6.0%) and blogs (7.1%) the next most common (Figure 4B).

[Figure 4B. Proportion of comments by nature of comment. COI, conflict of interest.]

• Other useful purposes of posted comments were linking to free full text of the articles, supplementary data sets, or the software used for a particular analysis.
• Potential problems highlighted by comments were corrections to the published research, omission of important citations, CONSORT issues, ClinicalTrials.gov issues (lack of NCT registration or posted results), plagiarism, and retractions.
• A general indicator of the quality of the comments was that only one of the 281 (0.4%) comments sampled was removed by the site moderator. However, conflicts of interest were declared for only two (0.7%) of the comments in the sample.
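To make the Figure 4B percentages concrete, here is an illustrative tally, with hypothetical coding rather than the study data, of how non-exclusive comment categories can be counted and expressed as a percentage of all sampled comments, as described in the Methods.

```python
# Illustrative only: tally non-exclusive comment categories per sampled comment.
from collections import Counter

# Each sampled comment carries one or more category labels
# (labels echo Figure 4B; the assignments themselves are made up).
coded_comments = [
    {"Detailed, referenced critique"},
    {"General comment", "Link to blog"},
    {"Link to supporting paper"},
    {"Detailed, referenced critique", "Link to conflicting paper"},
    {"Link to supplementary data"},
]

tally = Counter(label for labels in coded_comments for label in labels)
n_comments = len(coded_comments)
for label, count in tally.most_common():
    # Percentages are per sampled comment, so non-exclusive labels can sum to >100%.
    print(f"{label}: {100 * count / n_comments:.1f}%")
```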
Conclusions

• Levels of adoption of PubMed Commons have been very poor, and <0.05% of the publications added to PubMed in 2014 garnered comments.
• Publications garnering comments often had open-access availability or were published in top-tier journals. However, the comments that were posted generally displayed scientific rigor or fulfilled a scientifically useful purpose.
• If PubMed Commons is to fulfil its proposed role in the post-publication evaluation of scientific research, levels of adoption must improve and commenters must disclose any pertinent conflicts of interest.

Acknowledgments and Disclosures

I would like to thank Joanne King and Jason Pelc of the Envision Pharma Group for their assistance in the production of this poster.

An electronic version of this poster is available at http://bit.ly/1DDT14F

Presented at the International Society for Medical Publication Professionals European Meeting (January 20-21, 2015; London, UK) and Annual Meeting (April 27-29, 2015; Arlington, VA, USA).
Recommended publications
  • Google Scholar, Sci-Hub and Libgen: Could They Be Our New Partners?
    Purdue University Purdue e-Pubs Proceedings of the IATUL Conferences 2017 IATUL Proceedings Google Scholar, Sci-Hub and LibGen: Could they be our New Partners? Louis Houle McGill University, [email protected] Louis Houle, "Google Scholar, Sci-Hub and LibGen: Could they be our New Partners?." Proceedings of the IATUL Conferences. Paper 3. https://docs.lib.purdue.edu/iatul/2017/partnership/3 This document has been made available through Purdue e-Pubs, a service of the Purdue University Libraries. Please contact [email protected] for additional information. GOOGLE SCHOLAR, SCI-HUB AND LIBGEN: COULD THEY BE OUR NEW PARTNERS? Louis Houle McGill University Canada [email protected] Abstract Since its debut in November 2004, librarians have raised several criticisms of Google Scholar (GS), such as its inconsistency of coverage and its currency and scope of coverage. That may have been true in the early years of Google Scholar, but is it still true twelve years later? Is this sufficient reason to ignore it entirely in our information literacy programs, or to forgo evaluating its value against that of subscription-based abstracting and indexing services? In this era of severe budget constraints that libraries are facing, can we imagine substituting most or all of our subject databases with free access to Google Scholar for discoverability? How much overlap is there between our databases and Google Scholar? How reliable is Google Scholar? How stable is its content over time? Open Access is becoming the predominant means of access to peer-reviewed articles. Many new non-traditional tools (institutional repositories, social media and peer-to-peer sites) are available out there to retrieve the full text of peer-reviewed articles.
  • Exploratory Analysis of Publons Metrics and Their Relationship with Bibliometric and Altmetric Impact
    Exploratory analysis of Publons metrics and their relationship with bibliometric and altmetric impact José Luis Ortega Institute for Advanced Social Studies (IESA-CSIC), Córdoba, Spain, [email protected] Abstract Purpose: This study aims to analyse the metrics provided by Publons about the scoring of publications and their relationship with impact measurements (bibliometric and altmetric indicators). Design/methodology/approach: In January 2018, 45,819 research articles were extracted from Publons, including all their metrics (scores, number of pre- and post-publication reviews, reviewers, etc.). Using the DOI identifier, other metrics from altmetric providers were gathered to compare the scores of those publications in Publons with their bibliometric and altmetric impact in PlumX, Altmetric.com and Crossref Event Data (CED). Findings: The results show that (1) there are important biases in the coverage of Publons according to disciplines and publishers; (2) metrics from Publons present several problems as research evaluation indicators; and (3) correlations between bibliometric and altmetric counts and the Publons metrics are very weak (r<.2) and not significant. Originality/value: This is the first study of Publons metrics at the article level and their relationship with other quantitative measures such as bibliometric and altmetric indicators. Keywords: Publons, Altmetrics, Bibliometrics, Peer-review 1. Introduction Traditionally, peer review has been the most appropriate way to validate scientific advances. Since the beginning of the scientific revolution, scientific theories and discoveries were discussed and agreed upon by the research community as a way to confirm and accept new knowledge. This validation process has persisted to the present day as a suitable tool for accepting the most relevant manuscripts into academic journals, allocating research funds, or selecting and promoting scientific staff.
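    The design described above, linking Publons records to bibliometric and altmetric records through the DOI and then correlating the scores, can be sketched in a few lines. The DOIs, column names and values below are placeholders, not the study's data, and Spearman's rho stands in for whichever correlation the paper actually reports.

```python
# Sketch of a DOI-keyed merge plus rank correlation (placeholder data).
import pandas as pd
from scipy.stats import spearmanr

publons = pd.DataFrame({
    "doi": ["10.1000/a", "10.1000/b", "10.1000/c", "10.1000/d", "10.1000/e"],
    "publons_score": [7.5, 6.0, 8.2, 5.5, 9.0],
})
altmetrics = pd.DataFrame({
    "doi": ["10.1000/a", "10.1000/b", "10.1000/c", "10.1000/d", "10.1000/e"],
    "citations": [12, 3, 40, 1, 25],
})

# DOI is the join key used to line up the two sources
merged = publons.merge(altmetrics, on="doi", how="inner")
rho, p_value = spearmanr(merged["publons_score"], merged["citations"])
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f}), n = {len(merged)}")
```

    An inner join on DOI mirrors the paper's matching step: records lacking a DOI in either source simply drop out of the correlation.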
  • Do You Speak Open Science? Resources and Tips to Learn the Language
    Do You Speak Open Science? Resources and Tips to Learn the Language. Paola Masuzzo1, 2 - ORCID: 0000-0003-3699-1195, Lennart Martens1,2 - ORCID: 0000-0003-4277-658X Author Affiliation 1 Medical Biotechnology Center, VIB, Ghent, Belgium 2 Department of Biochemistry, Ghent University, Ghent, Belgium Abstract The internet era, large-scale computing and storage resources, mobile devices, social media, and their high uptake among different groups of people, have all deeply changed the way knowledge is created, communicated, and further deployed. These advances have enabled a radical transformation of the practice of science, which is now more open, more global and collaborative, and closer to society than ever. Open science has therefore become an increasingly important topic. Moreover, as open science is actively pursued by several high-profile funders and institutions, it has fast become a crucial matter to all researchers. However, because this widespread interest in open science has emerged relatively recently, its definition and implementation are constantly shifting and evolving, sometimes leaving researchers in doubt about how to adopt open science, and which are the best practices to follow. This article therefore aims to be a field guide for scientists who want to perform science in the open, offering resources and tips to make open science happen in the four key areas of data, code, publications and peer-review. The Rationale for Open Science: Standing on the Shoulders of Giants One of the most widely used definitions of open science originates from Michael Nielsen [1]: “Open science is the idea that scientific knowledge of all kinds should be openly shared as early as is practical in the discovery process”.
  • How Should Peer-Review Panels Behave? IZA DP No
    IZA DP No. 7024 How Should Peer-Review Panels Behave? Daniel Sgroi Andrew J. Oswald November 2012 DISCUSSION PAPER SERIES Forschungsinstitut zur Zukunft der Arbeit Institute for the Study of Labor How Should Peer-Review Panels Behave? Daniel Sgroi University of Warwick Andrew J. Oswald University of Warwick and IZA Discussion Paper No. 7024 November 2012 IZA P.O. Box 7240 53072 Bonn Germany Phone: +49-228-3894-0 Fax: +49-228-3894-180 E-mail: [email protected] Any opinions expressed here are those of the author(s) and not those of IZA. Research published in this series may include views on policy, but the institute itself takes no institutional policy positions. The IZA research network is committed to the IZA Guiding Principles of Research Integrity. The Institute for the Study of Labor (IZA) in Bonn is a local and virtual international research center and a place of communication between science, politics and business. IZA is an independent nonprofit organization supported by Deutsche Post Foundation. The center is associated with the University of Bonn and offers a stimulating research environment through its international network, workshops and conferences, data service, project support, research visits and doctoral program. IZA engages in (i) original and internationally competitive research in all fields of labor economics, (ii) development of policy concepts, and (iii) dissemination of research results and concepts to the interested public. IZA Discussion Papers often represent preliminary work and are circulated to encourage discussion. Citation of such a paper should account for its provisional character. A revised version may be available directly from the author.
  • American Journal of Public Health (AJPH) Instructions for Authors
    American Journal of Public Health (AJPH) Instructions for Authors “What AJPH Authors Should Know” ISSN: 1541-0048 (electronic); 0090-0036 (print) Publication Frequency: 12 issues per year plus supplements with external support Publisher: Georges C. Benjamin, MD, FACP, FACEP (Emeritus), Executive Director of the American Public Health Association (APHA) Editor-in-Chief: Mary E. Northridge, PhD, MPH Editorial Board Chair: Hector Balcazar, PhD GENERAL INFORMATION Mission Formal Submission and Informal Inquiries Timelines EDITORIAL AND PUBLICATION POLICIES Authorship Conflicts of Interest Nondiscriminatory Language CONSORT Statement TREND Statement Embargo Policy Revised Policy on Enhancing Public Access to Archived Publications Resulting from NIH- funded Research Copyright EDITORIAL AND PEER REVIEW CATEGORIES OF PAPERS Letters to the Editor and Responses Editor’s Choice Editorials On the Other Hand Statistically Speaking Field Action Reports Going Public Faces of Public Health Commentaries Analytic Essays Health Policy and Ethics Forum Government, Politics, and Law Framing Health Matters Public Health Then and Now Voices From the Past Images of Health Briefs Research and Practice Articles MANUSCRIPT PREPARATION AND SUBMISSION REQUIREMENTS Style Manuscript File Formats Cover Letter MANUSCRIPT COMPONENTS Title Page Abstracts Headings References Tables, Figures, and Images Statistics Supplemental Files Reproduced Material ETHICS COMPLIANCE GENERAL INFORMATION Mission Promoting public health research, policy, practice, and education is the foremost mission of the AJPH. We aim to embrace all of public health, from global policies to the local needs of public health practitioners. Contributions of original unpublished research and practice articles and briefs, social science and policy analyses (analytic essays), constructively critical commentaries, department papers, and letters to the editor and responses are welcome.
  • PUBLIC HEALTH in PRACTICE Putting Research Into Action
    PUBLIC HEALTH IN PRACTICE Putting research into action AUTHOR INFORMATION PACK TABLE OF CONTENTS • Description • Abstracting and Indexing • Editorial Board • Guide for Authors ISSN: 2666-5352 DESCRIPTION An official journal of The Royal Society for Public Health and a sister journal of Public Health. Public Health in Practice is a gold open access journal in the field of population health. The journal invites papers from across the broad public health subject area, with a particular focus on public health in practice, emerging topics, and policy debates. The journal will publish a range of study types, from study protocols through to meta-analyses, including commentaries and short communications. Whilst research focusing on the practical basis of public health delivery or providing a foundational understanding of theoretical issues in the field is the journal's core interest, robust and ethically sound articles on any public health topic are welcome. These can include (but are not limited to): Public health practice and impact; Applied epidemiology; Need or impact assessments; Health service effectiveness, management and re-design; Health protection including control of communicable diseases; Health promotion and disease prevention; Evaluation of public health programmes or interventions; Public health governance, audit and quality; Public health law and ethics; Public health policy and comparisons; Capacity in public health systems and workforce. The target market of the journal is researchers, practitioners, and policymakers who are engaged in public health practice, teaching, and research. Public Health in Practice adheres to strict ethical publication guidelines and actively supports a culture of inclusive and representative publication.
  • COVID-19 Preprints and Their Publishing Rate: an Improved Method
    medRxiv preprint doi: https://doi.org/10.1101/2020.09.04.20188771; this version posted September 7, 2020. The copyright holder for this preprint (which was not certified by peer review) is the author/funder, who has granted medRxiv a license to display the preprint in perpetuity. It is made available under a CC-BY-ND 4.0 International license. COVID-19 Preprints and Their Publishing Rate: An Improved Method Francois Lachapelle, PhD Candidate, Department of Sociology, University of British Columbia Abstract Context: As the COVID-19 pandemic persists around the world, the scientific community continues to produce and circulate knowledge on the deadly disease at an unprecedented rate. During the early stage of the pandemic, preprints represented nearly 40% of the English-language COVID-19 scientific corpus (6,000+ preprints | 16,000+ articles). As of mid-August 2020, that proportion had dropped to around 28% (13,000+ preprints | 49,000+ articles). Nevertheless, preprint servers remain a key engine in the efficient dissemination of scientific work on this infectious disease. But, given the ‘uncertified’ nature of the scientific manuscripts curated on preprint repositories, their integration into the global ecosystem of scientific communication is not without serious tensions. This is especially the case for biomedical knowledge, since the dissemination of bad science can have widespread societal consequences. Scope: In this paper, I propose a robust method that will allow the repeated monitoring and measuring of COVID-19 preprints' publication rate. I also introduce a new API called Upload-or-Perish. It is a micro-API service that enables a client to query a specific preprint manuscript's publication status and associated meta-data using a unique ID.
  • Open Science and the Role of Publishers in Reproducible Research
    Open science and the role of publishers in reproducible research Authors: Iain Hrynaszkiewicz (Faculty of 1000), Peter Li (BGI), Scott Edmunds (BGI) Chapter summary Reproducible computational research is and will be facilitated by the wide availability of scientific data, literature and code which is freely accessible and, furthermore, licensed such that it can be reused, integrated and built upon to drive new scientific discoveries without legal impediments. Scholarly publishers have an important role in encouraging and mandating the availability of data and code according to community norms and best practices, and developing innovative mechanisms and platforms for sharing and publishing products of research, beyond papers in journals. Open access publishers, in particular the first commercial open access publisher BioMed Central, have played a key role in the development of policies on open access and open data, and in increasing the use by scientists of legal tools – licenses and waivers – which maximize reproducibility. Collaborations between publishers and funders of scientific research are vital for the successful implementation of reproducible research policies. The genomics and, latterly, other ‘omics communities historically have been leaders in the creation and wide adoption of policies on public availability of data. This has been through policies, such as Fort Lauderdale and the Bermuda Principles; infrastructure, such as the INSDC databases; and incentives, such as conditions of journal publication. We review some of these policies and practices, and how these events relate to the open access publishing movement. We describe the implementation and adoption of licenses and waivers prepared by Creative Commons, in science publishing, with a focus on licensing of research data published in scholarly journals and data repositories.
  • Ethics of Peer Review: a Guide for Manuscript Reviewers
    Ethics of Peer Review: A Guide for Manuscript Reviewers Sara Rockwell, Ph.D. Departments of Therapeutic Radiology and Pharmacology, and Office of Scientific Affairs, Yale University School of Medicine, PO Box 208040, New Haven, CT 06510-8040 Abstract The peer review of scientific manuscripts is a cornerstone of modern science and medicine. Peer reviewed journals rely on expert and objective review by knowledgeable researchers to ensure the quality of the papers they publish. Moreover, the recommendations the reviewers make concerning publication and the advice they give to authors set the scientific standards of the discipline. In addition, their critiques set subtler standards of collegiality, behavior, and ethics, not only through their recommendations concerning which papers should be published and which should be rejected, but also through the tone and wording of their reviews and through the thought that they give to their scientific and ethical responsibilities as reviewers. The review of manuscripts for peer reviewed journals raises many ethical issues and problems. The reviewer should be aware of these when deciding whether to review a specific paper, throughout the process of handling the manuscript and writing the review, and even after the review is completed and submitted. This paper describes some of the ethical issues that can arise during the peer review process and discusses their implications. 1 INTRODUCTION The peer review of scientific manuscripts is a cornerstone of modern science and medicine. Peer reviewed journals rely on expert and objective review by knowledgeable researchers to ensure the quality of the papers they publish. Moreover, the collective activities of the investigators who review manuscripts in a scientific discipline serve to set the standards of that field.
  • Publons for Researchers Final
    Track more of your impact. Own your research identity. Publons Researcher Profiles “In today’s highly competitive research environment, I am frequently asked to demonstrate my impact. But keeping track of all my contributions is tedious, frustrating, and time-consuming. I have grown exhausted with keeping all of the different profiles and tools up-to-date.” Researcher feedback Track more of your research impact and own your online researcher identity Track your publications, citation metrics, peer reviews, and journal editing work in one easy-to-maintain profile. ● All your publications, instantly imported from Web of Science, ORCiD, or your bibliographic reference manager (e.g. EndNote, Zotero, or Mendeley) ● Trusted citation metrics, automatically imported from the Web of Science Core Collection ● Manage your publication records in Web of Science ● Your verified peer review and journal editing history, powered by partnerships with thousands of scholarly journals ● Downloadable record summarising your scholarly impact as an author, editor and peer reviewer. Unified Authentication A seamless user experience. Sign in to Publons, EndNote, and Web of Science (entitlements pending) using the same email and password. Modern profile Summary of key publication and peer review metrics Simple and intuitive user interface and clear information hierarchy making it easy for you to build, maintain, and navigate profiles. Navigate to detailed metrics, publication, or peer review and journal editing summaries Research fields, bio, institutional affiliations, and highly cited and peer review awards Most cited publications and citation counts Journals reviewed for and count of verified reviews performed Easy to add publications to your profile
  • Eigenfactor: Ranking and Mapping Scientific Knowledge
    Eigenfactor: ranking and mapping scientific knowledge Jevin D. West A dissertation submitted in partial fulfillment of the requirements for the degree of Doctor of Philosophy University of Washington 2010 Program Authorized to Offer Degree: Department of Biology University of Washington Graduate School This is to certify that I have examined this copy of a doctoral dissertation by Jevin D. West and have found that it is complete and satisfactory in all respects, and that any and all revisions required by the final examining committee have been made. Chair of the Supervisory Committee: Carl T. Bergstrom Reading Committee: Carl T. Bergstrom Benjamin B. Kerr Thomas L. Daniel Date: © Copyright 2010 Jevin D. West In presenting this dissertation in partial fulfillment of the requirements for the doctoral degree at the University of Washington, I agree that the Library shall make its copies freely available for inspection. I further agree that extensive copying of the dissertation is allowable only for scholarly purposes, consistent with fair use as prescribed in the U.S. Copyright Law. Requests for copying or reproduction of this dissertation may be referred to ProQuest Information and Learning, 300 North Zeeb Road, Ann Arbor, MI 48106-1346, 1-800-521-0600, to whom the author has granted the right to reproduce and sell (a) copies of the manuscript in microform and/or (b) printed copies of the manuscript made from microform. Signature Date University of Washington Abstract Eigenfactor: ranking and mapping the scholarly literature Jevin D. West Chair of the Supervisory Committee: Professor Carl T. Bergstrom Department of Biology Each year, tens of thousands of scholarly journals publish hundreds of thousands of scholarly papers, collectively containing tens of millions of citations.
  • What Is Peer Review?
    A “scholarly” or “academic” journal publishes articles written by experts for experts that are “peer reviewed.” What is peer review? Peer review is a process in which two or more experts in a field read an article and make suggestions for revisions before it is accepted for publication. To pass peer review, the article must give an original perspective on an important topic. It also must be thoroughly researched, logically argued, and well written. Most articles that are submitted to journals do not make it through peer review. The articles that do get published usually have extensive revisions. Most articles you see are the result of years of work by many people. To learn if a journal requires articles to be peer reviewed, locate the journal's information in the database you are using or on the journal's website. Why should I use scholarly articles? Scholarly articles are the most credible sources you can find because of the rigorous peer-review process. They are written by people who have studied this subject for many years and they have been reviewed by other people with similar experience. They are thoroughly researched, which means you can “mine” the article’s bibliography to find other sources that might be useful for your paper. Where can I find scholarly articles? The easiest way to find them is by using an EBSCO or JSTOR database. Access to these databases is usually provided by your campus library. If you use an EBSCO database, such as Academic Search Complete, check the “scholarly sources” box on the search page before you start your search.