
Ensuring excellence in scholarship for the 21st century: better research and publication practice behaviors by a broad spectrum of players

Symposium der Cochrane Deutschland Stiftung, Freiburg, Germany, 1st February 2019

David Moher (Twitter: @dmoher)
Centre for Journalology (http://www.ohri.ca/journalology/), Ottawa Hospital Research Institute; School of Epidemiology and Public Health, uOttawa; ORCID: 0000-0003-2434-4206

Disclosures of interests

• Co-editor-in-chief of Systematic Reviews; I receive a small stipend from BioMed Central
• Associate Director of the International Congress on Peer Review and Scientific Publication
• Member of the SRDR Governance Board
• Executive member, REWARD Alliance
• I have no other real or perceived conflicts of interest to declare

Take home messages

• Publication practices are not optimal, yet scientists are using them to advance their careers
• Cochrane (Germany) could act as a strong advocate to improve how scientists are evaluated for career advancement
  – Bring evidence to the table
• Advocate for meta-research and journalology
• Sign DORA
• Downplay the journal impact factor
• Advocate for stronger publication practices

“Our belief is that research funders, scientific societies, school and university teachers, professional medical associations, and scientific publishers (and their editors) can use this Series as an opportunity to examine more forensically why they are doing what they do...and whether they are getting the most value for the time and money invested in science.”

Kleinert S, Horton R. How should medical science change? Lancet 2014;383(9913):197–8
www.researchwaste.net

Problems with biomedical publications

• There is a reproducibility crisis
• Publication bias is on the rise
  – Statistically negative results have been eviscerated from the literature
• High prevalence of selective outcome reporting bias
• High prevalence of spin
• Endemic poor quality of reporting of biomedical research
• Researchers are publishing (publicly funded) research in deceptive journals
• Data sharing is uncommon

University rankings

• Yet researchers have seen their careers advance
• Universities use traditional research metrics to advance their cause (ranking schemes; league tables)
  – Advertise their ‘value’
  – Recruit faculty and students
  – Attract funding

Current criteria to assess scientists

• Little evidence to inform the discussion
  – What are the current criteria?
  – Are the current criteria evidence-based?
  – Should the criteria be evidence-based?
• I’ve not been able to identify a curated and annotated list of promotion and tenure (P&T) criteria
  – Nothing exists publicly
• Current criteria are not effective – the scientific literature is in a mess

Alperin JP, et al. Preprint: https://hcommons.org/deposits/objects/hc:21016/datastreams/CONTENT/content

170 Faculties of Medicine

• Faculties of Medicine or Biomedical Sciences were eligible for inclusion
  – Identified from the Leiden Rankings (2017); a 20% random sample was taken (n = 170)
• When available, each faculty guideline for assessing scientists for promotion and tenure was reviewed for the presence of
  – 5 traditional criteria
  – 7 progressive criteria

Rice DB, et al. unpublished

Traditional Incentives
1. Is any quantitative or qualitative mention made about publications required? If quantitative, please specify the requirement.
2. Is any quantitative or qualitative mention made about the specific authorship order in publications? If so, please specify the order (e.g., first, senior, single) required.
3. Is any mention made of journal impact factors? If quantitative, what are the minimum thresholds?
4. Is any mention made of grant funding? If quantitative, what are the minimum thresholds (i.e., amount of funding and/or number of grants as principal investigator)?
5. Is any mention made requiring that research is recognized at a national or international level? If so, please specify the requirement.

Progressive Incentives
1. Is any mention made of citations? If quantitative, what are the minimum thresholds? Are specific citation databases mentioned?
2. Is any mention made of data sharing? If quantitative, what are the minimum thresholds (e.g., percentage of data that is to be made available)?
3. Is any mention made of publishing in open access mediums? If quantitative, what are the minimum thresholds (e.g., percentage of studies to be published in open access journals)?
4. Is any mention made of registration (including the preregistration challenge) of studies? If yes, what are the minimum thresholds (e.g., percentage of studies that are to be registered)?
5. Is any mention made of adherence to reporting guidelines for publications? If so, are specific guidelines mentioned?
6. Is any mention made of alternative metrics for sharing research (e.g., social media and print media)? If so, are specific metrics mentioned?
7. Is any mention made of accommodations or adjustments to expectations due to extenuating circumstances? If so, please specify the description of accommodations (e.g., an extra year to defer tenure consideration) and the type of eligible circumstances (e.g., parental leave, medical leave)?
Assessing scientists for hiring, promotion and tenure

Moher, D et al. Eur J Clin Invest. 2016 May;46(5):383-5.

San Francisco Declaration on Research Assessment (DORA)
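DORA urges institutions to stop leaning on the journal impact factor (JIF) when assessing scientists. As a hedged arithmetic sketch (all citation counts below are invented for illustration), the JIF calculation works roughly like this:

```python
# Toy illustration of the journal impact factor (JIF) for one year.
# JIF for year Y = citations received in year Y to items the journal
# published in years Y-1 and Y-2, divided by the number of citable
# items it published in those two years. Counts are made up.

citations_in_2018_to = {2016: 300, 2017: 150}  # citations counted in 2018
citable_items = {2016: 100, 2017: 80}          # articles/reviews published

jif_2018 = sum(citations_in_2018_to.values()) / sum(citable_items.values())
print(jif_2018)  # 450 / 180 = 2.5
```

Note that a single number like this averages over a highly skewed citation distribution, which is one of DORA's core objections to using it for individual career assessment.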

The JIF of a journal in a given year is the average number of citations received that year by the articles the journal published in the preceding two years.

http://www.ascb.org/dora/
Curry S. Nature. 2018 Feb 8;554(7691):147.
Moher D, et al. PLOS Biology 2018 Mar 29;16(3):e2004089. doi: 10.1371

Why data sharing

Data sharing associated with added value

Piwowar HA, et al. (2007). PLoS ONE 2(3): e308.
N Engl J Med 2018;378:2202-11

Traditional and Progressive Incentives for Promotion and Tenure

[Figure: median number of criteria (0–5) for traditional vs progressive incentives at each level of evaluation: Assistant Professor, Associate Professor, Professor, Tenure]

Rice DB, et al. unpublished

Value to the community

From Twitter:

• My paper has only been cited 21 times, but the software I developed has been downloaded 4,300 times
  – Value to the community

University of Ghent

2nd January 2019: http://dariuszgalasinski.com/2019/01/02/ghents-choices/
Benedictus and Miedema. Nature 2016;538:453-454
https://www.bihealth.org/en/quest-center/mission-approaches/

Levers of power in science

• Journals (Cochrane)
  – Peer review and editorial oversight of manuscripts
• Funders (Cochrane)
  – 14% Canadian Institutes of Health Research
  – 25%-30% in Germany
• Patients and the public (Cochrane)
  – Increasingly important
• Academic institutions
  – The home of current and future generations of researchers

Sims et al. Heart. 2017 Nov 1. pii: heartjnl-2017-312165
Moher et al. BMC Med. 2017 Sep 11;15(1):167

Data-driven publication practices

• Register all protocols and make the results of all research available
  – Clinical and preclinical
• Stop switching primary outcomes covertly
• Promote reproducibility
• Share data, materials and code
• Use reporting guidelines
• Promote open science

Moher D, et al. PLOS Biology 2018 Mar 29;16(3):e2004089. doi: 10.1371

Make the results of research available

Chen R, et al. BMJ. 2016 Feb 17;352:i637

Replicating results

Open Science Collaboration. Science 349, aac4716 (2015). DOI: 10.1126/science.aac4716

Stanford University, Hoover Institution

• Delphi panel
• Face-to-face consensus meeting (January 2017)
  – N = 22
  – Multiple stakeholders

Responsible Indicators for Assessing Scientists (RIAS): Key Principles

1 Contributing to societal needs is an important goal of scholarship

2 Assessing faculty should be based on responsible indicators that reflect fully the contribution to the scientific enterprise

3 We should reward publishing and/or reporting all research completely and transparently, regardless of the results

4 The culture of Open Research needs to be rewarded

5 It is important to fund research that can provide an evidence base to inform optimal ways to assess science and faculty

6 Funding out-of-the-box ideas needs to be valued in promotion and tenure decisions

Moher D, et al. (2018) Assessing scientists for hiring, promotion, and tenure. PLoS Biol 16(3): e2004089. https://doi.org/10.1371/

Principle 2: Assessing faculty should be based on responsible indicators that reflect fully the contribution to the scientific enterprise.

Action: Assessing scientists should include indicators that can incentivize best publication practices, such as registration (e.g., publishing the protocol, registered reports), reproducible research reporting, contributions to peer review, etc. Such indicators should be measured as objectively and accurately as possible.

Implementation
• Academic institutions, funders, and journals should explicitly endorse efforts to reduce the importance of journal impact factors when assessing faculty (e.g., DORA; Leiden Manifesto) and make this known to their constituents.
  – German Cochrane Foundation could sign DORA
  – Substantially reduce the use of impact factors in the Cochrane Library
  – An explicit statement about not using these metrics when evaluating career advancement

Implementation

• Institutions will need to prepare the landscape to ease implementation:
  – To facilitate data sharing, it is likely that the FAIR (Findability, Accessibility, Interoperability and Reusability) principles will need to be in place.
  – The German Cochrane Foundation, in collaboration with German universities, could prepare educational outreach about FAIR.
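To make FAIR concrete for such outreach, here is a minimal, hypothetical sketch of what a FAIR-style metadata record for a shared dataset might carry. The field names and the toy check are assumptions for illustration, not a formal standard (real implementations use schemas such as DataCite):

```python
# Illustrative only: a hypothetical metadata record touching each FAIR
# principle. Field names are invented; identifiers use example values.
dataset = {
    # Findability: a globally unique, persistent identifier plus rich metadata
    "identifier": "https://doi.org/10.5555/example-dataset",  # 10.5555 = test prefix
    "title": "Trial outcome data (anonymized)",
    "keywords": ["clinical trial", "outcomes", "open data"],
    # Accessibility: retrievable via a standard, open protocol
    "access_url": "https://repository.example.org/datasets/example",
    "access_protocol": "HTTPS",
    # Interoperability: a formal, broadly used format and vocabulary
    "format": "CSV",
    "metadata_standard": "DataCite Metadata Schema",
    # Reusability: a clear license and provenance trail
    "license": "CC-BY-4.0",
    "provenance": "Collected 2016-2018; cleaning steps documented in README",
}

def is_fair_ish(record):
    """Toy check: does the record carry one minimum field per principle?"""
    required = ["identifier", "access_url", "format", "license"]
    return all(bool(record.get(field)) for field in required)

print(is_fair_ish(dataset))  # True for this record
```

A checklist-style record like this is the kind of artifact educational outreach could ask researchers to produce alongside each deposited dataset; repositories would enforce the real schema.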

Implementation

• Data sharing is not without costs
  – Cochrane could advocate for universities to make funds available to help researchers prepare data for sharing
• Cochrane could advocate for universities, particularly promotion and tenure committees, to add best publication practices to their assessment criteria
  – RIAS

Experiment with CVs to reward best publication practices

Steven Beyer Roberts http://faculty.washington.edu/sr320

European Union. Evaluation of Research Careers fully acknowledging Open Science Practices: Rewards, incentives and/or recognition for researchers practicing Open Science, 2017. https://bit.ly/2HmMkK8

Could Cochrane more strongly advocate for:
• Reproducibility
• Data sharing
• Better publication practices
• Using different criteria when assessing scientists for career advancement
• Experimenting with CVs

Thank you 