Reporting checklists
For the course on advanced empirical methods for reproducible science, 2017

Dr Trish Groves, [email protected], Twitter @trished
Director of academic outreach, BMJ; Editor in chief, BMJ Open; Honorary deputy editor, The BMJ

Competing interests

I’m editor in chief of BMJ Open and Director of academic outreach at BMJ Publishing Group, owned by the British Medical Association (BMA)

Part of the revenue for BMJ comes from drug & device manufacturers through advertising, reprint sales, & sponsorship. The BMJ (British Medical Journal) and BMJ Open are open access journals that charge article publishing fees. I’m editorial lead for the BMJ Research to Publication eLearning programme (by subscription).

My annual bonus scheme is based partly on the overall financial performance of both BMJ and BMJ Research to Publication.

What I’ll cover

The problem of poorly reported research

Why this matters

What can be done to fix it

How to make studies more useful and usable

85% of research is wasted, costing >$100bn/yr

Chalmers I, Glasziou P. Avoidable waste in the production and reporting of research evidence. Lancet 2009; 374: 86-9. REWARD Alliance http://researchwaste.net/about/

Why journals reject research

What are the main reasons for journal editors to reject a research paper, even if it is well written and presented?

• the research question isn’t sufficiently new, interesting, relevant

• the question hasn’t been answered using the best possible study design so the results may be unreliable

• investigators often lack training on developing research questions, choosing study designs, and reporting research transparently

Editors focus on the methods section

Editors want to publish papers where:
• the research question is clear, interesting, new, and relevant to the journal’s readers
• correct methods were used to answer the question

If the question is good and the methods are right, then the editor should be interested in the paper even if the study result was “negative” or simple.

Methods matter at less selective journals too

At large open access journals such as PLOS ONE and BMJ Open:
• reviewers and editors select studies with good enough methods, clear writing, cautious interpretation. It’s OK if a study is simple
• originality, importance, or relevance not judged
• a paper’s importance becomes clear after publication through comments, cites, downloads, shares, uses

Methods section helps readers make decisions

Was the study capable of answering the research question? Was it reliable?

Can the study be replicated, refuted, or extended? Is it worth citing?

Can the intervention or method be adopted into clinical practice, health policy, or healthcare?

Can the study be included in a systematic review?

Can the study support clinical practice guidelines?

Editors want papers where the research question matches the study design

Adapted from Centre for Evidence Based Medicine, Oxford, UK www.cebm.net
• descriptive studies answer “what’s happening?”
• analytic observational studies answer “why, how, or when is it happening?”
• analytic experimental studies (randomised trials) answer “can it work?” or “does it work?”

ICMJE recommendations on methods I

The methods section should:
• aim to be sufficiently detailed such that others with access to the data would be able to reproduce the results
• include only information that was available at the time the plan or protocol for the study was being written
• include eligibility and exclusion criteria and a description of the source population
• if the population was non-representative, give a justification

http://www.icmje.org/recommendations/browse/manuscript-preparation/preparing-for-submission.html#d

ICMJE recommendations on methods II

• specify primary and secondary outcomes
• give references to established methods, including statistical methods
• provide references and brief descriptions for methods that have been published but are not well-known
• describe new or substantially modified methods, give the reasons for using them, and evaluate their limitations

TOP Guidelines from the Center for Open Science https://cos.io/our-services/top-guidelines/

What is a reporting guideline?

“A checklist, flow diagram, or explicit text to guide authors in reporting a specific type of research, developed using explicit methodology”

• these are evidence based
• they recommend a minimum set of items for reporting a particular study design
• the text is usually called a statement, eg the CONSORT statement
• checklists follow IMRAD format (Introduction, Methods, Results, and Discussion)

How reporting guidelines are developed

• identify the need for a guideline
• review published and grey literature, ideally systematically
• get funding
• convene a group for a consensus exercise: Delphi survey, face to face meeting, follow up
• draft a checklist to discuss and hone in the consensus survey and meeting(s)
• pilot test the resulting checklist
• develop statement + checklist + flowchart + supporting materials
• publish it all, ideally with open access, and disseminate/promote widely
• monitor and evaluate implementation

Moher D, Schulz KF, Simera I, Altman DG (2010) Guidance for Developers of Health Research Reporting Guidelines. PLOS Medicine 7(2): e1000217. doi:10.1371/journal.pmed.1000217

Where to find reporting guidelines
http://www.equator-network.org/
Reporting guidelines for many types of study

How to use reporting guidelines

• download the correct guideline from the EQUATOR Network online library
• translations are available for some, eg CONSORT
• use the reporting guideline to help you write or edit your manuscript: remember, it’s a minimum set
• if the guideline includes a flowchart, make it figure 1
• complete the checklist, with page numbers to show where items appear in your manuscript
• “Tell us what you did”
• if items are irrelevant or weren’t done, explain why
• submit your completed checklist with your paper, if the journal requests it

Randomised controlled trial (RCT)

The classic design is a parallel group trial where:

• participants allocated randomly to intervention and control groups
• any difference is likely to be caused by the intervention
• participants remain unaware of which intervention they had until the study is completed (single blind*)...
• ...and the outcome is evaluated by people who do not know which participant is in which group (double blind*)
• analysis focuses on estimating the size of the difference in predefined outcomes between intervention groups, to see which intervention, if any, is better (superiority)

* “blinding” is also known as “masking”

The reporting guideline for parallel group trials is CONSORT = CONsolidated Standards Of Reporting Trials
http://www.equator-network.org/ and http://www.consort-statement.org/

The statement includes this flowchart to show how participants took part.

It also provides an authors’ checklist, available in Chinese, French, German, Greek, Italian, Japanese, Persian, Polish, Portuguese, Russian, Spanish, Turkish

The checklist looks like this

[CONSORT checklist image, page 1 of 2]

CONSORT 2010: items about RCT methods

Item number | Checklist item
3a & b | Trial design
4a & b | Participants
5 | Interventions (also see TIDieR checklist)
6a & b | Outcomes
8a & b | Randomisation: sequence generation (see sketch below)
9 | Randomisation: allocation concealment mechanism
10 | Randomisation: implementation
11a & b | Blinding
12a & b | Statistical methods
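To make items 8a & b concrete, here is a minimal, hypothetical Python sketch of computer-generated permuted-block randomisation; the block size, arm labels, and fixed seed are illustrative assumptions, not CONSORT requirements.

# Minimal sketch (not part of CONSORT): computer-generated allocation
# sequence using randomly permuted blocks, illustrating CONSORT items 8a & b.
# Block size, arm labels, and seed below are illustrative assumptions.
import random

def block_randomisation(n_participants, block_size=4,
                        arms=("intervention", "control"), seed=2017):
    """Return an allocation sequence built from randomly permuted blocks."""
    assert block_size % len(arms) == 0, "block size must be a multiple of the number of arms"
    rng = random.Random(seed)      # fixed seed keeps the sequence reproducible and auditable
    per_arm = block_size // len(arms)
    sequence = []
    while len(sequence) < n_participants:
        block = [arm for arm in arms for _ in range(per_arm)]
        rng.shuffle(block)         # permute allocations within each block
        sequence.extend(block)
    return sequence[:n_participants]

print(block_randomisation(12))     # eg ['control', 'intervention', ...]

In a real trial the sequence would be generated by someone independent of recruitment and kept concealed from recruiters until each participant is enrolled (item 9).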

The CONSORT website includes this real paper, marked up to show how to use the checklist: http://www.consort-statement.org/Documents/SampleView/5eb5d30c-fc53-4708-b0cd-80ab76fe95a5

TIDieR: Template for Intervention Description and Replication
Reviewers, editors, readers (and ultimately clinicians, patients, systematic reviewers, policy makers, and payers) often need these details. TIDieR aids reporting

1. Brief name
2. Why
3. What (materials)
4. What (procedures)
5. Who provided
6. How
7. Where
8. When and how much
9. Tailoring
10. Modifications
11. How well (planned assessment)
12. How well (actual assessment)

http://www.equator-network.org/reporting-guidelines/tidier/

SPIRIT 2013 statement: Standard Protocol Items: Recommendations for Interventional Trials

http://www.spirit-statement.org/

STROBE Statement: STrengthening the Reporting of OBservational studies in Epidemiology http://www.strobe-statement.org/

There are several slightly different versions for cohort, case-control, and cross-sectional studies.

Methods section for an observational epidemiological study: STROBE checklist items

Item number | Checklist item
4 | Study design (cohort, case-control, cross-sectional)
5 | Setting
6a & b | Participants
7 | Variables
8 | Data sources/measurement
9 | Bias
10 | Study size
11 | Quantitative variables for analysis
12a,b,c,d,e | Statistical methods

http://www.strobe-statement.org/index.php?id=available-checklists
Also in Chinese, German, Greek, Italian, Japanese, Persian, Portuguese, Spanish

Other study designs and their reporting guidelines/frameworks

Study design | Reporting guideline
preclinical study | ARRIVE
quality improvement study | SQUIRE
qualitative research study | COREQ
modelling study of prognosis or diagnosis | TRIPOD
health economics study | CHEERS
phase IV drug study | ICH E8
genetic association study | STREGA
genetic risk prediction study | GRIPS
tumor marker prognostic study | REMARK

Library of reporting guidelines: http://www.equator-network.org/library/

Nature reporting checklist for life sciences studies

Summary of general reporting guidelines for study methods

Animal studies:
• Report species, strain, sex and age of animals
• For experiments involving live vertebrates, include a statement of compliance with ethical regulations and identify the committee(s) approving the experiments
• We recommend consulting the ARRIVE guidelines

Human studies:
• Identify the committee(s) approving the study protocol
• Include a statement confirming that informed consent was obtained from all subjects
• For publication of patient photos, include a statement confirming that consent was obtained
• Report the registration number
• For phase II and III randomized controlled trials, please refer to the CONSORT statement and submit the CONSORT checklist with your submission
• For tumor marker prognostic studies, we recommend the REMARK reporting guidelines

http://www.nature.com/authors/policies/checklist.pdf

Nature reporting checklist for life sciences studies

Statistics and general methods
1. Sample size: how was it chosen to ensure adequate power to detect a pre-specified effect size? (see the sketch after this list)
2. Inclusion/exclusion criteria, if samples or animals were excluded from the analysis.
3. If a method of randomization was used to determine how samples/animals were allocated to experimental groups and processed, describe it.
4. If the investigator was blinded to the group allocation during the experiment and/or when assessing the outcome, state the extent of blinding.
5. Are statistical tests justified as appropriate? Do the data meet the assumptions of the tests? Is there an estimate of variation within each group of data? Is the variance similar between the groups that are being statistically compared?
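As a worked example for item 1, here is a minimal, hypothetical Python sketch of an a-priori sample-size calculation using statsmodels; it assumes a two-sided two-sample t-test, and the effect size, alpha, and power values are illustrative choices, not values prescribed by the checklist.

# A-priori sample size for a two-arm comparison, assuming a two-sided
# two-sample t-test. Effect size, alpha, and power are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()
n_per_group = analysis.solve_power(
    effect_size=0.5,           # pre-specified standardised effect size (Cohen's d)
    alpha=0.05,                # two-sided significance level
    power=0.80,                # desired probability of detecting the effect
    alternative="two-sided",
)
print(f"Participants needed per group: {n_per_group:.0f}")   # about 64 per group

Reporting the pre-specified effect size alongside the resulting group size lets reviewers judge whether the study was adequately powered.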

http://www.nature.com/authors/policies/checklist.pdf

ARRIVE checklist: items for the methods section of an animal research paper

Animal Research: Reporting of In Vivo Experiments http://www.nc3rs.org.uk/arrive-guidelines

Thanks
[email protected]

Twitter @trished

Exercise: assess the reporting of the intervention using TIDieR

Template for intervention description and replication