Pragmatic Clinical Trials for Real-World Evidence: Concept and Implementation
Total pages: 16. File type: PDF, size: 1,020 KB.
Recommended publications
Real-World Data for Health Technology Assessment for Reimbursement Decisions in Asia: Current Landscape and a Way Forward
Commentary in the International Journal of Technology Assessment in Health Care (cambridge.org/thc). Authors: Jing Lou, Sarin KC, Kai Yee Toh, Saudamini Dabak, Amanda Adler, Jeonghoon Ahn, Diana Beatriz S. Bayani, Kelvin Chan, Dechen Choiphel, Brandon Chua, Anne Julienne Genuino, Anna Melissa Guerrero, Brendon Kearney, Lydia Wenxin Lin, Yuehua Liu, Ryota Nakamura, Fiona Pearce, Shankar Prinja, Raoh-Fang Pwu, Asrul Akmal Shafie, Binyan Sui, Auliya Suwantika, Yot Teerawattananon, Sean Tunis, Hui-Min Wu, John Zalcberg, Kun Zhao, Wanrudee Isaranuwatchai, and Hwee-Lin Wee. Lou and KC are joint first authors (contributed equally to the preparation of the first draft); Teerawattananon, Isaranuwatchai, and Wee are joint senior authors (contributed equally to conceptualizing and designing the study, project management, supervising data collection and analysis, critically revising the first draft, and approving the final version). Affiliations include the Saw Swee Hock School of Public Health, National University of Singapore; the Health Intervention and Technology Assessment Program (HITAP), Ministry of Health, Mueang Nonthaburi, Thailand; the National Institute for Health and Care Excellence (NICE), London, UK; Ewha Womans University, Seoul, South Korea; the Health Technology Assessment Unit, Department of Health, Manila, Philippines; and the Sunnybrook Odette Cancer Centre and Sunnybrook Research Institute, Toronto, Canada, among others. Cite this article: Lou J et al. (2020). Real-world data for health technology assessment for reimbursement decisions in Asia: current landscape and a way forward. International Journal of Technology Assessment in Health Care 36, 474–480.
Efficacy and Effectiveness Models
Graziano Onder, Centro Medicina dell'Invecchiamento, Università Cattolica del Sacro Cuore, Rome, Italy. EMA Workshop: Ensuring safe and effective medicines for an ageing population. Definition: Efficacy is the capacity to produce an effect; in medicine, it is the ability of an intervention or drug to produce a desired effect in expert hands and under ideal circumstances. Effectiveness is the capability of producing a desired result; in medicine, effectiveness relates to how well a treatment works in practice, as opposed to efficacy, which measures how well it works in RCTs or laboratory studies. Ideal or real patient? Researchers have largely shied away from the complexity of multiple chronic conditions (comorbidity, multiple drugs, impaired physical function, cognitive status, affective status, social status, incontinence, malnutrition, falls, osteoporosis), an avoidance that results in expensive, potentially harmful care of unclear benefit (Tinetti M, NEJM 2011). Effectiveness research addresses practical questions about an intervention as it would occur in routine clinical practice, preserving the 'ecology' of care: the hypothesis and study design are formulated based on the information needed to make a decision. Efficacy research aims to better understand how and why an intervention works (Tunis SR, JAMA 2003). Three key features differentiate effectiveness research (pragmatic or practical trials) from efficacy research (explanatory trials).
LITERATURE SEARCHING Veterinary Medicine Year 1
Presenter: Niala Dwarika-Bhagat (Email: [email protected]). Objectives: to identify the 5 steps of Evidence-Based Veterinary Medicine (EBVM); to locate and evaluate resources for medical and health literature searching, with a focus on database searching in PubMed (free database) and EBSCOhost Medline (UWI-subscribed databases), searching UWILinC, and locating e-journals by subject; to conduct effective hands-on online research; and to identify the FMS Vancouver referencing resources. What is evidence-based medicine? 'Evidence-based medicine (EBM) requires the integration of the best research with our clinical expertise and our patient's unique values and circumstances' (Straus, Sharon E. (2005). Evidence-based medicine: How to practice and teach EBM, p. 1) [Source: https://veterinary-practice.com/article/embracing-evidence-based-medicine]. The practice of evidence-based medicine is a process of self-directed, problem-based learning in which clinically relevant information about diagnosis, prognosis, and therapy is considered according to the best data available through research [Source: https://www.medindia.net/patientinfo/evidence-based-medicine.htm#1]. One commonly used definition of EBM is: '… the conscientious, explicit and judicious use of current best evidence in making decisions about the care of individual patients. The practice of EBM means integrating individual clinical expertise with the best available external clinical evidence from systematic research' (Sackett et al., 2000). What is EBVM? The practice of Evidence-Based Veterinary Medicine (EBVM) is the use of the best available scientific evidence, in conjunction with clinical expertise and consideration of owner and patient factors, to make the best clinical decisions for patients.
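As a purely illustrative sketch of the kind of structured search such a session works toward (the clinical topic and search terms below are hypothetical and are not taken from the presentation), a PubMed query typically groups synonyms for each concept with OR and joins the concept groups with AND:

("Dogs"[MeSH] OR canine) AND ("Osteoarthritis"[MeSH] OR osteoarthritis) AND ("Physical Therapy Modalities"[MeSH] OR physiotherapy OR rehabilitation)

The same Boolean structure carries over to EBSCOhost Medline and UWILinC, with the PubMed field tags swapped for each platform's own subject-heading syntax.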
From Explanatory to Pragmatic Clinical Trials: a New Era for Effectiveness Research
Jerry Jarvik, M.D., M.P.H., Professor of Radiology, Neurological Surgery and Health Services; Adjunct Professor of Orthopedic Surgery & Sports Medicine and Pharmacy; Director, Comparative Effectiveness, Cost and Outcomes Research Center (CECORC). UTSW, April 2018. Acknowledgements: NIH UH2 AT007766, UH3 AT007766, and P30 AR072572; PCORI CE-12-11-4469. Disclosures: Physiosonix (ultrasound company), founder/stockholder; UpToDate, section editor; Evidence-Based Neuroimaging Diagnosis and Treatment (Springer), co-editor. The big picture: comparative effectiveness, evidence-based practice, health policy, and the learning healthcare system all require us to generate evidence. Challenge #1: clinical research is slow. Traditional RCTs are slow and expensive, and rarely produce findings that are easily put into practice; in fact, it takes an average of 17 years before research findings lead to widespread changes in care. Efficacy vs. effectiveness: efficacy asks whether an intervention can work under ideal conditions; effectiveness asks whether it works under real-world conditions. Challenge #2: clinical research is not relevant to practice. Traditional RCTs study the efficacy of treatments for carefully selected populations under ideal conditions, which is difficult to translate to the real world; when implemented into everyday clinical practice, treatments often see a 'voltage drop', a dramatic decrease from efficacy to effectiveness. 'If we want more evidence-based practice, we need more practice-based evidence.' (Green LW, American Journal of Public Health, 2006.) The talk's theme: how pragmatic clinical trials can improve practice and policy.
Using Randomized Evaluations to Improve the Efficiency of US Healthcare Delivery
Amy Finkelstein (MIT, J-PAL North America, and NBER) and Sarah Taubman (MIT, J-PAL North America), February 2015. Abstract: Randomized evaluations of interventions that may improve the efficiency of US healthcare delivery are unfortunately rare. Across top journals in medicine, health services research, and economics, less than 20 percent of studies of interventions in US healthcare delivery are randomized; by contrast, about 80 percent of studies of medical interventions are randomized. The tide may be turning toward making randomized evaluations closer to the norm in healthcare delivery, as the relevant actors have an increasing need for rigorous evidence of what interventions work and why. At the same time, the increasing availability of administrative data that enables high-quality, low-cost RCTs, and of new sources of funding that support RCTs in healthcare delivery, makes them easier to conduct. We suggest a number of design choices that can enhance the feasibility and impact of RCTs on US healthcare delivery, including greater reliance on existing administrative data; measuring a wide range of outcomes on healthcare costs, health, and broader economic measures; and designing experiments in a way that sheds light on underlying mechanisms. Finally, we discuss some of the more common concerns raised about the feasibility and desirability of healthcare RCTs and when and how these can be overcome. Acknowledgements: We are grateful to Innessa Colaiacovo, Lizi Chen, and Belinda Tang for outstanding research assistance; to Katherine Baicker, Mary Ann Bates, Kelly Bidwell, Joe Doyle, Mireille Jacobson, Larry Katz, Adam Sacarny, Jesse Shapiro, and Annetta Zhou for helpful comments; and to the Laura and John Arnold Foundation for financial support.
Interpreting Indirect Treatment Comparisons and Network Meta-Analysis for Health-Care Decision Making
Value in Health 14 (2011) 417–428; available at www.sciencedirect.com; journal homepage: www.elsevier.com/locate/jval. Scientific report: Interpreting Indirect Treatment Comparisons and Network Meta-Analysis for Health-Care Decision Making: Report of the ISPOR Task Force on Indirect Treatment Comparisons Good Research Practices: Part 1. Jeroen P. Jansen, PhD; Rachael Fleurence, PhD; Beth Devine, PharmD, MBA, PhD; Robbin Itzler, PhD; Annabel Barrett, BSc; Neil Hawkins, PhD; Karen Lee, MA; Cornelis Boersma, PhD, MSc; Lieven Annemans, PhD; Joseph C. Cappelleri, PhD, MPH. Affiliations include Mapi Values (Boston, MA); Oxford Outcomes (Bethesda, MD, and Oxford, UK); the University of Washington; Merck Research Laboratories; Eli Lilly and Company Ltd.; the Canadian Agency for Drugs and Technologies in Health (CADTH); the University of Groningen / HECTA; the University of Ghent; and Pfizer Inc. Abstract (excerpt): Evidence-based health-care decision making requires comparisons of all relevant competing interventions. In the absence of randomized, controlled trials involving a direct comparison of all treatments of interest, indirect treatment comparisons and network meta-analysis provide useful evidence for judiciously selecting the best choice(s) of treatment. Mixed treatment comparisons, a special case of network meta-analysis, […] of randomized, controlled trials allow multiple treatment comparisons of competing interventions. Next, an introduction to the synthesis of the available evidence with a focus on terminology, assumptions, validity, and statistical methods is provided, followed by advice on critically reviewing and interpreting an indirect treatment comparison or network meta-analysis to inform decision making.
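To make the underlying logic concrete, here is a minimal sketch of the anchored indirect comparison that network meta-analysis generalizes (the notation is ours, not the task force's). If one set of trials compares treatment A with a common comparator B, and another compares treatment C with B, then on an additive scale such as the log odds ratio the indirect estimate of A versus C is

\[ \hat{d}_{AC} = \hat{d}_{AB} - \hat{d}_{CB}, \qquad \operatorname{Var}\!\left(\hat{d}_{AC}\right) = \operatorname{Var}\!\left(\hat{d}_{AB}\right) + \operatorname{Var}\!\left(\hat{d}_{CB}\right). \]

This preserves the randomization within each trial but assumes the A-versus-B and C-versus-B trials are sufficiently similar in effect modifiers, which is why the report's emphasis on assumptions and validity matters for decision makers.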
Patient-Reported Outcomes (PROs) in Performance Measurement
Table of contents: Introduction; Defining Patient-Reported Outcomes; How Are PROs Used? (Measuring research study endpoints; Monitoring adverse events in clinical research; Monitoring symptoms, patient satisfaction, and health care performance; Example from the NIH Collaboratory); Measuring PROs: Instruments, Item Banks, and Devices (PRO Instruments; Item Banks; Devices).
A National Strategy to Develop Pragmatic Clinical Trials Infrastructure
Thomas W. Concannon, Ph.D.; Jeanne-Marie Guise, M.D., M.P.H.; Rowena J. Dolor, M.D., M.H.S.; Paul Meissner, M.S.P.H.; Sean Tunis, M.D., M.P.H.; Jerry A. Krishnan, M.D., Ph.D.; Wilson D. Pace, M.D.; Joel Saltz, M.D., Ph.D.; William R. Hersh, M.D.; Lloyd Michener, M.D.; and Timothy S. Carey, M.D., M.P.H. Abstract: An important challenge in comparative effectiveness research is the lack of infrastructure to support pragmatic clinical trials, which compare interventions in usual practice settings and subjects. These trials present challenges that differ from those of classical efficacy trials, which are conducted under ideal circumstances, in patients selected for their suitability, and with highly controlled protocols. In 2012, we launched a 1-year learning network to identify high-priority pragmatic clinical trials and to deploy research infrastructure through the NIH Clinical and Translational Science Awards Consortium that could be used to launch and sustain them. The network and infrastructure were initiated as a learning ground and shared resource for investigators and communities interested in developing pragmatic clinical trials. We followed a three-stage process of developing the network, prioritizing proposed trials, and implementing learning exercises that culminated in a 1-day network meeting at the end of the year. The year-long project resulted in five recommendations related to developing the network, enhancing community engagement, addressing regulatory challenges, advancing information technology, and developing research methods.
Patient Engagement in Health Research: a How-To Guide for Researchers
May 2018 (Version 8.0). Acknowledgement: Thank you to all of the individuals and organizations who provided feedback on the iterations of this guide for researchers, including the AbSPORU Patient Engagement Platform Advisory Council (PEPAC) members; the Alberta Health Services Patient Engagement Reference Group, Provincial Patient and Family Advisory Council, and Strategic Clinical Networks (Assistant Scientific Directors); the CIHR National Patient Engagement Working Group; IMAGINE Citizens; and individuals who took the Pragmatic Clinical Trials (PCT) course in September 2017 or participated in the PCT webinar in the Fall of 2017. Contents: Overview (Why should I engage patients in health research? How is patient engagement defined? What does patient engagement in health research look like? How does the "patient and researcher engagement in health research" strategy differ from other participatory research approaches? How do I know I'm ready to engage patients in health research? How do I use this Guide?); Step 1: Why (Why engage patients in health research? What knowledge or perspectives do you seek from patients at the individual level? Is research ethics approval required to engage patients in health research?); Step 2: Who (Who should I consider engaging in my research? How many patients do I need to engage? How do I decide how patients should be involved? How do I find people to engage in my research?).
Evidence-Based Medicine: Is It a Bridge Too Far?
Fernandez et al., Health Research Policy and Systems (2015) 13:66, DOI 10.1186/s12961-015-0057-0. Review, Open Access. Ana Fernandez, Joachim Sturmberg, Sue Lukersmith, Rosamond Madden, Ghazal Torkfar, Ruth Colagiuri, and Luis Salvador-Carulla. Abstract. Aims: This paper aims to describe the contextual factors that gave rise to evidence-based medicine (EBM), as well as its controversies and limitations in the current health context. Our analysis utilizes two frameworks: (1) a complex adaptive view of health that sees both health and healthcare as non-linear phenomena emerging from their different components; and (2) the unified approach to the philosophy of science that provides a new background for understanding the differences between the phases of discovery, corroboration, and implementation in science. Results: The need for standardization, the development of clinical epidemiology, concerns about the economic sustainability of health systems and increasing numbers of clinical trials, together with the increase in the computer's ability to handle large amounts of data, have paved the way for the development of the EBM movement. It was quickly adopted on the basis of authoritative knowledge rather than evidence of its own capacity to improve the efficiency and equity of health systems. The main problem with the EBM approach is the restricted and simplistic approach to scientific knowledge, which prioritizes internal validity as the major quality of the studies to be included in clinical guidelines. As a corollary, the preferred method for generating evidence is the explanatory randomized controlled trial. This method can be useful in the phase of discovery but is inadequate in the field of implementation, which needs to incorporate additional information including expert knowledge, patients' values and the context.
E-Resources in the Medical Sciences
Presenter: Niala Dwarika-Bhagat, October 2020. Objectives: to understand literature searching within a basic EBM (evidence-based medicine) framework; to expose participants to online resources, both free and library-subscribed databases, available for medical and health sciences research; to give a basic introduction to search strategy; and to provide hands-on opportunities to explore the free and subscribed databases. What is EBM and why should I care? From the BMJ: Evidence-based medicine is the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients. It is about integrating individual clinical expertise and the best external evidence; good doctors use both, and neither alone is enough. Evidence-based medicine is not "cookbook" medicine. Without clinical expertise, practice risks becoming tyrannised by evidence, for even excellent external evidence may be inapplicable to or inappropriate for an individual patient; without current best evidence, practice risks becoming rapidly out of date, to the detriment of patients. Evidence-based medicine is not restricted to randomised trials and meta-analyses; it involves tracking down the best external evidence with which to answer our clinical questions. (Source: https://www.bmj.com/content/312/7023/71.full.print) What EBM skills do you need? Information mastery: find the best evidence for everyday practice. Assessment: relevance before rigor; is the evidence patient-oriented? Evaluation: information about therapies, diagnostic tests, and clinical decision rules; understand basic statistics. Efficiency: having "just in time" information at your fingertips at the point of care, using web-based and/or handheld computer-based information and tools for clinical decision making. Evaluation: expert-based information, including colleagues, presentations, reviews and guidelines.
Pragmatic Clinical Studies (PCS) PCORI Funding Announcement (PFA), Cycle 1 2021
Cycle 1 2021 Funding Cycle. PCORI Funding Announcement: Pragmatic Clinical Studies To Evaluate Patient-Centered Outcomes. Published January 5, 2021. This PCORI Funding Announcement (PFA) applies to the funding cycle that closes May 4, 2021, at 5 pm ET (status: CLOSED). Submission instructions, templates, and other resources are available at https://www.pcori.org/funding-opportunities/announcement/pragmatic-clinical-studies-cycle-1-2021. About PCORI: The Patient-Centered Outcomes Research Institute (PCORI) was authorized by Congress in 2010 as a nonprofit, nongovernmental organization. PCORI's purpose, as defined by our authorizing legislation, is to help patients, caregivers, clinicians, policy makers, and other healthcare system stakeholders make better-informed health decisions by "advancing the quality and relevance of evidence about how to prevent, diagnose, treat, monitor, and manage diseases, disorders, and other health conditions" and by promoting the dissemination and uptake of this evidence. PCORI is committed to transparency and a rigorous stakeholder-driven process that emphasizes patient engagement, and it uses a variety of forums and public comment periods to obtain public input to enhance its work. PCORI helps people make informed healthcare decisions and improves healthcare delivery and outcomes by producing and promoting high-integrity, evidence-based information that comes from research guided by patients and other stakeholders. Patient-Centered Outcomes Research Institute, 1828 L St., NW, Suite 900, Washington, DC 20036. Phone: 202-827-7700.