PCORI Methodology Report
Total Pages: 16
File Type: PDF, Size: 1020 KB
Recommended publications
-
Jackson: Choosing a Methodology: Philosophical Underpinning
Choosing a Methodology: Philosophical Underpinning. Elizabeth Jackson, University of Cumbria, [email protected]. Practitioner Research in Higher Education, Vol 7 (1), pages 49-62. Copyright © 2013 University of Cumbria. Abstract: As a university lecturer, I find that a frequent question raised by Masters students concerns the methodology chosen for research and the rationale required in dissertations. This paper unpicks some of the philosophical coherence that can inform choices to be made regarding methodology, and a well-thought-out rationale that can add to the rigour of a research project. It considers the conceptual framework for research, including the ontological and epistemological perspectives that are pertinent in choosing a methodology and subsequently the methods to be used. The discussion is exemplified using a concrete example of a research project in order to contextualise theory within practice. Key words: Ontology; epistemology; positionality; relationality; methodology; method. Introduction: This paper arises from work with students writing Masters dissertations who frequently express confusion and doubt about how appropriate methodology is chosen for research. It will be argued here that consideration of philosophical underpinning can be crucial both for shaping research design and for explaining approaches taken, in order to support the credibility of research outcomes. It is beneficial, within the unique context of the research, for the researcher to carefully -
On Becoming a Pragmatic Researcher: the Importance of Combining Quantitative and Qualitative Research Methodologies
DOCUMENT RESUME: ED 482 462; TM 035 389. AUTHOR: Onwuegbuzie, Anthony J.; Leech, Nancy L. TITLE: On Becoming a Pragmatic Researcher: The Importance of Combining Quantitative and Qualitative Research Methodologies. PUB DATE: 2003-11-00. NOTE: 25p.; paper presented at the Annual Meeting of the Mid-South Educational Research Association (Biloxi, MS, November 5-7, 2003). PUB TYPE: Reports - Descriptive (141); Speeches/Meeting Papers (150). EDRS PRICE: MF01/PC02 Plus Postage. DESCRIPTORS: *Pragmatics; *Qualitative Research; *Research Methodology; *Researchers. ABSTRACT: The last 100 years have witnessed a fervent debate in the United States about quantitative and qualitative research paradigms. Unfortunately, this has led to a great divide between quantitative and qualitative researchers, who often view themselves in competition with each other. Clearly, this polarization has promoted purists, i.e., researchers who restrict themselves exclusively to either quantitative or qualitative research methods. Mono-method research is the biggest threat to the advancement of the social sciences. As long as researchers stay polarized in research, they cannot expect stakeholders who rely on their research findings to take their work seriously. The purpose of this paper is to explore how the debate between quantitative and qualitative research is divisive, and thus counterproductive for advancing the social and behavioral sciences. This paper advocates that all graduate students learn to use and appreciate both quantitative and qualitative research. In so doing, students will develop into what is termed "pragmatic researchers." (Contains 41 references.) (Author/SLD) Reproductions supplied by EDRS are the best that can be made from the original document. -
From Explanatory to Pragmatic Clinical Trials: a New Era for Effectiveness Research
From Explanatory to Pragmatic Clinical Trials: A New Era for Effectiveness Research. How pragmatic clinical trials can improve practice & policy. Jerry Jarvik, M.D., M.P.H., Professor of Radiology, Neurological Surgery and Health Services; Adjunct Professor, Orthopedic Surgery & Sports Medicine and Pharmacy; Director, Comparative Effectiveness, Cost and Outcomes Research Center (CECORC). UTSW, April 2018. Acknowledgements: NIH UH2 AT007766 and UH3 AT007766; NIH P30AR072572; PCORI CE-12-11-4469. Disclosures: Physiosonix (ultrasound company): founder/stockholder; UpToDate: section editor; Evidence-Based Neuroimaging Diagnosis and Treatment (Springer): co-editor. The Big Picture: comparative effectiveness, evidence-based practice, health policy, the learning healthcare system. So we need to generate evidence. Challenge #1: Clinical research is slow. Traditional RCTs are slow and expensive, and rarely produce findings that are easily put into practice; in fact, it takes an average of 17 years before research findings lead to widespread changes in care. Efficacy vs. effectiveness: efficacy asks whether a treatment can work under ideal conditions; effectiveness asks whether it works under real-world conditions. Challenge #2: Clinical research is not relevant to practice. Traditional RCTs study the efficacy of treatments for carefully selected populations under ideal conditions, which is difficult to translate to the real world; when implemented into everyday clinical practice, they often show a "voltage drop", a dramatic decrease from efficacy to effectiveness. "If we want more evidence-based practice, we need more practice-based evidence." (Green, LW. American Journal of Public Health, 2006.) -
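The "voltage drop" idea can be made concrete with a toy calculation. The sketch below is not from the Jarvik presentation; it simply assumes, for illustration, that the real-world effect of a treatment is diluted in proportion to how many eligible patients actually receive and adhere to it as intended.

```python
# Toy illustration (not from the presentation above): a treatment that reduces
# a bad outcome by 40% under ideal trial conditions (efficacy), but that only
# 60% of eligible patients receive and adhere to in routine practice, shows a
# smaller population-level effect (effectiveness). Both inputs are assumptions.
efficacy = 0.40               # relative risk reduction under ideal conditions
uptake_and_adherence = 0.60   # share of real-world patients treated as intended

effectiveness = efficacy * uptake_and_adherence
print(f"efficacy: {efficacy:.0%}")            # 40%
print(f"effectiveness: {effectiveness:.0%}")  # 24%, the "voltage drop"
```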
Using Randomized Evaluations to Improve the Efficiency of US Healthcare Delivery
Using Randomized Evaluations to Improve the Efficiency of US Healthcare Delivery. Amy Finkelstein (MIT, J-PAL North America, and NBER) and Sarah Taubman (MIT, J-PAL North America). February 2015. Abstract: Randomized evaluations of interventions that may improve the efficiency of US healthcare delivery are unfortunately rare. Across top journals in medicine, health services research, and economics, less than 20 percent of studies of interventions in US healthcare delivery are randomized. By contrast, about 80 percent of studies of medical interventions are randomized. The tide may be turning toward making randomized evaluations closer to the norm in healthcare delivery, as the relevant actors have an increasing need for rigorous evidence of what interventions work and why. At the same time, the increasing availability of administrative data that enables high-quality, low-cost RCTs and of new sources of funding that support RCTs in healthcare delivery make them easier to conduct. We suggest a number of design choices that can enhance the feasibility and impact of RCTs on US healthcare delivery, including: greater reliance on existing administrative data; measuring a wide range of outcomes on healthcare costs, health, and broader economic measures; and designing experiments in a way that sheds light on underlying mechanisms. Finally, we discuss some of the more common concerns raised about the feasibility and desirability of healthcare RCTs and when and how these can be overcome. We are grateful to Innessa Colaiacovo, Lizi Chen, and Belinda Tang for outstanding research assistance, to Katherine Baicker, Mary Ann Bates, Kelly Bidwell, Joe Doyle, Mireille Jacobson, Larry Katz, Adam Sacarny, Jesse Shapiro, and Annetta Zhou for helpful comments, and to the Laura and John Arnold Foundation for financial support. -
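The core analysis behind a randomized evaluation like those the authors call for is easy to sketch. The example below is not from the Finkelstein and Taubman paper; it is a minimal illustration, with made-up numbers, of randomizing patients to an intervention and estimating its effect on a cost outcome as a difference in means.

```python
# Minimal sketch of a randomized evaluation analysis (illustrative only; all
# numbers are made up): randomize patients to treatment or control, then
# estimate the effect as the difference in mean outcomes, with a standard error.
import random
import statistics

def difference_in_means(treated, control):
    """Return the estimated treatment effect and its standard error."""
    effect = statistics.mean(treated) - statistics.mean(control)
    se = (statistics.variance(treated) / len(treated)
          + statistics.variance(control) / len(control)) ** 0.5
    return effect, se

random.seed(0)
n = 1000
# Hypothetical outcome: annual healthcare spending (dollars) per patient,
# assuming the intervention saves roughly $300 on average.
assignments = [random.random() < 0.5 for _ in range(n)]
outcomes = [5000 + random.gauss(0, 800) - (300 if treated else 0)
            for treated in assignments]

treated = [y for y, t in zip(outcomes, assignments) if t]
control = [y for y, t in zip(outcomes, assignments) if not t]
effect, se = difference_in_means(treated, control)
print(f"estimated effect: {effect:.0f} dollars (95% CI +/- {1.96 * se:.0f})")
```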
Interpreting Indirect Treatment Comparisons and Network Meta-Analysis
VALUE IN HEALTH 14 (2011) 417–428. Available at www.sciencedirect.com; journal homepage: www.elsevier.com/locate/jval. SCIENTIFIC REPORT. Interpreting Indirect Treatment Comparisons and Network Meta-Analysis for Health-Care Decision Making: Report of the ISPOR Task Force on Indirect Treatment Comparisons Good Research Practices: Part 1. Jeroen P. Jansen, PhD1,*; Rachael Fleurence, PhD2; Beth Devine, PharmD, MBA, PhD3; Robbin Itzler, PhD4; Annabel Barrett, BSc5; Neil Hawkins, PhD6; Karen Lee, MA7; Cornelis Boersma, PhD, MSc8; Lieven Annemans, PhD9; Joseph C. Cappelleri, PhD, MPH10. 1Mapi Values, Boston, MA, USA; 2Oxford Outcomes, Bethesda, MD, USA; 3Pharmaceutical Outcomes Research and Policy Program, School of Pharmacy, School of Medicine, University of Washington, Seattle, WA, USA; 4Merck Research Laboratories, North Wales, PA, USA; 5Eli Lilly and Company Ltd., Windlesham, Surrey, UK; 6Oxford Outcomes Ltd., Oxford, UK; 7Canadian Agency for Drugs and Technologies in Health (CADTH), Ottawa, ON, Canada; 8University of Groningen / HECTA, Groningen, The Netherlands; 9University of Ghent, Ghent, Belgium; 10Pfizer Inc., New London, CT, USA. ABSTRACT: Evidence-based health-care decision making requires comparisons of all relevant competing interventions. In the absence of randomized, controlled trials involving a direct comparison of all treatments of interest, indirect treatment comparisons and network meta-analysis provide useful evidence for judiciously selecting the best choice(s) of treatment. Mixed treatment comparisons, a special case of network meta-analysis, [...] of randomized, controlled trials allow multiple treatment comparisons of competing interventions. Next, an introduction to the synthesis of the available evidence with a focus on terminology, assumptions, validity, and statistical methods is provided, followed by advice on critically reviewing and interpreting an indirect treatment comparison or network meta-analysis to inform decision making. -
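For a feel for the arithmetic behind the simplest indirect comparison covered by reports like this one, here is a minimal sketch of an anchored (adjusted) indirect comparison: the A-versus-B effect is estimated as the difference between the A-versus-C and B-versus-C trial effects, with their variances summed. The input numbers are hypothetical, not taken from the report.

```python
# Minimal sketch of an anchored indirect treatment comparison on a log scale
# (e.g., log odds ratios). Inputs are hypothetical trial summaries.
import math

def indirect_comparison(d_ac, se_ac, d_bc, se_bc):
    """Estimate A vs B from A-vs-C and B-vs-C effects sharing comparator C."""
    d_ab = d_ac - d_bc
    se_ab = math.sqrt(se_ac ** 2 + se_bc ** 2)
    ci = (d_ab - 1.96 * se_ab, d_ab + 1.96 * se_ab)
    return d_ab, se_ab, ci

d_ab, se_ab, ci = indirect_comparison(d_ac=-0.40, se_ac=0.15, d_bc=-0.10, se_bc=0.20)
print(f"indirect log odds ratio (A vs B): {d_ab:.2f}, SE {se_ab:.2f}")
print(f"95% CI: ({ci[0]:.2f}, {ci[1]:.2f}); odds ratio {math.exp(d_ab):.2f}")
```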
An Invitation to Qualitative Research
CHAPTER 1: An Invitation to Qualitative Research. In recent years, binge drinking has caused considerable concern among administrators at colleges and universities, compelled by statistics that show marked increases in such behavior. A qualitative researcher studying this topic would seek to go behind the statistics to understand the issue. Recently, we attended a faculty meeting that addressed the problem of binge drinking and heard concerned faculty and administrators suggest some of the following solutions: • stricter campus policies with enforced consequences • more faculty-student socials with alcohol for those over 21 years old • more bus trips into the city to local sites such as major museums to get students interested in other pastimes. Although well-intentioned, these folks were grasping at straws. This is because although they were armed with statistics indicating binge drinking was prevalent and thus could identify a problem, they had no information about why this trend was occurring. Without understanding this issue on a meaningful level, it is difficult to remedy. At this point, we invite you to spend 5 to 10 minutes jotting down a list of questions you think are important to investigate as we try to better understand the phenomenon of binge drinking at college. What Is Qualitative Research? The qualitative approach to research is a unique grounding—the position from which to conduct research—that fosters particular ways of asking questions and particular ways of thinking through problems. As noted in the opening discussion of binge drinking in college, the questions asked in this type of research usually begin with words like how, why, or what. -
Realism and Pragmatism in a Mixed Methods Study
Realism and pragmatism in a mixed methods study. ALLMARK, Peter <http://orcid.org/0000-0002-3314-8947> and MACHACZEK, Katarzyna <http://orcid.org/0000-0001-5308-2407>. Available from Sheffield Hallam University Research Archive (SHURA) at: http://shura.shu.ac.uk/18205/ This document is the author deposited version; you are advised to consult the publisher's version if you wish to cite from it. Published version: ALLMARK, Peter and MACHACZEK, Katarzyna (2018). Realism and pragmatism in a mixed methods study. Journal of Advanced Nursing, 74 (6), 1301-1309. Copyright and re-use policy: see http://shura.shu.ac.uk/information.html (Sheffield Hallam University Research Archive, http://shura.shu.ac.uk). MS Title: Realism and Pragmatism in a mixed methods study. Authors: Peter ALLMARK PhD RN (corresponding) and Katarzyna MACHACZEK PhD. Job titles: Principal Research Fellow (PA) and Research Fellow (KMa). Affiliation: Centre for Health and Social Care Research, Sheffield Hallam University, 32 Collegiate Crescent, Sheffield S10 2BP. Contact details for PA: [email protected]; Phone: 0114 225 5727; Fax: 0114 225 4377; Address: as above. Acknowledgements: The authors thank the clinicians who participated in this study and the Project Hope team, who facilitated the data collection process. Thanks also to John Paley for reading and commenting on an earlier version. Conflict of interest: No conflict of interest has been declared by the authors. Funding statement: This discussion paper received no specific grant from any funding agency in the public, commercial, or not-for-profit sectors. The empirical study discussed was funded by Project HOPE and is reported elsewhere. -
Patient Reported Outcomes (PROs) in Performance Measurement
Patient-Reported Outcomes. Table of Contents: Introduction; Defining Patient-Reported Outcomes; How Are PROs Used? (Measuring research study endpoints; Monitoring adverse events in clinical research; Monitoring symptoms, patient satisfaction, and health care performance; Example from the NIH Collaboratory); Measuring PROs: Instruments, Item Banks, and Devices (PRO Instruments; Item Banks; Devices).
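As a rough illustration of how a simple fixed-length PRO instrument can be scored, the sketch below sums Likert-type item responses and rescales them to 0-100. The item count, response range, and scoring convention are assumptions chosen for illustration, not the scoring rules of any instrument named in the chapter.

```python
# Illustrative scoring of a hypothetical fixed-length PRO instrument:
# sum the item responses and rescale linearly to 0-100 (higher = more of
# the measured construct). Item count and 1-5 response range are assumed.
def pro_score(responses, min_item=1, max_item=5):
    """Rescale a sum of Likert-type item responses to a 0-100 score."""
    if not all(min_item <= r <= max_item for r in responses):
        raise ValueError("response out of range")
    raw = sum(responses)
    lowest = min_item * len(responses)
    highest = max_item * len(responses)
    return 100 * (raw - lowest) / (highest - lowest)

# Example: a hypothetical 4-item symptom scale answered on a 1-5 scale.
print(pro_score([2, 3, 4, 3]))  # -> 50.0
```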
A National Strategy to Develop Pragmatic Clinical Trials Infrastructure
A National Strategy to Develop Pragmatic Clinical Trials Infrastructure Thomas W. Concannon, Ph.D.1,2, Jeanne-Marie Guise, M.D., M.P.H.3, Rowena J. Dolor, M.D., M.H.S.4, Paul Meissner, M.S.P.H.5, Sean Tunis, M.D., M.P.H.6, Jerry A. Krishnan, M.D., Ph.D.7,8, Wilson D. Pace, M.D.9, Joel Saltz, M.D., Ph.D.10, William R. Hersh, M.D.3, Lloyd Michener, M.D.4, and Timothy S. Carey, M.D., M.P.H.11 Abstract An important challenge in comparative effectiveness research is the lack of infrastructure to support pragmatic clinical trials, which compare interventions in usual practice settings and subjects. These trials present challenges that differ from those of classical efficacy trials, which are conducted under ideal circumstances, in patients selected for their suitability, and with highly controlled protocols. In 2012, we launched a 1-year learning network to identify high-priority pragmatic clinical trials and to deploy research infrastructure through the NIH Clinical and Translational Science Awards Consortium that could be used to launch and sustain them. The network and infrastructure were initiated as a learning ground and shared resource for investigators and communities interested in developing pragmatic clinical trials. We followed a three-stage process of developing the network, prioritizing proposed trials, and implementing learning exercises that culminated in a 1-day network meeting at the end of the year. The year-long project resulted in five recommendations related to developing the network, enhancing community engagement, addressing regulatory challenges, advancing information technology, and developing research methods. -
Internet Memes and Desensitization by Barbara Sanchez, University of Florida HSI Pathways to the Professoriate, Cohort 2
VOLUME 1, ISSUE 2 Published January 2020 Internet Memes and Desensitization By Barbara Sanchez, University of Florida HSI Pathways to the Professoriate, Cohort 2 Abstract: Internet memes (IMs) have been used as a visual form of online rhetoric since the early 2000s. With hundreds of thousands now in circulation, IMs have become a prominent method of communication across the Internet. In this essay, I analyze the characteristics that have made IMs a mainstay in online communication. Understanding the definitions and structures of IMs aids in explaining their online success, especially on social platforms like Instagram, Facebook, and Twitter. I use these understandings as a basis from which to theorize how both the creative process in making IMs and the prominence of IMs that utilize images of originally violent or sensitive contexts may relate to existing research correlating violent media and desensitization. The use of these images often involves a disconnection from their original contexts in order to create a new and distinct—in many cases irrelevant—message and meaning. These IMs, in turn, exemplify the belittlement of distress depicted in such images—often for the sake of humor. This essay's main goal is to propose a new theoretical lens from which to analyze the social and cultural influences on IMs. The greater purpose of this essay is not to devalue Internet memes nor to define them in narrow terms nor to claim there is any causal relationship between Internet memes and societal desensitization. Rather, I am proposing a new lens through which to further study Internet memes' social effects on online users' conceptions of violence and/or sensitive topics. -
WRD Methodology
Methodology of the World Religion Database Adapted from Todd M. Johnson and Brian J. Grim, The World’s Religions in Figures: An Introduction to International Religious Demography (Oxford: Wiley-Blackwell, 2013) The World Religion Database (WRD) is a dynamic online publication based on the research of the International Religious Demography Project, under the direction of Dr. Todd M. Johnson and Dr. Brian J. Grim, at Boston University’s Institute on Culture, Religion and World Affairs (CURA). The WRD collects, collates, and offers analyses of primary and secondary source material on demographic variables for all major religions in every country of the world. The database is unique in that it makes estimates from these sources readily available and fully transparent to the scholarly community. The WRD provides adherence data at the provincial level where available from censuses and surveys. It also aims to account for subgroups of each major religion to the maximum extent possible based on best social science and demographic practices. The right to profess one’s choice This methodology takes as its starting-point the United Nations 1948 Universal Declaration of Human Rights, Article 18: “Everyone has the right to freedom of thought, conscience and religion; this right includes freedom to change his religion or belief, and freedom, either alone or in community with others and in public or private, to manifest his religion or belief in teaching, practice, worship and observance.” Since its promulgation, this group of phrases has been incorporated into the state constitutions of a large number of countries across the world. This fundamental right also includes the right to claim the religion of one’s choice, and the right to be called a follower of that religion and to be enumerated as such. -
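A minimal sketch of the kind of arithmetic that sits behind adherence estimates built from censuses and surveys: multiply a source's reported adherence share by the corresponding population and sum subgroup estimates into a total. The shares, population figure, and subgroup names below are hypothetical, not WRD data.

```python
# Hypothetical illustration (not WRD data): convert census/survey adherence
# shares into adherent counts and aggregate subgroups into a national total.
population = 10_000_000  # assumed national population

subgroup_shares = {      # assumed shares of the population, by subgroup
    "tradition_a": 0.22,
    "tradition_b": 0.08,
    "tradition_c": 0.03,
}

adherents = {name: round(share * population) for name, share in subgroup_shares.items()}
for name, count in adherents.items():
    print(f"{name}: {count:,}")
print(f"total: {sum(adherents.values()):,}")
```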
VCS Methodology Approval Process
Methodology Approval Process 19 September 2019 v4.0 ABOUT VERRA Verra supports climate action and sustainable development through the development and management of standards, tools and programs that credibly, transparently and robustly assess environmental and social impacts, and drive funding for sustaining and scaling up these benefits. As a mission-driven, non-profit (NGO) organization, Verra works in any arena where we see a need for clear standards, a role for market-driven mechanisms and an opportunity to achieve environmental and social good. Verra manages a number of global standards frameworks designed to drive finance towards activities that mitigate climate change and promote sustainable development, including the Verified Carbon Standard (VCS) Program and its Jurisdictional and Nested REDD+ framework (JNR), the Verra California Offset Project Registry (OPR), the Climate, Community & Biodiversity (CCB) Standards and the Sustainable Development Verified Impact Standard (SD VISta). Verra is also developing new standards frameworks, including LandScale, which will promote and measure sustainability outcomes across landscapes. Finally, Verra is one of the implementing partners of the Initiative for Climate Action Transparency (ICAT), which helps countries assess the impacts of their climate actions and supports greater transparency, effectiveness, trust and ambition in climate policies worldwide. Intellectual Property Rights, Copyright and Disclaimer This document contains materials, the copyright and other intellectual property rights in which are vested in Verra or which appear with the consent of the copyright owner. These materials are made available for you to review and to copy for the use of your establishment or operation of a project or program under the VCS Program (the "Authorized Use").