Meeting the Challenges of Evidence-Based Policy: The Campbell Collaboration


By ANTHONY PETROSINO, ROBERT F. BORUCH, HALUK SOYDAN, LORNA DUGGAN, and JULIO SANCHEZ-MECA

ABSTRACT: Evidence-based policy has much to recommend it, but it also faces significant challenges. These challenges reside not only in the dilemmas faced by policy makers but also in the quality of the evaluation evidence. Some of these problems are most effectively addressed by rigorous syntheses of the literature known as systematic reviews. Other problems remain, including the range of quality in systematic reviews and their general failure to be updated in light of new evidence or disseminated beyond the research community. Based on the precedent established in health care by the international Cochrane Collaboration, the newly formed Campbell Collaboration will prepare, maintain, and make accessible systematic reviews of research on the effects of social and educational interventions. Through mechanisms such as rigorous quality control, electronic publication, and worldwide coverage of the literature, the Campbell Collaboration seeks to meet challenges posed by evidence-based policy.

Anthony Petrosino is a research fellow at the Center for Evaluation, American Academy of Arts and Sciences, and coordinator for the Campbell Crime and Justice Group. Robert F. Boruch is University Trustee Professor at the University of Pennsylvania Graduate School of Education and cochair of the Campbell Collaboration Steering Group. Haluk Soydan is the director of the Center for Evaluation at the Swedish National Board of Health and Welfare and a cochair of the Campbell Collaboration Steering Group. Lorna Duggan is a forensic psychiatrist in the United Kingdom and a reviewer for the Cochrane Collaboration. Julio Sanchez-Meca is a psychology professor at the University of Murcia in Spain and the director of the Unit for Meta-Analysis.

NOTE: We appreciate the assistance of Maureen Matkovich and colleagues at the National Criminal Justice Reference Service, who supplied the data in Figure 1. The first author's work was supported by grants from the Mellon Foundation to the Center for Evaluation and from the Home Office to Cambridge University. Thanks to Brandon C. Welsh for his comments on earlier drafts of the article.

Donald Campbell (1969) was an influential psychologist who wrote persuasively about the need for governments to take evaluation evidence into account in decisions about social programs. He also recognized, however, the limitations of the evidence-based approach and the fact that government officials would be faced with a number of political dilemmas that confined their use of research. The limits of evidence-based policy and practice, however, reside not only in the political pressures faced by decision makers when implementing laws and administrative directives or determining budgets; they also reside in problems with the research evidence.

Questions such as "What works to reduce crime in communities?" are not easily answered. The studies that bear on these questions are often scattered across different fields, written in different languages, sometimes disseminated in obscure or inaccessible outlets, and can be of questionable quality, such that interpretation is risky at best. How can evidence-based policy and practice be informed, if not persuaded, by such a fragmented knowledge base comprising evaluative studies that range in quality? Which study, or set of studies, if any at all, ought to be used to influence policy? What methods ought to be used to appraise and analyze a set of separate studies bearing on the same question? And how can the findings be disseminated in such a way that the very people Donald Campbell cared about, the decision makers in government and elsewhere, receive findings from these analyses that they trust were not the product of advocacy?

Donald Campbell unfortunately did not live long enough to bear witness to the creation of the international collaboration named in his honor that ambitiously attempts to address some of the challenges posed by evidence-based policy. The Campbell Collaboration was created to prepare, update, and disseminate systematic reviews of evidence on what works relevant to social and educational intervention (see http://campbell.gse.upenn.edu). The target audience will include decision makers at all levels of government, practitioners, citizens, media, and researchers.

This article begins with a discussion of the rationale for the Campbell Collaboration. We then describe the precedent established by the Cochrane Collaboration in health care. This is followed by an overview of the advent and early progress of the Campbell Collaboration. We conclude with the promise of the Campbell Collaboration in meeting the challenges posed by evidence-based policy.

RATIONALE

Surge of interest in evidence-based policy

There are many influences on decisions or beliefs about what ought to be done to address problems like crime, illiteracy, and unemployment. Influential factors include ideology, politics, costs, ethics, social background, clinical experience, expert opinion, and anecdote (for example, Lipton 1992). The evidence-based approach stresses moving beyond these factors to also consider the results of scientific studies. Although few writers have articulated the deterministic view that the term "evidence-based" suggests, it is clear that the vast majority of writers argue that decision makers need, at the very least, to be aware of the research evidence that bears on policies under consideration (for example, Davies, Nutley, and Smith 2000). Certainly the implicit or explicit goal of research-funding agencies has always been to influence policy through science (Weiss and Petrosino 1999), and there have always been individuals who have articulated the need for an evidence-based approach (for example, Fischer 1978). But there has been a surge of interest, particularly in the 1990s, in arguments for research-, science-, or evidence-based policy (for example, Amann 2000; Boruch, Petrosino, and Chalmers 1999; Nutley and Davies 1999; Wiles 2001).

One indirect gauge of this surge is the amount of academic writing on the topic. For example, in Sherman's (1999) argument for evidence-based policing, decisions about where to target police strategies would be based on epidemiological data about the nature and scope of problems. The kinds of interventions employed, and how long they were kept in place, would be guided by careful evaluative studies, preferably randomized field trials. Cullen and Gendreau (2000) and MacKenzie (2000) are among those who made similar arguments about correctional treatment. Davies (1999), Fitz-Gibbon (1999), MacDonald (1999), and Sheldon and Chilvers (2000), among others, articulated views about evidence-based education and social welfare.

A more persuasive indicator that evidence-based policy is having some impact is the initiatives undertaken by governments since the late 1990s. Whether due to growing pragmatism or to pressures for accountability in how public funds are spent, the evidence-based approach is beginning to take root. For example, the United Kingdom is vigorously promoting evidence-based policy in medicine and the social sectors (for example, Davies, Nutley, and Smith 2000; Wiles 2001). In 1997, its Labour government was elected using the slogan "What counts is what works" (Davies, Nutley, and Smith 2000). The 1998 U.K. Crime Reduction Programme was greatly influenced by both the University of Maryland report to Congress on crime prevention (Sherman et al. 1997) and the Home Office's own syntheses (Nuttall, Goldblatt, and Lewis 1998). In Sweden, the National Board of Health and Welfare was commissioned by the government to draft a program for advancing knowledge in the social services to ensure they are evidence based (National Board of Health and Welfare 2001).

In the United States, the Government Performance and Results Act of 1993 was implemented to hold federal agencies responsible for identifying measurable objectives and reaching them. This has led to the development of performance indicators to assess whether there is value added by agencies. The 1998 reauthorization of the Safe and Drug-Free Schools and Communities Act required that programs funded under the law be research based. The news media now commonly ask why police are not using research-based eyewitness identification techniques (Gawande 2001) or why schools use ineffective drug prevention programs (for example, Cohn 2001).

All of these signs seem to indicate more than a passing interest in evidence-based policy. As Boruch (1997) noted, different policy questions require different types of scientific evidence. To implement the most effective interventions to ameliorate problems, careful evaluations are needed. An evidence-based approach to what works therefore requires that these evaluations be gathered, appraised, and analyzed, and that the results be made accessible to influence relevant decisions whenever possible.

[...]

Coinciding with fragmentation, evaluation studies are not regularly published in academic journals or in outlets that are readily accessible. Instead, a large percentage of evaluative research resides in what Sechrest, White, and Brown (1979) called the fugitive literature. These are government reports, dissertations and master's theses, conference papers, technical documents, and other literature that is difficult to obtain. Lipsey (1992), in his review of delinquency prevention and treatment studies, found that 4 in 10 were reported in this literature. Although some may argue that unpublished studies are of lesser quality because they have not been subjected to blind peer review as journal articles are, this is an empirical question worthy of investigation. Such an assertion, at the very least, ignores the high-quality evaluations done by private research firms.