Paper ID #33111

Workshop Result: Teaching Structured Reviews to Environmental Engineering Researchers

Dr. Daniel B. Oerther, Missouri University of Science and Technology

Professor Daniel B. Oerther, PhD, PE joined the faculty of the Missouri University of Science and Technology in 2010 as the John A. and Susan Mathes Chair of Civil Engineering after serving ten years on the faculty of the University of Cincinnati, where he was Head of the Department of Civil and Environmental Engineering. Oerther earned his Ph.D. (2002) from the University of Illinois, Urbana-Champaign. Dan’s professional registrations include: PE, BCEE, BCES, CEng, CEnv, CEHS, and DAAS. Oerther’s scholarship, teaching, service, and professional practice focus in the fields of environmental biotechnology and sustainable development where he specializes in promoting Water, Sanitation, and Hygiene (WaSH), food and nutrition security, energy efficiency, and poverty alleviation. Oerther’s awards for teaching include the best paper award from the Environmental Engineering Division of ASEE and the society-wide Robert G. Quinn Award from ASEE, the Engineering Education Excellence Award from the NSPE, the Excellence in Environmental Engineering and Science Educator award from AAEES, and the Fair Distinguished Engineering Educator Medal from WEF. Due to his collaborations with nurses and healthcare professionals, Professor Oerther has been inducted as a Lifetime Honorary Member of Sigma Theta Tau International, the International Honor Society of Nursing (STTI), and he has been inducted as a Lifetime Honorary Fellow of the American Academy of Nursing (FAAN) and the Academy of Nursing Education Fellows (ANEF). Oerther has also been elevated as a Fellow of the Society of Environmental Engineers (FSEE), the Royal Society of Arts (FRSA), the Royal Society for Public Health (FRSPH), the Chartered Institute of Environmental Health (FCIEH), the Society of Operations Engineers (FSOE), and the Association of Environmental Engineering and Science Professors (FAEESP).

© American Society for Engineering Education, 2021

Workshop Result: Teaching Structured Reviews to Environmental Engineering Researchers

Daniel B. Oerther, Missouri University of Science and Technology, 1401 North Pine Street, Rolla, MO 65409

Abstract

As part of the 2019 biennial conference of the Association of Environmental Engineering and Science Professors, a pre-conference workshop on the topic of structured reviews was delivered to 22 participants. The workshop had three objectives, namely: 1) raising awareness about the process of structured reviews; 2) demonstrating the process of structured reviews; and 3) encouraging participants to adopt a structured review format for future research. To achieve these objectives, three activities were undertaken: 1) a pre-conference web site where participants completed mandatory exercises before the on-site portion of the workshop; 2) group-work to learn the structured review process during the on-site portion of the workshop; and 3) voluntary follow-up, including mentoring/coaching by workshop conveners, to support the use of structured reviews in future research after the completion of the on-site portion of the workshop. Assessment of the knowledge, skills, and attitudes of workshop participants was undertaken using an on-site, anonymous, voluntary readiness assessment test (RAT) as well as an on-site, anonymous, voluntary comprehension assessment test (CAT), administered immediately preceding and following the on-site portion of the workshop, respectively. To support the long-term adoption of structured reviews in research, the workshop organizers provided mentoring/coaching to workshop participants in the months following the completion of the on-site portion of the workshop. The value of this long-term mentoring/coaching was assessed by an additional CAT administered one year after the completion of the on-site portion of the workshop. The purpose of this paper is to share: 1) workshop content and format that could be used by other conveners of similar workshops; 2) results of the analysis of the RAT and CATs; and 3) the author’s experience with mentoring/coaching workshop participants on the use of structured reviews.

Introduction

Structured reviews – a formalized process to synthesize evidence-based practice and policy by extracting and analyzing research data from the scientific literature – are widely used in the fields of healthcare, social sciences, education, and economics, where humans are part of the system of study observed and manipulated by basic and applied researchers [1]. Structured reviews often are part of “translational medicine”, where discoveries at the “lab bench” are brought to the “bed side”. Structured reviews typically are not employed by engineers, although the value to engineering researchers is self-evident in collaborations with disciplines where structured reviews are regularly employed (e.g., [2], [3], and [4]), and prior efforts have been invested to educate engineers about systematic reviews in workshops (e.g., [5] and [6]).

Disseminating the value of and demonstrating the basic approach to performing a structured review to a diverse audience of engineering graduate students, faculty, and administrators was the objective of a pre-conference workshop, which was part of the 2019 biennial conference of the Association of Environmental Engineering and Science Professors (AEESP) in Tempe, Arizona.

The purpose of this paper is to share: 1) workshop content and format that could be used by other conveners of similar workshops; 2) results of an on-site, anonymous, voluntary readiness assessment test (RAT) and an on-site, anonymous, voluntary comprehension assessment test (CAT), administered immediately preceding and following the workshop, respectively, as well as an additional CAT administered after a period of post-workshop mentoring/coaching; and 3) the author’s experience with mentoring/coaching workshop participants on the use of structured reviews.

Methods

Workshop content: background on structured reviews. Structured reviews often are attributed to the efforts of British physician, Archie Cochrane. In 1971, Cochrane authored “Effectiveness and Efficiency” [7] where he argued passionately for medical practice to be based on evidence. Cochrane identified the results of randomized controlled trials (RCTs), or experiments where test subjects are assigned either to an intervention or to a control condition using a method independent of human influence (i.e., random), as the “gold-standard” of scientific evidence. But not every RCT is of equal quality (i.e., the size of the sample population and the duration of the experiment vary among different trials), and replicate RCTs often produce different results. Observing this fact, Dr. Cochrane concluded, “it is surely a great criticism of our profession [medicine] that we have not organized a critical summary, by specialty or subspecialty, adapted periodically, of all relevant randomized control trials,” [7]. This concept – a critical summary of published research, organized and updated regularly – is what is meant by the term, “structured review”. Perhaps the most widely known example of an international collaborative effort to produce structured reviews is Cochrane (also known as the Cochrane Library), a British charity founded in 1993 with the motto, “Trusted evidence. Informed decisions. Better health.” Cochrane is named after Dr. Archie Cochrane.

Depending upon the approach employed in creating the summary and the types of published research studies analyzed, the various styles of structured reviews include scoping reviews, systematic reviews, and meta-analyses, among others [8]. Important aspects of performing a structured review include: 1) identifying a research question; 2) defining inclusion and exclusion criteria; 3) collecting all possible studies according to the inclusion criteria; 4) eliminating studies using the exclusion criteria; 5) extracting and thematically comparing relevant data; and 6) synthesizing the answer to the research question using the weight of evidence of the studies.
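For readers who want a concrete picture of how steps 3 through 5 fit together, the following minimal Python sketch (illustrative only, and not part of the workshop materials) screens a hypothetical set of bibliographic records against simple inclusion and exclusion rules and keeps the counts that a PRISMA-style flow diagram would report; the field names and rules are assumptions chosen for illustration.

```python
# Minimal, hypothetical sketch of the screening steps of a structured review:
# collect records, apply inclusion criteria, apply exclusion criteria, extract data.
# Field names and rules are illustrative, not the workshop's actual protocol.

records = [
    {"title": "RCT of intervention A", "year": 2018, "study_type": "RCT", "duplicate": False},
    {"title": "Case report on intervention A", "year": 2005, "study_type": "case report", "duplicate": False},
    {"title": "RCT of intervention A", "year": 2018, "study_type": "RCT", "duplicate": True},
]

def meets_inclusion(record):
    """Example inclusion criteria: eligible study types published since 2010."""
    return record["study_type"] in {"RCT", "cohort"} and record["year"] >= 2010

def meets_exclusion(record):
    """Example exclusion criteria: duplicate records (other rules could be added)."""
    return record["duplicate"]

identified = records                                                      # collect all candidate studies
included = [r for r in identified if meets_inclusion(r)]                  # step 3: keep studies meeting inclusion criteria
retained = [r for r in included if not meets_exclusion(r)]                # step 4: eliminate studies meeting exclusion criteria
extracted = [{"title": r["title"], "year": r["year"]} for r in retained]  # step 5: extract relevant data

# Counts like these populate the boxes of a PRISMA-style flow diagram.
print(f"identified={len(identified)}, included={len(included)}, retained={len(retained)}")
```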

The three learning objectives of the pre-conference workshop included: 1. raising awareness about the process of structured reviews using a pre-conference website; 2. demonstrating the process of structured reviews using group-work during the on-site workshop; and 3. encouraging participants to adopt a structured review format for future research through voluntary mentoring/coaching in the months following the completion of the on-site workshop.

Workshop format and assessments. Founded in 1963, today’s Association of Environmental Engineering and Science Professors (AEESP) includes nearly 1,000 members who are faculty, postdocs, and students of environmental engineering and science. The mission of AEESP is to “assist its members in the development and dissemination of knowledge in environmental engineering and science.” One way this occurs is through a biennial conference. In May 2019, the membership of AEESP gathered in Tempe, Arizona. Conference organizers included Arizona State University, the University of Arizona, and Northern Arizona University. The conference theme was, “Environmental engineers and scientists see cities in 4-D,” which includes the built environment, the natural environment, cyberspace, and human health. As part of the three-day, on-site event, conference participants were invited to apply to attend one or more of 16 different pre-conference workshops.

The complete application submitted by the author for consideration by the conference organizers is provided in Appendix 1. The workshop description provided by the author for consideration by the conference participants included,

Structured Reviews are Necessary to Translate Research into Practice and Policy, Contact: Daniel Oerther, Missouri University of Science and Technology

This face-to-face preconference workshop at AEESP2019 is envisioned as a critical second-phase of a three-phase process. In phase-one – completed online in the winter of 2019 using a “flipped classroom” approach with materials already prepared by the workshop organizers – participants selected for the workshop will identify a guiding question for their review based in the Grand Challenges and Opportunities in Environmental Engineering for the 21st Century (NAP, 2018), and participants will conduct a search of the published literature following the PRISMA approach to create a collection of research articles for further analysis. In phase-two – completed face-to-face at AEESP2019 using “active learning” guided by the workshop organizers – participants will share with their colleagues the experiences, impressions, and results from phase-one, and participants will use available tools, demonstrated by the workshop organizers, to evaluate the quality of published literature. In phase-three – completed in the summer of 2019 using “peer-to-peer mentoring” facilitated by the workshop organizers – teams of participants will complete the evaluation of literature collections with expert input from the workshop organizers, and these evaluations of literature collections will be used to co-author structured reviews answering the original guiding questions selected to advance our understanding of the Grand Challenges and Opportunities in Environmental Engineering in the 21st Century. These co-authored structured review articles will be submitted as a collection for consideration for publication in Environmental Engineering Science or via other mechanisms within AEESP.

A total of 22 individuals registered for this workshop as participants, and a total of 5 individuals, including the author, served as workshop organizers.

A blended format was employed, as participants were provided with educational artifacts to review before meeting in person (e.g., brief explanatory videos available from Cochrane) (see Appendix 2).

Before the workshop, participants were invited to complete a voluntary readiness assessment test (RAT) (see Appendix 3).

Active learning was employed, including brief expert presentations of assignments subsequently completed in small groups with summary presentations to all participants (see Appendix 4).

After the workshop, participants were invited to complete two separate voluntary comprehension assessment tests (CAT); the first was offered on the same day as the workshop (CAT0), and the second was offered after one year (CAT1) (see Appendix 5).

After the workshop, participants were invited to participate in mentoring/coaching in structured reviews offered by the workshop organizers.

Human subjects: IRB exemption was provided by the University for this educational activity.

Results

Demographic information as well as the “attitude” results of the initial RAT and the initial CAT, performed immediately preceding and following the on-site workshop, are summarized in Table 1. Of the 22 conference participants, a total of 14 completed the voluntary RAT and CAT. The majority of respondents were students, with a typical age of approximately 30 years. Gender was at near parity among respondents. To evaluate the “attitude” of participants, three different pairs of terms were provided, and participants were asked to circle the term that most accurately described their own feeling. Six respondents did not answer the attitude items on the RAT, and one respondent did not answer them on the CAT. The number of responses indicating “excited” increased from 7 to 10. The number of responses indicating “prepared” increased from 0 to 4. The number of responses indicating “optimistic” increased from 8 to 13.

Table 1. Summary of demographic and “attitude” results of the RAT and the CAT performed immediately preceding and following the on-site workshop.

Age            Gender            Rank
<25        2   Female        7   Student       10
<30        5   Male          6   Postdoc        0
<40        4   Non-binary    1   Assist Prof    2
<50        1   No answer     0   Assoc Prof     2
No answer  1                     Prof           0

RAT   First pair        Second pair        Third pair
      Nervous      1    Cautious      8    Foolish       0
      Excited      7    Prepared      0    Optimistic    8
      No answer    6    No answer     6    No answer     6

CAT   First pair        Second pair        Third pair
      Nervous      3    Cautious      9    Foolish       0
      Excited     10    Prepared      4    Optimistic   13
      No answer    1    No answer     1    No answer     1

The results of “knowledge” and “skills” assessed using the RAT and CAT performed immediately preceding and following the on-site workshop are summarized in Table 2. A total of 14 of the 22 workshop participants responded. The number of correct responses defining a structured review increased from 8 to 13, and the number of correct responses defining an ad hoc review increased from 5 to 9. Tools for performing structured reviews, including PRISMA, PICOS, and GRADE, were introduced during the on-site workshop. No participants cited these tools in their RAT, while a number of participants cited these tools in their CAT. There was a small increase in the number of correct responses describing how to synthesize evidence (from 6 in the RAT to 10 in the CAT).

As part of a follow-up to the on-site workshop, participants were invited to continue to receive mentoring/coaching from the workshop organizers. While the original intention had been for a majority of workshop participants to actively engage in follow-up activities, the reality fell short of this expectation. No workshop participants expressed a sincere interest in continuing a structured review on the topics of the “Environmental engineering for the 21st century” report [9]. Four of the workshop participants did express an interest in continued coaching/mentoring from the workshop organizer, and these four participants continued to meet as a small group at quarterly intervals during the following year. Table 3 provides examples of the review articles co-authored by workshop participants leveraging the information covered in the workshop.

Table 2. Summary of the “knowledge” and “skills” results of the RAT and CAT performed immediately preceding and following the on-site workshop. A score of 0 indicates that no answer was provided. A score of 1 indicates that an answer, acceptable to an expert as correct, was provided. A score of 2 indicates that the preferred answer included in the on-site workshop presentation was provided. The preferred steps in performing a structured review were described using the PRISMA example. The inclusion/exclusion criteria were described using PICOS. The weight of evidence was described using GRADE.

                                              RAT             CAT
                                          0    1    2     0    1    2
Define structured review                  6    8    -     1   13    -
Define ad hoc review                      9    5    -     5    9    -
Steps in performing structured review (a) 6    8    0     1    4    9
Setting inclusion/exclusion criteria (b)  9    5    0     4    7    3
Weight of evidence (c)                    9    5    0     3    5    6
Synthesize evidence                       8    6    -     4   10    -

a. PRISMA   b. PICOS   c. GRADE
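For conveners who wish to reproduce this kind of summary, the short sketch below shows one way to tally per-respondent rubric scores of 0, 1, or 2 into counts of the form reported in Table 2; the scores in the sketch are hypothetical placeholders, not the actual workshop data.

```python
from collections import Counter

# Hypothetical per-respondent rubric scores (0 = no answer, 1 = acceptable answer,
# 2 = preferred answer). Keys are assessment items; values are one score per respondent.
rat_scores = {
    "Define structured review": [0, 1, 1, 0, 1, 1],
    "Steps in performing structured review": [0, 0, 1, 1, 0, 1],
}

def tally(scores_by_item):
    """Count how many respondents earned each score level (0, 1, 2) for each item."""
    return {item: Counter(scores) for item, scores in scores_by_item.items()}

for item, counts in tally(rat_scores).items():
    row = {level: counts.get(level, 0) for level in (0, 1, 2)}
    print(f"{item}: {row}")
```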

As part of a summary exercise at the conclusion of the mentoring/coaching period, the remaining workshop participants were asked to provide their responses to three free-form questions, including: 1) what was the best aspect of the workshop; 2) what was the worst aspect of the workshop; and 3) what would you suggest to improve future workshops on this subject?

Table 3. Examples of reviews co-authored by workshop participants.

Author         Year   Title                                                                Journal
Bullock et al  2019   In situ burning with chemical herders for Arctic oil spill           Science of the Total
                      response: Meta-analysis and review                                   Environment
Boehm et al    2019   Systematic review and meta-analysis of decay rates of waterborne     Water Research
                      mammalian viruses and coliphages in surface waters
Tong et al     2019   Adsorption of organic micropollutants onto biochar: a review of      Environmental Science:
                      relevant kinetics, mechanisms, and equilibrium                       Water Research and
                                                                                           Technology

The most consistent advice from respondents included: 1) working in teams was the best aspect of the workshop; 2) having limited time to work together was the worst aspect of the workshop; and 3) the best way to improve the approach would be to develop an “environmental engineering and science specific” set of instructions on “how to” develop a structured review, and to provide published examples using the approach to serve as templates for how environmental engineers and scientists could integrate the structured review approach into their research.

Discussion

A comparison of the results from the RAT and CAT showed a marked improvement in the knowledge, skills, and attitudes of workshop participants. The benefits of active learning were highlighted by participants, and “constructing a research question in our small group setting” was identified as the most rewarding aspect of the on-site workshop. Although workshop participants did not opt to co-author any structured reviews on the subject of the workshop, there was evidence that the substance of the workshop influenced co-authored publications of the workshop participants.

Based upon the personal experience of the author, and informed by the responses of the workshop participants on the CAT as well as during the additional mentoring/coaching sessions, recommendations for teaching future workshops on structured reviews, or for developing a module on structured reviews as part of a graduate course, include: 1. complete formal training, partner with an expert, or both, and author a structured review BEFORE attempting to teach the structured review process; and 2. identify exemplars of structured reviews that are closely related to the field of study to offer students examples of the application of best practice.

To begin the process of identifying potential exemplar structured reviews for the field of environmental engineering and science, the following representative examples are available online from the Cochrane Library (see: www.CochraneLibrary.com):
• Clasen et al (2010), “Interventions to improve disposal of human excreta for preventing diarrhoea”;
• Dangour et al (2013), “Interventions to improve water quality and supply, sanitation and hygiene practices, and their effects on the nutritional status of children”;
• Clasen et al (2015), “Interventions to improve water quality for preventing diarrhoea”;
• Ejemot-Nwadiaro et al (2015), “Hand washing promotion for preventing diarrhoea”;
• Majorin et al (2019), “Interventions to improve disposal of child faeces for preventing diarrhoea and soil-transmitted helminth infection”; and
• Paludan-Müller et al (2020), “Hand cleaning with ash for reducing the spread of viral and bacterial infections: a rapid review”.

Conclusion

In conclusion, bringing a technique from the healthcare field to improve the process of performing literature reviews in the field of environmental engineering and science appears feasible and worthwhile. The major hurdle to the adoption of structured reviews in the field of environmental engineering and science appears to be the lack of both a clear set of instructions relevant to the audience and high-quality published reviews following these instructions that may serve as templates for future efforts.

Acknowledgements

The author wishes to thank the additional workshop organizers, including: Professor Heather Ross, Professor Pascal Saikaly, Dr. Muhammad Ali, and Tobias Heselton.

References

1. Henry and L. Stieglitz, “An Examination of Systematic Reviews in the Engineering Literature,” in Proceedings ASEE Annual Conference & Exposition, Virtual Online, 2020. [Online]. Available: https://doi.org/10.18260/1-2--34121. [Accessed May 20, 2021].
2. S. Oerther and D. B. Oerther, “Pierre Bourdieu’s Theory of Practice offers nurses a framework to uncover embodied knowledge of patients living with disabilities or illnesses: a discussion paper,” J. Adv. Nurs., vol. 74, no. 4, pp. 818-826, 2017. [Online]. Available: https://doi.org/10.1111/jan.13486. [Accessed May 20, 2021].
3. S. Oerther, H. Lach, and D. B. Oerther, “Immigrant Women’s Experiences as Mothers in the United States: A Scoping Review,” Am. J. Maternal Child Nurs., vol. 45, no. 1, pp. 6-16, 2020. [Online]. Available: http://doi.org/10.1097/NMC.0000000000000582. [Accessed May 20, 2021].
4. S. Oerther and D. B. Oerther, “Review of Recent Research About Parenting Generation Z Pre-Teen Children,” Western J. Nurs. Res., vol. 43, 2021. [Online]. Available: http://doi.org/10.1177/0193945920988782. [Accessed May 20, 2021].
5. J. E. Froyd, M. Borrego, M. J. Foster, H. S. Choe, J. P. Martin, and X. Chen, “Special session: Introduction to systematic reviews in engineering education research,” in Proceedings IEEE Frontiers in Education Conference, 2015. [Online]. Available: https://doi.org/10.1109/FIE.2015.7344090. [Accessed May 20, 2021].
6. M. Borrego, M. J. Foster, and J. E. Froyd, “What is the state of the art of systematic review in engineering education?,” J. Eng. Educ., vol. 104, no. 2, pp. 212-242, 2015. [Online]. Available: https://doi.org/10.1002/jee.20069. [Accessed May 20, 2021].
7. A. L. Cochrane, Effectiveness and Efficiency. London: Hodder Education Publishers, 1971.
8. M. Grant and A. Booth, “A typology of reviews: an analysis of 14 review types and associated methodologies,” Health Info. Libr. J., vol. 26, no. 2, pp. 91-108, 2009. [Online]. Available: https://doi.org/10.1111/j.1471-1842.2009.00848.x. [Accessed May 20, 2021].
9. National Academies of Sciences, Engineering, and Medicine, Environmental Engineering for the 21st Century: Addressing Grand Challenges. Washington, DC: The National Academies Press, 2019.

Appendix 1. Application for the workshop submitted to the conference organizers.

Descriptive Title: Structured Reviews are Necessary to Translate Research into Practice and Policy

Time: Half-day workshop

Motivation (100 word limit): Successful convergence research requires study of a compelling problem and deep integration across disciplines (NSF, 2018). The purpose of this workshop is to share the structured review approach from the discipline of healthcare with our colleagues in the discipline of environmental engineering and science. Through structured reviews, researchers evaluate the quality of published literature in a reproducible format and synthesize valuable “knowledge” starting from overwhelming amounts of available “information”. Using structured reviews, environmental engineering and science will be further empowered to translate convergence research into best practices and compelling policies to protect and promote the environment and human health.

Description of workshop (250 word limit): This face-to-face preconference workshop at AEESP2019 is envisioned as a critical second-phase of a three-phase process. In phase-one – completed online in the winter of 2019 using a “flipped-classroom” approach with materials already prepared by the workshop organizers – participants selected for the workshop will identify a guiding question for their review based in the Grand Challenges and Opportunities in Environmental Engineering for the 21st Century (NAP, 2018), and participants will conduct a search of the published literature following the PRISMA approach to create a collection of research articles for further analysis. In phase-two – completed face-to-face at AEESP2019 using “active learning” guided by the workshop organizers – participants will share with their colleagues the experiences, impressions, and results from phase-one, and participants will use available tools, demonstrated by the workshop organizers, to evaluate the quality of published literature. In phase-three – completed in the summer of 2019 using “peer-to-peer mentoring” facilitated by the workshop organizers – teams of participants will complete the evaluation of literature collections with expert input from the workshop organizers, and these evaluations of literature collections will be used to co-author structured reviews answering the original guiding questions selected to advance our understanding of the Grand Challenges and Opportunities in Environmental Engineering in the 21st Century. These co-authored structured review articles will be submitted as a collection for consideration for publication in Environmental Engineering Science or via other mechanisms within AEESP.

The workshop organizers have broad and deep experience in successful execution of workshops, interprofessional education, and the conduct of structured reviews. A major value of this workshop is translating a demonstrated methodology – the structured review approach – from the discipline of healthcare to the discipline of environmental engineering and science. Healthcare has used the structured review approach for the past three decades to improve best practices and successful healthcare policy recommendations.

The workshop will “train-up” a cohort of environmental engineers and scientists with an appreciation for the value of structured reviews and the skills necessary to execute the structured review approach. It is expected that this trained cohort will serve as the source of a “sea change” within the discipline of environmental engineering and science, and represent a critical link among the disciplines of engineering and healthcare – to the long-term benefit of protecting and promoting human health and the environment.

Intended audience and size: Ideally, teams of three to five participants will work together on a single structured review topic. The Grand Challenges report identifies five separate opportunities. Therefore, an ideal structure for the workshop would include 25 participants, which would be organized into five teams – one for each Challenge – with five members in each team. Each team should include at least one junior researcher (i.e., doctoral student or postdoctoral trainee) and at least one senior researcher (i.e., tenured faculty). Therefore, the ideal composition for the workshop would be: 25 total participants including at least 5 doctoral students or postdoctoral trainees and at least 5 tenured faculty.

The workshop format could accommodate additional or fewer participants depending upon the size of the venue or needs of the conference organizers.

The workshop application should ask potential participants to indicate: 1) the topic of interest for the structured review; 2) the career status of the participant; 3) a commitment to co-author a structured review as part of the workshop; and 4) prior experience with authoring reviews.

Workshop organizers:
1. Daniel B. Oerther, PhD, PE, BCEE, Fellow of the American Academy of Nursing, Professor of environmental health engineering, Missouri University of Science and Technology, [email protected]
2. Heather Ross, PhD, DNP, RN, Fellow of the American Academy of Nurse Practitioners, Professor of nursing and health innovation, Arizona State University, [email protected]
3. Pascal Saikaly, PhD, Professor, King Abdullah University of Science and Technology, [email protected]
4. Muhammad Ali, PhD, Postdoctoral fellow, King Abdullah University of Science and Technology, [email protected]
5. Tobias Heselton, Graduate student, Missouri University of Science and Technology, [email protected]

Appendix 2. Pre-conference workshop materials included as part of a web site.

1. Review the following online resources, including:
   a. Finding What Works in Health Care: Standards for Systematic Reviews, by National Academies Press, 2011
      i. available at: https://www.nap.edu/catalog/13059/
   b. Meta-analyses were supposed to end scientific debates. Often, they only cause more controversy, by J. de Vrieze, Science Magazine, 2018
      i. available at: https://www.sciencemag.org/news/2018/09/meta-analyses-were-supposed-end-scientific-debates-often-they-only-cause-more
   c. What are “systematic reviews” by Cochrane
      i. available at: https://www.youtube.com/watch?v=egJlW4vkb1Y
2. Post at least one blog entry that addresses the following:
   a. What is your name, institution, and current research, teaching, and service?
   b. Why are you participating in this workshop?
   c. What knowledge, skills, and attitudes do you hope to gain during the workshop?
   d. How will you put your learning into practice after the workshop is completed?
3. Post at least two blog commentaries offering encouragement and thoughtful criticism to two additional workshop participants

Additional assignment to be completed individually and shared with team via email:
1. Skim the relevant chapter (i.e., food-water-energy; climate change; no pollution; cities; informed decisions) of the report, “Environmental engineering for the 21st century”.
   a. available at: https://www.nap.edu/catalog/25121
2. Write out at least THREE specific scenarios/problems on the relevant chapter that “we” (as the field of environmental engineering and science) need to solve (i.e., provide a concrete and specific description of a problem that answers questions such as who, what, where, when, why, and how. To clarify, NO, it doesn’t need to be a “medical” example as found in the Cochrane library).
3. Select a series of “key words” and conduct a search of SCOPUS or a similar curated search engine to identify a handful (i.e., 5 minimum to 20 maximum) of research articles that address each of your THREE specific scenarios/problems (a minimal illustrative sketch follows this list).
4. BEFORE the workshop, email to your team the THREE specific scenarios/problems PLUS the SCOPUS search results (i.e., titles of the 5 minimum to 20 maximum research articles).
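The sketch below is offered as a minimal illustration of step 3 of this assignment, assuming search results have already been exported to a local CSV file; the file name, column name, and keywords are hypothetical assumptions and do not correspond to any specific SCOPUS export format.

```python
import csv

# Hypothetical keyword filter for step 3: narrow an exported list of search results
# (e.g., a CSV saved from a curated search engine) to the requested 5-20 articles.
# The file name, column name, and keywords below are illustrative assumptions.
KEYWORDS = {"nitrogen", "stormwater", "urban"}

def matching_titles(path, keywords, limit=20):
    """Return up to `limit` titles containing any of the (lowercase) keywords."""
    hits = []
    with open(path, newline="", encoding="utf-8") as handle:
        for row in csv.DictReader(handle):
            title = row.get("Title", "")
            if any(word in title.lower() for word in keywords):
                hits.append(title)
            if len(hits) >= limit:
                break
    return hits

if __name__ == "__main__":
    titles = matching_titles("search_export.csv", KEYWORDS)
    print(f"{len(titles)} candidate articles selected")  # aim for 5-20 titles before emailing the team
```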

Appendix 3. The Readiness Assessment Test (RAT) administered immediately preceding the on-site workshop.

Voluntary and optional – no identifying information will be released, and all results will be reported in aggregate.

OPTIONAL Name:

Demographic information: Age: Gender: Rank: student, postdoc, Assist, Associate, Prof

PRE-test of Knowledge, Skills, and Attitudes:

Define Structured review:

Define Ad hoc review:

List steps in performing a Structured review:

How to select Inclusion and Exclusion criteria:

How to evaluate Weight of Evidence:

How to synthesize Evidence:

With regard to Structured Reviews, from each pair of terms, circle the feeling that is Most Accurate

First pair: Nervous Excited

Second pair: Cautious Prepared

Third pair: Optimistic Foolish

Appendix 4. On-site workshop materials used by the moderators to facilitate group-work by participants.

Appendix 5. The Comprehension Assessment Test (CAT) administered immediately following the on-site workshop, and administered a second time approximately one-year after the on-site workshop was completed.

Voluntary and optional – no identifying information will be released, and all results will be reported in aggregate.

OPTIONAL Name:

Demographic information: Age: Gender: Rank: student, postdoc, Assist, Associate, Prof

POST-test of Knowledge, Skills, and Attitudes:

Define Structured review:

Define Ad hoc review:

List steps in performing a Structured review:

How to select Inclusion and Exclusion criteria:

How to evaluate Weight of Evidence:

How to synthesize Evidence:

With regard to Structured Reviews, from each pair of terms, circle the feeling that is Most Accurate

First pair: Nervous Excited

Second pair: Cautious Prepared

Third pair: Optimistic Foolish

For this workshop, what was:

BEST:

WORST:

How to IMPROVE?