Protocol

1. PROJECT TITLE: Maximizing HPV: Real-time Reminders, Guidance, and Recommendations, Part 4: Feasibility Trial

2. INVESTIGATOR(S)

Principal Investigator: Stephanie A. S. Staras, PhD, MSPH
Co-Investigator: Elizabeth Shenkman, PhD
Co-Investigator: Lindsay Thompson, MD, MS
Co-Investigator: Michael J. Muszynski, MD, FAAP
Co-Investigator: William Hogan, MD, MS
Co-Investigator: Matt Gurka, PhD

3. ABSTRACT:

Human papillomavirus (HPV) vaccines have the potential to prevent an average of 26,900 cancer cases each year in the United States, but vaccine coverage rates remain low. We have developed an electronic application, Protect Me 4, to help parents and providers assess needed vaccines and have more productive conversations about them. Our long-term goal is to increase HPV vaccination rates among boys and girls within the diverse settings of real-world primary care in the United States. The objective of this protocol is to evaluate the feasibility of Protect Me 4. The central hypothesis is that the app will be feasible and show potential for efficacy within real-world clinics. The specific aims of this study are to: (1) evaluate the feasibility of implementing Protect Me 4, and (2) estimate the preliminary efficacy of Protect Me 4 to increase HPV vaccine initiation (receipt of the first dose). We will perform a group-randomized trial of seven clinics from the OneFlorida Cancer Control Network. The proposed research is significant because the feasibility test will allow us to identify areas for further improvement and demonstrate implementation of Protect Me 4. Upon completion of this protocol, we will be well prepared to submit a competitive proposal (e.g., an NIH R01) for a full-scale randomized efficacy trial of Protect Me 4.

4. BACKGROUND:

The available human papillomavirus (HPV) vaccines have the potential to prevent up to 17,600 cancer cases among women and 9,300 cancer cases among men each year in the United States.1 As of 2016, only 65% of girls and 56% of boys aged 13-17 years had received at least one dose of (i.e., initiated) the HPV vaccine series.2 In 2014, the President's Cancer Panel identified increasing HPV vaccination rates as the number one national priority for cancer prevention.3 Low HPV vaccination rates in the United States are primarily due to missed clinical opportunities (a clinic visit where the patient is eligible for vaccines but remains unvaccinated).3-6 Rates of the other recommended adolescent vaccines [the tetanus/diphtheria/acellular pertussis (Tdap) vaccine and the meningococcal conjugate (MenACWY) vaccine] are 22 to 28 percentage points higher than rates for the HPV vaccine,2 indicating that adolescents are visiting doctors and receiving vaccines, just not the HPV vaccine. Because provider recommendations are the strongest predictor of HPV vaccination (odds ratios range from 3 to 18),6-8 interventions are needed to address provider barriers to recommending the HPV vaccine.

Protect Me 4 was developed using theory- and evidence-based strategies to prime parents for vaccine conversations with providers, remind providers about due vaccines, and inform providers about parent hesitations in real time. For parents, the Protect Me 4 app will query the Florida state immunization registry and the Medicaid and Children's Health Insurance Program (CHIP) claims databases to retrieve a list of adolescent vaccines that are due for their child. Vaccines included are the four recommended for 11- to 12-year-olds: Tdap, MenACWY, HPV, and seasonal influenza. Protect Me 4 assesses parents' interest in receiving each due vaccine. For parents who are hesitant about any of the vaccines, the Protect Me 4 app provides tailored responses to common concerns, addressing Health Belief Model concepts.9 For each participating patient, providers will use Protect Me 4 to review due vaccines, any reported parent hesitancy, and tips to address parent-specific vaccine concerns. Providers will be able to review this information in real time before meeting with parents. The application has been approved by the UF privacy office and UF IT security (see miscellaneous attachments).

5. SPECIFIC AIMS

The study will be conducted at 7 OneFlorida clinics (UF Health-New Berlin, iKids Pediatrics, Halifax Health-Keech Pediatrics, Volusia Pediatrics, UF Health-Blanding, Clermont Pediatrics, and Orlando Health-Baldwin Park). The specific aims of this study are:

Aim 1: Evaluate the feasibility of implementing Protect Me 4 in community clinics. We will randomly assign clinics to use Protect Me 4. Implementation will be maximized by six theory- and evidence-based quality improvement techniques (i.e., practice facilitation, external and internal provider peer opinion leaders, provider audit and feedback with benchmarking, clinic-wide staff involvement, and provider incentives). To evaluate feasibility, we will measure reach, adoption, implementation, maintenance, and acceptability.

Aim 2: Test vaccination rate data collection strategies and estimate the preliminary efficacy of Protect Me 4 to increase HPV vaccine initiation (receipt of the first dose). We will collect and compare immunization records for adolescents attending participating clinics during the three months prior to the intervention, the three months of the supported intervention, and the three months of the unsupported intervention.

6. RESEARCH PLAN

6.1 Study Design We will perform a group-randomized trial of six pair-matched clinics (Table 1). The intervention is delivered at the clinic level within three randomly assigned clinics. Control clinics will receive the enhanced Protect Me 4 app following the maintenance period. Three of the intervention clinics were intentionally involved in the development of Protect Me 4 to maximize applicability and effectiveness. To prepare for a larger efficacy trial in which clinics will participate in the intervention without being involved in the development, we selected an additional clinic to participate in the intervention. This fourth intervention clinic, a demonstration clinic, will not be included in the main efficacy analysis but will be evaluated to estimate the typical clinic experience. During the implementation period (months 4-6), intervention clinics will receive the Protect Me 4 app and quality improvement strategies (practice facilitation, external and internal provider peer opinion leaders, and provider incentives). During the maintenance period (months 7-9), intervention clinics will receive Protect Me 4 without external support (practice facilitation and external provider peer opinion leaders). For evaluation, vaccination records from Medicaid and CHIP claims and Florida Immunization Registry data will be grouped into three-month periods.

6.2 Study Population Participants will be providers, parents, and 11- to 12-year-olds who attend our study clinics during the nine-month study period. We recruited seven clinics in three North and Central Florida metropolitan areas to participate in Protect Me 4. Stratifying by area (Jacksonville, Orlando, Daytona Beach), we randomly assigned one clinic in each area to the immediate intervention and one to the delayed intervention. The seventh clinic is located in Orlando and is assigned to the intervention. All seven clinics provide vaccines in house, have providers who participate in Vaccines for Children, and see approximately 300 to 700 adolescents aged 11 to 12 years each year.

6.3 Study Recruitment and Enrollment Participants will be recruited from the four intervention clinics. Providers will complete paper consents. Parents will use the iPad to review the consent and provide an electronic consent. Adolescents will also assent on the iPad.

6.3.1 Providers. All providers at the intervention clinics are known associates of the study staff based on their participation in the prior phases of this study. If any new providers begin at the clinics, the clinic peer leaders will introduce them to study staff. Study staff will explain the consent forms, and written informed consent will be obtained from providers prior to study inclusion either through in-person one-on-one meetings or at a group meeting. Providers will have an opportunity to ask questions of study staff in semi-private rooms. The provider consent will cover participation in the Provider Participant Survey, the Provider Feedback Survey, and the app, as well as linkage of these data with each other, with participation in app-related Maintenance of Certification activities, with the percentage of their 11- to 12-year-old patients participating in the app, and with their HPV vaccination rates.

6.3.2 Parents. Clinic staff will invite all parents of 11- to 12-year-old adolescents visiting the clinic during the recruitment period who are scheduled to see a participating provider to participate. Clinic staff will verify the patient and parent identities following their standard clinic practices for checking in patients for appointments; this will ensure that the participant is the parent or LAR of the adolescent. Once the clinic staff verifies the parent's identity, the clinic staff enters the first three letters of the child's last name and date of birth into the app. The system creates a linked PIN. The clinic staff gives the parent the PIN, an informational flyer, and the iPad. To enter the app, the parent enters the child's first and last name, gender, and date of birth, enters the associated PIN, and selects their doctor's name from the clinic list. For children verified as 11 to 12 years old, the app presents an electronic consent. After reading the consent, the parent will agree to participate by typing in their name and clicking the "Yes, I agree to participate" option (see Parent app survey 1-2-18). Adolescents will assent by clicking "Yes, I agree to participate." Children outside the age range are told they are ineligible, and their contact information is not retained. If the parent does not consent or the adolescent does not assent, the contact information is discarded. Study staff are not present during the consent process but may be reached at the phone number indicated on the study flyer. Parents will have the option to complete a paper consent. The app will instruct parents to call the study staff number listed on the same screen as the electronic consent. They will then be consented over the phone and instructed to indicate "consented on paper" on the electronic consent. Completed paper consents will be kept in a secure drop box in the intervention clinics.

6.3.3 Adolescents. Because adolescent vaccination data are used in the study, adolescents will assent to participate in the app.

6.4 Protect Me 4 Application. Upon parent consent and adolescent assent, Protect Me 4 queries in real time the state immunization registry (Florida SHOTS) and a static file of Florida Medicaid and CHIP claims (updated as data become available, but no more than monthly) to identify the adolescent's vaccine history. If multiple children are identified, the app asks for the mother's name, the father's name, and the adolescent's current city of residence. If the additional information does not identify one person, the app stores the information for the provider and tells the parent to talk to their doctor. Once vaccine records are identified, the parent is shown the list of due vaccines and asked which vaccines they are hesitant about their child receiving. For each vaccine the parent is hesitant about, the app asks the parent to select all reasons for hesitation from a list of the five most common. For each selected concern, the app provides tailored educational information. Finally, the app asks parents which vaccines they intend to have their child receive that day. The final screen displays a color based on the parent responses to provide quick information to clinic staff: green = wants all due vaccines, yellow = hesitant about some vaccines, and red = no vaccines due. In real time, providers can assess the due vaccines and parent responses for patients at their clinics. The provider is notified whether each vaccine is due, whether the parent reported hesitations, and the parent's reported reasons. For each parent-indicated hesitation topic, a brief discussion tip is given. The app then asks providers which vaccines they intend to offer that day, whether they intend to use the discussion tips, and whether they intend to schedule follow-up HPV doses.
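To make the staff-facing summary screen concrete, the short sketch below illustrates the color logic described above. It is a minimal illustration only: the field names (due_vaccines, hesitant_vaccines) and the Python representation are assumptions for this sketch, not the app's actual implementation, and "wants all due vaccines" is approximated here as "no hesitant vaccines."

    from dataclasses import dataclass
    from typing import List

    @dataclass
    class ParentResponse:
        due_vaccines: List[str]        # vaccines the registry/claims query flagged as due
        hesitant_vaccines: List[str]   # due vaccines the parent marked as hesitant

    def summary_color(resp: ParentResponse) -> str:
        """Staff-facing summary color described in Section 6.4:
        green = wants all due vaccines, yellow = hesitant about some vaccines,
        red = no vaccines due."""
        if not resp.due_vaccines:
            return "red"
        if resp.hesitant_vaccines:
            return "yellow"
        return "green"

    # Example: Tdap and HPV due, parent hesitant about HPV -> "yellow"
    print(summary_color(ParentResponse(["Tdap", "HPV"], ["HPV"])))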

6.5 Quality Improvement Strategies. Based on Social Cognitive Theory,10 we selected six evidence-based quality improvement techniques: practice facilitation, external and in-clinic peer opinion leaders, audit and feedback with benchmarking, clinic-wide staff involvement, and provider incentives.11-16 We will track each of these activities to measure time and cost and to evaluate any clinic differences.

6.5.1 Practice facilitation. Practice facilitators triple the odds of clinics adopting evidence-based guidelines and increase the likelihood of sustained change.12,17-19 Our practice facilitator will play a crucial role in three Social Cognitive Theory concepts:10 increasing provider and clinic staff self-efficacy through mastery training in Protect Me 4 app use; increasing collective efficacy through verbal persuasion; and improving facilitation with technical assistance. Each intervention clinic is assigned one practice facilitator, who will assist the clinic every week with alternating onsite and virtual encounters. Visits will include training for Protect Me 4, remedial training, troubleshooting of any implementation issues, providing support, and reports of app use. Additionally, the practice facilitator will collect the fidelity measurements described below.

6.5.2 External Peer Opinion Leader. External peer opinion leaders increase compliance with guidelines by 5% and are critical to clinical decision support implementation.13,20,21 Dr. Thompson, co-investigator, will serve as the project's external peer leader because she is an active physician, a strong supporter of the Protect Me 4 app, and an effective communicator.11,21 Dr. Thompson will provide educational information at the kick-off meeting, hold monthly video or telephone conferences with intervention clinics, and routinely contact providers to address challenges, share successes, and provide education.

6.5.3 In-Clinic Provider Opinion Leader. Identified as essential to quality improvement interventions,11,22 in-clinic provider opinion leaders increase desired practice compliance by 12%.13 For each intervention clinic, we will select one provider opinion leader. The peer leader will meet with the practice facilitator, encourage staff participation, and meet with the external peer opinion leader.

6.5.4 Audit and Feedback with Benchmarking. Audit and feedback (reviewing and reporting clinician performance over a period of time) is an effective technique for improving provider guideline adherence, is improved with benchmarking, and is recommended by the Community Guide to increase vaccination rates.14,23,24 Addressing self-regulation and observability, we will give providers monthly written audit and feedback reports of their adolescent vaccination performance benchmarked against paired clinic means.10,25-27 Practice facilitators will help clinic staff create audit-feedback reports by teaching staff to create reports using data in Florida SHOTS and their EHR systems.

6.5.5 Clinic-Wide Staff Involvement. High collective efficacy and high levels of staff involvement improve quality improvement initiatives.15,24 We will involve clinic staff in Protect Me 4. Providers and clinic staff will be invited to a kick-off meeting consisting of HPV vaccine education presented by the external peer opinion leader or PI, an overview of Protect Me 4, and an overview of support resources (peer mentoring, provider incentives, and practice facilitation).26

6.5.6 Provider incentives.
Consistent with Social Cognitive Theory, incentives are effective in improving vaccination rates.10,16 To remain specialty certified, providers must complete the Maintenance of Certification (MOC) process that includes completion of an approved quality improvement project. Certification influences earning power and is often a requirement for hospital privileges.27,28 The OneFlorida MOC coordinator will work with interested providers at the participating clinics to complete the MOC certification.

6.6 Aim 1: Evaluate the feasibility of implementing Protect Me 4 in a variety of community clinics. To estimate the feasibility of conducting an efficacy trial of Protect Me 4, we will collect and evaluate several measurements of feasibility following the RE-AIM framework.29

6.6.1 Feasibility measurements. Each week, practice facilitators will meet (either in person or by teleconference) with the clinic staff who are assigned to hand the iPads to the parents (i.e., front office staff, nurses, or nurse practitioners). Practice facilitators will aid clinic staff in preparing the recruitment tracking log for study enrollment (see recruitment log) by adding appointment day of week, time, provider, and patient demographics (age, sex, and race/ethnicity). As patients come into the clinic, the clinic staff will complete the remainder of the log. The practice facilitator will aid the clinic staff in reconciling the recruitment log with the prior week's visits. The practice facilitator and the clinic staff will complete the weekly summary.

6.6.1.1 Reach. To measure reach, we will use the recruitment tracking logs to evaluate the percentage of parents of 11- to 12-year-old patients who are: (1) offered the brochure, (2) offered the use of the Protect Me 4 app, and (3) given a PIN. Our goal is to have 80% of the parents of 11- to 12-year-olds offered the Protect Me 4 app. We will assess differences in offers by demographics. We will use the clinic recruitment log to evaluate clinic staff perceptions of parents' reported reasons for non-participation.

6.6.1.2 Adoption. We will evaluate provider-level adoption of the Protect Me 4 app. By comparing the number of providers who log in to the app with the number of participating providers, we can estimate the percentage of providers using the app. We will consider the app feasible if 80% or more of providers use the app during the study. We will also assess the percentage of parents who complete the Protect Me 4 app for each provider, using the recruitment log and app statistics. Our goal is to have 70% of parents complete the app for each provider. We will assess differences by provider characteristics obtained from the audit/feedback report and the Pre-Intervention Survey.

6.6.1.3 Implementation. We will assess provider- and parent-level implementation. We will use the Protect Me 4 app to evaluate the percentage of a provider's patients (using the parent-reported provider) for whom the provider logs into the app and looks at the shots list. We will consider the app feasible if providers view the shots list for 60% of their patients whose screens indicate that shots are due (green or yellow). We aim to have providers look at the specific hesitations reported by parents 60% of the time they log in to the app and hesitations are reported. The app also asks providers to report whether they intend to use the discussion tips provided and intend to offer the due vaccines. We aim to have providers report using the discussion tips approximately half the time they are provided and offering the HPV vaccine 75% of the time it is due. We will evaluate the number of pages viewed, the average time spent on each page, and the total time spent looking at the app per patient. Further, because the MOC project incentive promotes use of the Protect Me 4 app, we will assess whether the implementation outcomes differ by MOC participation. Additionally, we will assess differences by provider characteristics from the Pre-Intervention Survey and the audit/feedback report.

We will evaluate parent-level implementation with tracking data available in the Protect Me 4 app. Among parents who start the app, we will assess the percentage who successfully enter the app (enter a matching PIN and child information), the percentage who consent, and the percentage whose children assent. Our goal is to have 95% of people who start the app successfully enter the app and 80% agree to participate (consent and assent). Among the participants, we will assess the percentage who successfully retrieve shot information, receive a thank-you screen, and view available vaccine educational information in the Protect Me 4 app (when applicable). We will evaluate the number of pages viewed, the average time spent on each page, and the total time spent looking at vaccine information. We will evaluate differences in implementation by the child's age and gender.

6.6.1.4 Maintenance. During months 7-9, we will assess clinic and provider maintenance of Protect Me 4 app use. Study staff support will be removed during the final months of the project. First, we will assess whether clinics and providers use the app during each month with the tracking data from the Protect Me 4 app. Second, we will ask clinics to continue to use the recruitment tracking log to aid and evaluate recruitment. Among clinics that use the app, we will evaluate the percentage of parents of 11- to 12-year-olds who are offered and use the app with data from the recruitment log and enrollment in the app. If clinics do not complete the recruitment log, the practice facilitator will work with the clinic staff to extract from the EHR the number of 11- to 12-year-olds who visited the clinic each month, by provider. Practice facilitators will also record questions and support time for each clinic.

6.6.1.5 Acceptability. Following Diffusion of Innovations Theory,25 the relative advantage and the workflow incorporation of the Protect Me 4 app for providers are essential to implementation. Thus, at the end of both the implementation and maintenance periods, we will ask providers to complete the Post-Intervention Survey. Survey answers will be linked to the provider's app use statistics.

6.6.2 Feasibility analysis. To analyze reach, adoption, implementation, and maintenance, percentages for intervention clinics will be compared to the pre-set thresholds and between the active and maintenance periods with chi-square tests.
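As an illustration of the planned feasibility analysis, the sketch below compares an observed percentage against a pre-set threshold and compares app completion between the supported and maintenance periods using chi-square tests. All counts are hypothetical placeholders, and the actual analysis will use the study tracking data (and may be run in SAS rather than Python).

    from scipy.stats import chisquare, chi2_contingency

    # Threshold comparison: were at least 80% of eligible parents offered the app?
    # (one-sample chi-square against the pre-set threshold; counts are hypothetical)
    offered, eligible = 170, 200
    observed = [offered, eligible - offered]
    expected = [0.80 * eligible, 0.20 * eligible]
    chi2_thresh, p_thresh = chisquare(observed, f_exp=expected)

    # Period comparison: did parent app completion differ between the supported
    # intervention (months 4-6) and maintenance (months 7-9) periods?
    table = [[150, 50],   # supported period: completed, did not complete (hypothetical)
             [110, 90]]   # maintenance period: completed, did not complete (hypothetical)
    chi2_period, p_period, dof, _ = chi2_contingency(table)

    print(f"threshold test p = {p_thresh:.3f}; period comparison p = {p_period:.3f}")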

6.7 Aim 2: Test vaccination rate data collection strategies and estimate the preliminary efficacy of Protect Me 4 to increase HPV vaccine initiation (receipt of the first dose). To measure the ability of the app to increase adolescent immunization, we will collect and compare de-identified adolescent immunization records between intervention and delayed-intervention clinics.

6.7.1 Honest broker procedures. All seven clinics will provide the Honest Broker with identifiable data for 11- to 12-year-olds who see a participating provider at the clinic during the 9-month study period. Clinic-provided datasets will include the child's name, date of birth, and address; visit dates during the study period; the name of the provider visited; type of visit (acute or well); gender; race; ethnicity; and insurance type. The study team will provide the Honest Broker with a list of HIT system participants and a subset of use variables (app final color, app parent HPV hesitation topics selected, parent total time spent with the app, number of vaccines the parent was hesitant about, provider use of the app, provider viewing of hesitations, and provider intention to discuss or use tips). The Honest Broker will link the clinic records, the HIT participation date, and vaccinations from the Florida Medicaid and CHIP claims or the Florida Immunization Registry data. De-identified data for all clinics will be returned to the study team. The child's age will be provided as 11 or 12 years at the time of the visit. The Honest Broker will create a coding system to de-identify dates but maintain the time relationships.

6.7.2 Efficacy measurements. All vaccines recommended for 11- to 12-year-olds will be collected from the Florida Medicaid and CHIP claims or the Florida Immunization Registry data using Current Procedural Terminology (CPT) and vaccine administered (CVX) codes (Table 2). Our main outcome measure is HPV vaccine initiation. Our secondary outcomes, including potential unintended effects on other vaccines, are receipt of the second HPV dose, Tdap, MenACWY, and seasonal influenza vaccine (when applicable).

6.7.3 Statistical analysis. To evaluate effectiveness at the individual level, we will compare vaccination rates between eligible adolescents who visited intervention and control clinics with multivariable logistic regression. Potential confounders include the adolescent's age, race/ethnicity, gender, and Medicaid/CHIP status. We will adjust for differences between clinics using clinic as a fixed effect. We will estimate within-clinic and between-clinic variation to improve power calculations for the larger trial. Multiple visits by the same child will be handled by reducing the data to whether or not the child ultimately received the vaccine. At the clinic and provider level, we will test differences in vaccination rates between study periods using chi-square tests and compare vaccination rate changes between intervention and control clinics with difference-in-difference tests to adjust for temporal changes. Analyses will be performed in SAS software version 9.4 (SAS Institute, Inc., Cary, NC). We estimated the lowest detectable difference between intervention and control clinics for HPV vaccine initiation with G*Power 3.1. Based on our prior study,30 we assumed an HPV vaccine initiation rate of 0.05 in the control group and a sample size of 300. Assuming an alpha of 0.05 and statistical power of 0.8, we can reliably detect an odds ratio of 2.0, which is sufficient based on the HIT system's potential efficacy (odds ratio ≥ 3.0).30
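To illustrate the clinic-level difference-in-difference comparison described above, the sketch below computes the estimate from pre- and post-period initiation rates. The rates are hypothetical placeholders used only to show the calculation; the actual analysis will be performed in SAS 9.4 with the study data.

    # Clinic-level difference-in-difference sketch with hypothetical HPV initiation rates
    rates = {
        ("pre",  "intervention"): 0.05,   # hypothetical pre-intervention rate
        ("post", "intervention"): 0.12,   # hypothetical supported-intervention rate
        ("pre",  "control"):      0.05,   # hypothetical pre-period control rate
        ("post", "control"):      0.07,   # hypothetical same-period control rate
    }

    change_intervention = rates[("post", "intervention")] - rates[("pre", "intervention")]
    change_control = rates[("post", "control")] - rates[("pre", "control")]

    # The difference-in-difference estimate nets out temporal change shared by both arms
    did = change_intervention - change_control
    print(f"Difference-in-difference estimate: {did:+.2%}")   # prints +5.00%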
6.7.4 Exploratory analysis. To better understand our app, we will examine the details of the app's efficacy. Among users of the app, we will evaluate differences in HPV vaccine initiation by app final color, app parent HPV hesitation topics selected, parent total time spent with the app, number of vaccines the parent was hesitant about, provider use of the app, provider viewing of hesitations, and provider intention to discuss or use tips.

7. POSSIBLE DISCOMFORTS AND RISKS: The proposed research does not present greater than minimal risk as defined by 45 CFR 46 (i.e., risks no greater than those encountered during routine clinical or psychological examinations). There are no physical risks to the participants posed by this study. There is a slight risk that participant information could be revealed inappropriately or accidentally. Depending on the nature of the information revealed, such a release could upset or embarrass participants. To minimize the risk of a data breach, we will take extensive security precautions during data collection, data storage, and communication with participants. The iPad Protect Me 4 app has been reviewed and approved by UF-IT security and the UF privacy office (see miscellaneous attachments). The app uses secure HTTPS protocols to query the Florida immunization registry and the Medicaid and CHIP databases. The Protect Me 4 application is hosted on secure UF servers housed within the University of Florida firewall. No data will be stored on the iPad. All data will be kept confidential and stored on secure UF servers, and access to the data will be limited to the minimum staff necessary. Individual information will not be released to anyone outside the project and will be accessible only to the principal investigator and the research team.

The University of Florida (UF) information technology (IT) Security Office is responsible for the development and maintenance of the Health Science Center (HSC) Information Security Program. The UF-HSC IT Department is responsible for adapting, supplementing as necessary, and implementing the Information Security Program for ICHP. Each component is reviewed internally at least biennially (every two years) or as necessary due to environmental, operational, or Information Security Program changes. ICHP maintains optimal security practices through the services and expertise of the UF-HSC IT Department. Specifically, the UF-HSC IT Department has experts on the protection of health data and HIPAA compliance across multiple platforms, systems, and applications. The servers supporting the ICHP computing environment are connected to a Cisco-based IP network protected by a Palo Alto high-performance firewall that provides both application- and port-based security. The firewall provides connectivity to the AHC enterprise network. The UF-HSC IT Security Office and the overall UF IT Security Office conduct audits and perform network and vulnerability scanning and alerting. The University of Florida has a campus-wide Intrusion Detection Service that monitors all traffic leaving the University. Network sweeps are performed using Nessus to look for vulnerabilities to address in cooperation with HSC system administrators. Risk assessments are performed on an ongoing and scheduled basis following a process outlined in the IS Audit Methodology document. Each server is individually protected by a host-based firewall that provides local, port-based access control.
The configuration of all host-based firewalls is managed centrally through either Active Directory GPOs or Trend's Deep Security Console. A VPN encrypts data between remote devices and the secure network. A site-to-site VPN and Globalscape Secure FTP architecture provide security for bulk data transfers. Physical access to the HSC Data Center is monitored 24/7 via a video recording security system. Full-motion video for all data center cameras is retained for 15 days. Data Center personnel administer and audit physical access using the University of Florida's enterprise LENEL access management system. Wiring closets have key-specific locks, and all equipment is protected by software that allows only devices approved and maintained by the institution to access the network. Trend OfficeScan protects ICHP Windows servers from viruses and spyware. Workstations are protected from viruses and malware via TrendMicro's Corporate Edition OfficeScan Agent. Lumension Endpoint Management and Security Suite software provides automated deployment of Windows security updates and auditing of server patch compliance. The Linux server patching process uses a combination of Puppet configuration management software and shell scripts to download a list of pending patches per system.

8. POSSIBLE BENEFITS: Parents may benefit from taking part in this research study by learning the answers to their hesitations about adolescent vaccines and, therefore, being better able to make decisions about vaccinating their own adolescent children. Furthermore, participating in the study may help parents discuss vaccines with their child's health care provider. Additionally, adolescents may benefit if the app increases their chances of receiving the recommended vaccines. Providers will receive no direct benefit to their health but could learn about better ways to talk to their clients about adolescent vaccination.

9. CONFLICT OF INTEREST: There are no conflicts of interest for the PI or any co-investigators.

10. LITERATURE CITED:

1. Markowitz LE, Dunne EF, Saraiya M, et al. Human papillomavirus vaccination: recommendations of the Advisory Committee on Immunization Practices (ACIP). MMWR Morb Mortal Wkly Rep. 2014;63(RR-05):1-30.
2. Walker TY, Elam-Evans LD, Singleton JA, et al. National, Regional, State, and Selected Local Area Vaccination Coverage Among Adolescents Aged 13-17 Years - United States, 2016. MMWR Morb Mortal Wkly Rep. 2017;66:874-882.
3. Rimer BK, Harper H, Witte ON. Accelerating HPV Vaccine Uptake: Urgency for Action to Prevent Cancer. A Report to the President of the United States from the President's Cancer Panel. Bethesda, MD: National Cancer Institute; 2014.
4. Stokley S, Jeyarajah J, Yankey D, et al. Human Papillomavirus Vaccination Coverage Among Adolescents, 2007-2013, and Postlicensure Vaccine Safety Monitoring, 2006-2014 - United States. MMWR Morb Mortal Wkly Rep. 2014;63(29):620-624.
5. Vadaparampil ST, Kahn JA, Salmon D, et al. Missed clinical opportunities: provider recommendations for HPV vaccination for 11-12 year old girls are limited. Vaccine. 2011;29(47):8634-8641.
6. Bynum SA, Staras SAS, Malo TL, Giuliano AR, Shenkman E, Vadaparampil ST. Factors associated with Medicaid providers' recommendation of the HPV vaccine to low-income adolescent girls. J Adolesc Health. 2014;54(2):190-196.
7. Ylitalo KR, Lee H, Mehta NK. Health care provider recommendation, human papillomavirus vaccination, and race/ethnicity in the US National Immunization Survey. Am J Public Health. 2013;103(1):164-169.
8. Rosenthal SL, Weiss TW, Zimet GD, Ma L, Good MB, Vichnin MD. Predictors of HPV vaccine uptake among women aged 19-26: importance of a physician's recommendation. Vaccine. 2011;29(5):890-895.
9. Glanz K, Rimer BK, Viswanath K, Champion V, Skinner CS. Chapter 3: The Health Belief Model. In: Health Behavior and Health Education: Theory, Research, and Practice. 4th ed. John Wiley & Sons; 2008.
10. Bandura A. Social Foundations of Thought and Action: A Social Cognitive Theory. Englewood Cliffs, NJ: Prentice-Hall; 1986.
11. Ash JS, Stavri PZ, Dykstra R, Fournier L. Implementing computerized physician order entry: the importance of special people. Int J Med Inform. 2003;69(2-3):235-250.
12. Baskerville NB, Liddy C, Hogg W. Systematic review and meta-analysis of practice facilitation within primary care settings. Ann Fam Med. 2012;10(1):63-74.
13. Flodgren G, Parmelli E, Doumit G, et al. Local opinion leaders: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2011;(8):CD000125.
14. Jamtvedt G, Young JM, Kristoffersen DT, O'Brien MA, Oxman AD. Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2006;(2):CD000259.
15. Hogg W, Baskerville N, Nykiforuk C, Mallen D. Improved preventive care in family practices with outreach facilitation: understanding success and failure. J Health Serv Res Policy. 2002;7(4):195-201.
16. Flodgren G, Eccles MP, Shepperd S, Scott A, Parmelli E, Beyer FR. An overview of reviews evaluating the effectiveness of financial incentives in changing healthcare professional behaviours and patient outcomes. Cochrane Database Syst Rev. 2011;(7):CD009255.
17. Hogg W, Lemelin J, Moroz I, Soto E, Russell G. Improving prevention in primary care: evaluating the sustainability of outreach facilitation. Can Fam Physician. 2008;54(5):712-720.
18. Stange KC, Goodwin MA, Zyzanski SJ, Dietrich AJ. Sustainability of a practice-individualized preventive service delivery intervention. Am J Prev Med. 2003;25(4):296-300.
19. Meropol SB, Schiltz NK, Sattar A, et al. Practice-tailored facilitation to improve pediatric preventive care delivery: a randomized trial. Pediatrics. 2014;133(6):e1664-1675.
20. O'Brien MA, Rogers S, Jamtvedt G, et al. Educational outreach visits: effects on professional practice and health care outcomes. Cochrane Database Syst Rev. 2007;(4):CD000409.
21. Byrne C, Sherry D, Mercincavage L, Johnston D, Pan E, Schiff G. Key Lessons in Clinical Decision Support Implementation. Department of Health and Human Services.
22. Abbott PA, Foster J, Marin Hde F, Dykes PC. Complexity and the science of implementation in health IT: knowledge gaps and future visions. Int J Med Inform. 2014;83(7):e12-e22.
23. Mayne SL, duRivage NE, Feemster KA, Localio AR, Grundmeier RW, Fiks AG. Effect of decision support on missed opportunities for human papillomavirus vaccination. Am J Prev Med. 2014;47(6):734-744.
24. Schierhout G, Hains J, Si D, et al. Evaluating the effectiveness of a multifaceted, multilevel continuous quality improvement program in primary health care: developing a realist theory of change. Implement Sci. 2013;8:119.
25. Rogers EM. Diffusion of Innovations. 4th ed. New York: Simon and Schuster; 2010.
26. Knox L, Taylor EF, Geonnotti K, et al. Developing and Running a Primary Care Practice Facilitation Program: A How-To Guide (Prepared by Mathematica Policy Research under Contract No. HHSA290200900019I TO 5). Rockville, MD: Agency for Healthcare Research and Quality; December 2011. AHRQ Publication No. 12-0011.
27. Freed GL, Dunham KM, Gebremariam A. Changes in hospitals' credentialing requirements for board certification from 2005 to 2010. J Hosp Med. 2013;8(6):298-303.
28. Gray B, Reschovsky J, Holmboe E, Lipner R. Do early career indicators of clinical skill predict subsequent career outcomes and practice characteristics for general internists? Health Serv Res. 2013;48(3):1096-1115.
29. Glasgow RE, Vogt TM, Boles SM. Evaluating the public health impact of health promotion interventions: the RE-AIM framework. Am J Public Health. 1999;89(9):1322-1327.
30. Staras SA, Vadaparampil ST, Livingston MD, Sanders AH, Shenkman E. Increasing Human Papillomavirus Vaccine Initiation Among Publicly Insured Florida Adolescents. J Adolesc Health. 2015;56(5 Suppl):S40-S46.