First PDSA Cycles – Improvement Advisor Program, NES IA Wave 33

Wednesday, 26th March 2014

2:00 – 3:30 pm, GMT

Call time amended due to the start of US daylight saving time.

Please Check In: IA Wave 33

April Masson – EYC, East Ayrshire CPP
Gavin Russell – EYC, East Renfrewshire CPP
Michelle Affleck – EYC, NHS Greater Glasgow and Clyde
David Maxwell – Healthcare Improvement Scotland
Graham MacKenzie – EYC, NHS Lothian
Michelle Cochlan – EYC, Perth and Kinross Council
Dawn Moss – EYC, NHS Borders
Hamish Fraser – EYC, Midlothian CPP
Penny Bond – Healthcare Improvement Scotland
Derek Kilday – Scottish Government
Judith Cain – EYC, North Lanarkshire Council
Sacha Will – EYC, Aberdeen City Council
Diana Beveridge – Scottish Government
Kerstin Jorna – EYC, Dundee City Council
Sally Hall – NHS Scotland
Donna Murray – EYC, City of Edinburgh Council
Kirsty Ellis – Healthcare Improvement Scotland
Shalani Raghavan – Scottish Government
Eileen McGinley – NHS Lanarkshire
Marie-Claire Stallard – EYC, East Dunbartonshire CPP
Stephanie Frearson – NHS Ayrshire & Arran
Emma Levy – NHS Education Scotland
Marsha Scott – EYC, West Lothian Council
Stephanie Mottram – NHS Dumfries and Galloway Royal Infirmary
Gareth Adkins – Healthcare Improvement Scotland
Michele Dowling – EYC, South Lanarkshire Council
Wendy Toner – EYC, NHS Greater Glasgow and Clyde

[Map of Scotland showing the NHS boards:]
1. NHS Ayrshire & Arran
2. NHS Borders
3. NHS Dumfries & Galloway
4. NHS Fife
5. NHS Forth Valley
6. NHS Grampian
7. NHS Greater Glasgow & Clyde
8. NHS Highland
9. NHS Lanarkshire
10. NHS Lothian
11. NHS Shetland
12. NHS Orkney
13. NHS Tayside
14. NHS Western Isles
15. Golden Jubilee National Hospital

IA Program Faculty and Staff Team: NES Wave 33
IHI Faculty and Staff

Dr. Robert Lloyd – Lead Faculty
Brandon Bennett – Faculty (WS1)
Sandy Murray – Faculty (WS2)
Lloyd Provost – Faculty (WS3)
Beth O'Donnell – Program Mgr.
Debbie Ray – Faculty/Director

NHS Education for Scotland
Dr. Lesley Anne Smith – QI Programme Director
Dr. Elaine Pacitti – Educational Projects Mgr.
Louise Cavanagh – QI Project Officer
Samantha Smith – QI Administrator

IHI Support Staff for Wave 33
Brian Sanderson, Project Assistant – [email protected]
Tom Charlton, Project Assistant – [email protected]

IA Graduates and Their Role during the Program

Assist your professional development by serving as teacher, coach, and fellow learner.

Bernadette McCulloch – Scottish Patient Safety Programme, Maternity & Children Quality Improvement Collaborative (IA Wave 15)
Laura Allison – DG Health and Social Care, Scottish Government (IA Wave 28)

Agenda – 26th March 2014 – "PDSAs"

Time | Agenda | Lead

2:00 pm GMT | Welcome and Check-In | Debbie
2:10 pm GMT | Paired Charter Reviews – Assignment Revisited and Insights from the Exercise | Debbie
2:15 pm GMT | Paired PDSA Review Assignment; Preparation for SPC Assignment and upcoming WebEx on SPC (23rd April 2014) | Debbie
2:25 pm GMT | Presentation of PDSA Cycles (10 min each) – learning from each other, questions, insights: Michelle Cochlan, Marie-Claire Stallard, Penny Bond | Bob
3:00 pm GMT | Accelerating the Rate of Improvement – Cascading Driver Diagrams, Driver Diagram Priority Setting | Bob
3:25 pm GMT | Additional Questions / Next Steps | Debbie

Reminder: Paired Charter Review – Due Friday, 21st March 2014
The purpose is to review a colleague's Project Charter, offering feedback for improvement, and to receive feedback for improvement on your own Project Charter.

1. Post your updated Project Charter to your Extranet homepage under the Newest Resources tab.

2. Send an email to your partner letting them know that you have posted your Charter.

3. Use the Charter Feedback form as a guide and the skills learned during IA Workshop 1 to offer feedback for improvement to your partner. You can send your feedback in an email or make edits to the Posted Charter making sure to save it as a separate document, leaving your partner’s original Charter intact.

Charter Review Pairings

April Masson, David Maxwell, Dawn Moss, Derek Kilday, Diana Beveridge, Donna Murray, Eileen McGinley, Emma Levy, Gareth Adkins, Gavin Russell, Graham MacKenzie, Hamish Fraser, Judith Cain, Kerstin Jorna, Kirsty Ellis, Marie-Claire Stallard, Marsha Scott, Michele Dowling, Michelle (Shelly) Affleck, Michelle Cochlan, Penny Bond, Sacha Will, Sally Hall, Shalani Raghavan, Stephanie Frearson, Stephanie Mottram, Wendy Toner

Insights: Conducting the Charter Review
1. How did you give feedback to your partner – email, notes on their charter, completed Charter Feedback Form?

Email / Notes on Charter / Feedback Form

2. What did you find useful about the Charter Feedback Form?

3. What did you learn about your own project as the result of reviewing your partner’s charter?

4. If you did not conduct your Paired Charter Review, what was the barrier – partner didn’t have a Charter posted, I didn’t understand the assignment, I couldn’t find time to do the assignment?

No Charter Posted / Didn't Understand / No Time

IA PDSA Review Assignment
Purpose: learn about, and provide feedback on, PDSA cycles. Time estimate: 20 minutes and 4 easy steps!
1. Make sure you have your next PDSA cycle posted to your Extranet team page.
2. Check to see who you are paired with (next slide).
3. Review your colleague's most recent PDSA cycle using the PDSA Cycle Feedback Form, making comments right on the Feedback Form (Extranet → Resources → Forms → PDSA Cycles).
4. Send an email to your colleague with the Feedback Form attached and copy Brian Sanderson at [email protected] by Thursday, 3rd April 2014.

Paired PDSA Review Assignment

Wendy Toner ↔ David Maxwell
Stephanie Mottram ↔ April Masson
Derek Kilday ↔ Stephanie Frearson
Shalani Raghavan ↔ Dawn Moss
Diana Beveridge ↔ Sacha Will
Sally Hall ↔ Donna Murray
Michelle Cochlan ↔ Eileen McGinley
Penny Bond ↔ Emma Levy
Michelle Dowling ↔ Gareth Adkins
Graham MacKenzie ↔ Marie-Claire Stallard
Michelle (Shelly) Affleck ↔ Gavin Russell
Kirsty Ellis ↔ Hamish Fraser
Judith Cain ↔ Kerstin Jorna
Marsha Scott

SPC Software Call: Set Up

• Next WebEx: Wednesday, 23rd April 2014, 3:00 – 5:00 pm GMT (note: this call is 2 hours).

• Prior to this call: email Brian Sanderson ([email protected]) to let us know which SPC software you have decided to use. We need this information from each of you. We're assuming QI Charts but need to verify successful installation.

• We will send a prompting email to which you can reply with this information.

SPC Software Call: Set Up (cont.)

• We will post to the Extranet (Resources → Action Period Materials → SPC Assignment):

An Excel database with data and an assignment to build:
– a run chart, and
– a Shewhart control chart
using your software and this data. You do not need to know control chart theory to complete this assignment; you'll be getting familiar with the Excel macro QI Charts.

• Please post your completed run chart and Shewhart control chart to your home page (chart imported to PowerPoint or Word) no later than Wednesday, 16th April 2014. Please title your document with your name, "Wave 33 SPC Assignment – Run or Shewhart Chart", and the date.
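The assignment itself is built around the QI Charts Excel macro, but if you prefer to script the same two charts, the minimal Python sketch below shows the idea (pandas and matplotlib assumed; the series and values are invented, so substitute the Excel database posted on the Extranet).

```python
# Minimal sketch of the two SPC assignment charts: a run chart and an
# individuals (XmR-style) Shewhart control chart. The data below are
# illustrative only; replace them with the Extranet dataset, e.g. via
# pd.read_excel("spc_assignment.xlsx").
import pandas as pd
import matplotlib.pyplot as plt

data = pd.Series([52, 55, 49, 61, 58, 63, 60, 57, 66, 62, 59, 64],
                 index=pd.period_range("2013-01", periods=12, freq="M"))

fig, (ax1, ax2) = plt.subplots(2, 1, figsize=(8, 6))

# Run chart: the data over time with the median as the centre line.
median = data.median()
ax1.plot(data.index.astype(str), data.values, marker="o")
ax1.axhline(median, linestyle="--", label=f"Median = {median:.1f}")
ax1.set_title("Run chart")
ax1.legend()

# Individuals control chart: centre line is the mean; limits are
# mean +/- 2.66 * average moving range (the usual XmR constant).
mean = data.mean()
moving_range = data.diff().abs().dropna()
limit = 2.66 * moving_range.mean()
ax2.plot(data.index.astype(str), data.values, marker="o")
ax2.axhline(mean, label=f"Mean = {mean:.1f}")
ax2.axhline(mean + limit, linestyle="--", color="red", label="UCL")
ax2.axhline(mean - limit, linestyle="--", color="red", label="LCL")
ax2.set_title("Individuals (Shewhart) control chart")
ax2.legend()

plt.tight_layout()
plt.show()
```

Either way, export the finished charts into PowerPoint or Word before posting, as described above.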

First PDSA Cycle Sharing!

Appreciation of a System
Building Knowledge
Human Side of Change
Understanding Variation
(IG Ch. 4, p. 77)

Chapter 7: Testing a Change

“Testing changes builds knowledge about the causal mechanisms at work in a system. A process of building knowledge emphasizes the importance of rational prediction. If during testing a prediction is incorrect, the theory that was used to generate the prediction must be modified.” p. 140

Our Observations

• Confusion about the purpose of the cycle versus the overall Aim
• The test is a small change to the system
• All PDSA cycles collect data – that doesn't make it the purpose of the cycle
• Qualitative data may be all you can get at first
• What questions are you trying to answer?
• Theory and predictions are required
• Scale – sometimes a test is too big! (1, then 3, then 5, then all)
• Record what happened (documentation)
• Study – many didn't go back and make sure each question/prediction had an answer in the Study
• Act – plan the next cycles
• Test under different conditions

The PDSA-o-meter! How Many PDSAs since WS 1?

6 or more / 5 / 4 / 3 / 2 / 1 / 0

Slide 18 Copyright © Institute for Healthcare Improvement Model for Improvement What are we trying to accomplish? . The activity was How will we know that a planned, including a change is an improvement? plan for collecting What change can we make that data. will result in improvement? . The plan was attempted. . Time as set aside to analyze the data and Act Plan study results. . Action was rationally based on what was Study Do learned. IG page 24

(IG page 97. Used with permission: Associates in Process Improvement.)

Repeated Use of the PDSA Cycle
Aim: reduce peri-op harm by 30%. Measure: peri-op harm rate. Changes that result in improvement: DVT prophylaxis, beta blockade, SSI interventions.
[Ramp of cycles: hunches, theories, and ideas → very small-scale test → follow-up tests → wide-scale tests of change → implementation of change. Example change tested: use clippers instead of shaving the site. Each test is a P-D-S-A cycle.]

Successful Cycles to Test Changes

Plan multiple cycles for a test of a change:
• Think a couple of cycles ahead
• Initially, scale down the size of the test (# of patients, clinicians, locations)
• Do not try to get buy-in or consensus for test cycles
• Test with volunteers
• Use temporary supports to facilitate the change during the test
• Be innovative to make the test feasible
• Collect useful data during each test
• In later cycles, test over a wide range of conditions
(IG p. 143)

Use a Concept Design: Multiple PDSA Cycle Ramps

[Diagram: one PDSA ramp per change concept – Triage, Diagnostic Testing, Fast Track Patients, Capacity/Demand, Flow.]

We have different PDSA Forms on the Extranet:

Short form:
MODEL FOR IMPROVEMENT    DATE ______
Objective for this PDSA Cycle:

Is this cycle used to develop, test, or implement a change? What question(s) do we want to answer on this PDSA cycle?

Plan: Plan to answer questions: Who, What, When, Where

Plan for collection of data: Who, What, When, Where

Predictions (for questions above based on plan):

Do: Carry out the change or test; Collect data and begin analysis.

Study: Complete analysis of data; compare the data to your predictions and summarize the learning.

Act: Are we ready to make a change? Plan for the next cycle
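For anyone keeping cycle documentation electronically rather than on the printed form, the short form maps naturally onto a simple record. The sketch below is illustrative only: the field names and the example cycle are ours, not the official Extranet template.

```python
# Hypothetical structure mirroring the short PDSA form; the field names are
# illustrative, not the official Extranet template.
from dataclasses import dataclass
from typing import List

@dataclass
class PDSACycle:
    date: str
    objective: str
    purpose: str            # "develop", "test", or "implement"
    questions: List[str]    # what this cycle is meant to answer
    predictions: List[str]  # one prediction per question
    plan: str               # who, what, when, where (incl. data collection)
    do: str = ""            # what actually happened; data collected
    study: str = ""         # compare results to predictions; summarise learning
    act: str = ""           # adopt, adapt, or abandon; plan for the next cycle

# Invented example cycle for illustration only.
cycle = PDSACycle(
    date="2014-03-26",
    objective="Test a bedside delirium screening prompt with one nurse on one shift",
    purpose="test",
    questions=["Will the prompt be completed for every admission on the shift?"],
    predictions=["Yes, for at least 4 of 5 admissions"],
    plan="Charge nurse uses the prompt on Tuesday's day shift; IA collects the forms at handover",
)
```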

Long form:

Wave 28 Tested Form:

As we listen to our colleagues, let's add value!

• What could improve this PDSA cycle? Size? Complexity? Questions? Predictions? Plan?
• What did you learn that will help you with your future PDSA cycles?

Michelle C., Marie-Claire, and Penny

Michelle Cochlan – Perth and Kinross Council – Bedtime Reading Enrichment for Vulnerable Children
Marie-Claire Stallard – East Dunbartonshire Council – Preschool Referrals
Penny Bond – Healthcare Improvement Scotland – Identification and Management of Delirium

Slide 27 Copyright © Institute for Healthcare Improvement Building a

Cascade

of

Learning

What system are you trying to improve?
The key question, however, is: do you fully understand the complexity of these systems and which aspects of the system you want to improve?

You need to start drilling down from the macro, to the meso, to the micro levels and build a cascade. Most cascades start at the top and trickle downward.

A typical top-down cascade:

The Big Dots
Macrosystem – Board & CEO
Mesosystem – Sr VPs & VPs
Microsystem – Departments/Units/Wards/Service Lines; Staff/Patients

Which way does (should) your cascade flow?
Top down?

Spread from the Middle?

Bottom up?

IOM Chasm Report – Chain of Effect (it all starts with the patient)

1. Patient (start here)
2. Physician
3. Clinical Unit/Microsystem
4. Clinical Service Line/Mesosystem
5. Health System/Macrosystem

Information System Design Principle: capture data at the lowest level and aggregate it up to higher levels for cascading metrics throughout the system.
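As a hedged illustration of that principle in code (pandas assumed; the unit and division names and counts are invented), events are captured once at the unit (micro) level and only aggregated upward for the meso and macro dashboards, never collected separately at each level.

```python
# Illustrative only: capture events at the lowest (unit) level, then aggregate
# upward for meso- and macro-level views. Unit/division names are invented.
import pandas as pd

unit_level = pd.DataFrame({
    "division": ["Medicine", "Medicine", "Surgery", "Surgery"],
    "unit":     ["Ward 1",   "Ward 2",   "Theatre", "Ward 3"],
    "patients": [120, 95, 80, 110],
    "events":   [6, 3, 2, 5],   # e.g. falls, infections
})
unit_level["rate_per_100"] = 100 * unit_level["events"] / unit_level["patients"]

# Mesosystem view: aggregate units within each division.
meso = unit_level.groupby("division")[["patients", "events"]].sum()
meso["rate_per_100"] = 100 * meso["events"] / meso["patients"]

# Macrosystem view: one organisation-wide "big dot" for the board and CEO.
macro_rate = 100 * unit_level["events"].sum() / unit_level["patients"].sum()

print(unit_level)
print(meso)
print(f"Organisation-wide rate: {macro_rate:.1f} per 100 patients")
```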

Think about reversing the cascade!
Traditional pyramids vs. inverted pyramids

Adapted from the work of Dr. Gene Nelson, Dr. Paul Batalden and Marjorie Godfrey, Quality By Design: A Clinical Microsystems Approach, Jossey-Bass, 2007.

So, think about building an inverted pyramid

Start with the Little Dots
Level 1 – Micro: the patient and the provider of care
Level 2 – Meso: clinical units, departments and service lines
Level 3 – Macro
(Adapted from R. Lloyd & G. Nelson, 2007)

Building a Cascading Set of Driver Diagrams

• Review the Driver Diagram you just made to improve a particular outcome.
• Review the Secondary Drivers you identified on this initial Driver Diagram.
• Select one of the Secondary Drivers and make it the Outcome of your new Driver Diagram.
• Identify the Primary and Secondary Drivers of this new outcome.
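One way to keep a cascade of diagrams consistent is to treat each diagram as a small tree and promote a secondary driver from one diagram into the outcome of the next. The sketch below illustrates that step only; the class names are invented and the example content loosely echoes the peri-op harm example earlier in the deck.

```python
# Illustrative sketch: each driver diagram is a small tree, and a secondary
# driver from one diagram becomes the outcome of the next diagram in the
# cascade. Class names and example content are for illustration only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Driver:
    name: str
    secondary: List[str] = field(default_factory=list)

@dataclass
class DriverDiagram:
    outcome: str
    primary: List[Driver] = field(default_factory=list)

    def cascade_from(self, secondary_driver: str) -> "DriverDiagram":
        """Start the next diagram in the cascade with this secondary driver as its outcome."""
        return DriverDiagram(outcome=secondary_driver)

# Macro-level diagram
top = DriverDiagram(
    outcome="Reduce peri-operative harm by 30%",
    primary=[
        Driver("Infection prevention", ["SSI bundle compliance", "Hand hygiene"]),
        Driver("VTE prevention", ["DVT prophylaxis"]),
    ],
)

# Promote one secondary driver to become the outcome of the next-level diagram.
next_level = top.cascade_from("SSI bundle compliance")
next_level.primary.append(
    Driver("Reliable skin preparation", ["Use clippers instead of shaving"])
)
```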

Jönköping's System Level Cascade

Macrosystem – Nursing Services
Mesosystem – Nursing Divisions
Microsystem – Frontline Nursing Units
(Source: G. Henriks & Bojestig, Jönköping County Council, Sweden, 2008)

A Cascading Approach to Improvement

• Percent inpatient mortality
• Hospital-acquired infection rates
• Percent compliance with "bundles"
• VAP bundle + CL bundle + Pressure ulcer bundle + Hand washing bundle
(Hamad Medical Corporation)

Best Care Always Change Package: Critical Care Driver Diagram

[Driver diagram includes: Prevent VTE.]

Improving Care for Colon Cancer Patients

Goal/objectives: our promise to patients with colon cancer; good health care.
Primary effect ("What?") → Secondary effect ("How?"):
• Early detection – you begin adequate treatment within four weeks → Investigation/Treatment
• You are well informed and involved in the entire healthcare chain → Patient involvement; Investigation/Treatment
• The diagnosis and treatment with the "best method" is offered → Patient involvement; Multi-disciplinary collaboration
• Equally good palliative care is provided no matter the place of residence → Palliation
• The best possible health promotion measures and efficient screening program are offered → Prevention
• Interactive research approach in several parts of the project → Regional cancer center should prioritize patient-oriented research in oncology

Dialogue: Cascading Systems
• Does your organization approach improvement as an interrelated cascading system, or as a bunch of singular events that are unrelated and fragmented?

• Do senior managers and the Board or Governance (Non-Execs) regularly discuss how your systems of care are driven by many interrelated factors? Or do they approach issues of quality and safety as if one solution will produce better results?

• Does your organization have dashboards of measures that cascade from the macro, through the meso, and down to the micro levels?

• Do your measures cascade down from the top or percolate up from the places where patient care is actually delivered (the inverted pyramid)?

Prioritizing the Drivers

Limitations of resources, attention, and will usually mean that we cannot work on everything.

• Which drivers do we believe will deliver the biggest impact?
• Which ones will be easiest to work on? (Factors include personnel, culture, resources.)
• What is our current level of performance on these drivers?

© Richard Scoville & I.H.I.

What is your level of ambition?

[Diagram: complexity (y-axis) vs. scope (x-axis: pilot unit, department, institution, system). Multiple primary drivers at a significant system level require priority, a sponsor, and resources; one secondary driver at department level likewise requires priority, a sponsor, and resources. What level of ambition do you have for your project?]

Oral Health Treatment Planning & Execution

Clinic Project Caries Control Patient Sense of Urgency, (all active caries Acceptance of Protocol restored)

At OHC over 16 months, we will Ability/Willingness to Pay 1) increase the % of pts completing caries control within 2 month by X% and Population Management 2) decrease the % of “risk management” pts who need treatment for new caries by Y% (active pt = 18+ w/ >=1 visit in past 2 years, not withdrawn) Patient Self Management (hygiene & preven. Products)

Patient Diet

Risk Management Patient Education & Support (no active caries)

Risk assessment, communication of risk status

Source: Richard Scoville, Ph.D. Risk-based preventive care (cleaning, etc) 44 Timely restorative care for new caries Oral Health Care Prioritization

[Scatter plot: secondary drivers plotted by predicted impact (x-axis, lower → high) against process status (y-axis, process not defined → well defined): Prev Care, Risk Assess, Pt Ed, Self Mgmt, Timely Tx/Restore, Scheduling, Ability to Pay, Pt Involved, Popn Mgmt, Diet.]

What's the Status of This Driver/Process?
Level 0 – Driver is not defined or status is unknown.
Level 1 (approx. reliability 50%) – There is an informal understanding about the driver by some of the people who do the work. No widely recognized or formal written description of the driver.
Level 2 (approx. reliability 80%) – Driver is documented. The driver description includes all required participants (including families where appropriate). The driver is understood by all.
Level 3 (approx. reliability 90%) – The driver is well defined and enacted reliably. Quality measures are identified to monitor outcomes of the driver and may be in use by a few/some.
Level 4 (approx. reliability 95%) – Ongoing measures of the driver are monitored routinely by key stakeholders and used to improve the driver. Documentation is revised as the driver is improved.
Level 5 (approx. reliability 99%) – Driver outcomes are predictable. Drivers are fully embedded in operational systems. The driver consistently meets the needs and expectations of all families and/or providers.

What Is Its Predicted Impact?

Predicted impact levels:
Level 0 – This driver has no impact or does not apply to our system of care.
Level 1 – This driver has only minimal or indirect impact on patient services and outcomes.
Level 2 – This driver will improve services for our patients, but other drivers are more important.
Level 3 – This driver has significant impact on outcomes for our patients.
Level 4 – This driver is necessary for delivering patient services. It has a major, direct impact on the outcomes.
Level 5 – This driver is absolutely essential for achieving results. Improvement in this driver alone will have a direct, immediate impact on outcomes.

Results of OHC Prioritization

[Status-vs-impact scatter plot of the OHC drivers, as above.]

Results of OHC Prioritization
[The same scatter plot, annotated:] High-impact, not-well-defined processes are key targets for improvement!

Exercise: Prioritizing Drivers

• Use the Prioritizing Drivers Worksheet.

• Plot your secondary drivers on the grid based on your assessment of: (1) how well the process is defined, and (2) the level of impact that the driver can have.

• Discuss and select the drivers that are most important for improving your system of care.
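If you would rather draw the grid programmatically than on paper, a minimal sketch follows (matplotlib assumed; the driver names and scores are invented, loosely echoing the OHC example, and should be replaced with your own ratings from the worksheet).

```python
# Illustrative only: plot secondary drivers on the status-vs-impact grid used
# in the prioritisation exercise. Driver names and scores are invented.
import matplotlib.pyplot as plt

drivers = {
    # name: (predicted impact 0-5, process status 0-5)
    "Preventive care":         (4.5, 3.5),
    "Risk assessment":         (4.0, 3.0),
    "Patient education":       (3.5, 2.5),
    "Timely restorative care": (4.5, 2.0),
    "Population management":   (3.0, 1.0),
}

fig, ax = plt.subplots(figsize=(7, 5))
for name, (impact, status) in drivers.items():
    ax.scatter(impact, status)
    ax.annotate(name, (impact, status), textcoords="offset points", xytext=(5, 5))

ax.set_xlabel("Predicted impact (lower -> higher)")
ax.set_ylabel("Process status (not defined -> well defined)")
ax.set_title("High-impact, poorly defined processes are key improvement targets")
plt.show()
```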

Prioritizing Drivers Worksheet
[Status-vs-impact grid (as in the OHC example) for plotting your own secondary drivers: x-axis, predicted impact (lower → high); y-axis, process status (not defined → well defined).]

Churchill on Learning: making predictions slows up the "hurry"

“Men occasionally stumble over the truth, but most of them pick themselves up and hurry off as if nothing has happened.” --Winston Churchill

Taken from: http://home.att.net/~quotesexchange/sirwinstonchurchill.html

Your Questions & Comments

You can speak them, "chat" them, or raise your hand… just please do ask them!
