
Breaking Down Barriers with Master Data Management and Data Governance
Session #249 | March 8, 2018 | 1:00 PM – 2:00 PM
Kim Jackson, Vice President, Strategy, Products and Governance, Providence St. Joseph Health

Conflict of Interest

Kim Jackson has no real or apparent conflicts of interest to report.

Agenda
• Providence St. Joseph Health Overview
• Creating Funding for Data Governance and Master Data Management
• Presenting a Clear ROI to Support Funding for Data Governance and Master Data Management
• Challenges Facing a Successful Program

Learning Objectives
• Define the stages of a master data management and data governance program
• Illustrate key use cases and metrics to prove hard ROI for engagement and funding
• Evaluate some of the challenges and barriers encountered throughout the data governance process

About Us

Data Governance Mission

Developing TRUST in data to run a business that is consistent, timely and integrated

• Increase TRUST in the data
• Enhance data quality
• Master key data domains
• Encourage more, not less, data access
• Create a culture of healthy data
• Resolve analytic priorities
• Establish a foundation for future information-based initiatives and innovations
• Establish standards for master and reference data
• Campaign for data literacy

Holistic Approach

• People and Technology Without Process: Automated Chaos and Confusion; Poor Service Delivered
• Process and Technology Without People: Alienation and Turnover; Underutilized Systems
• People and Process Without Technology: Frustration and Inefficiency; High Cost of Operation

PROCESS

Monitor Data Quality • Standardize Data • Manage Data Quality • Automate & Fix • Collaborate • Manage Process

On-going Process

Data Quality Management (People, Process, Technology)

• Establish enterprise process, policy and technology architecture to manage business-critical data quality
• Support internal and external data synchronization across the organization
• Establish data quality standards and an escalation process
• Establish a data quality report card
• Ensure data is created as close to the source as possible
• Leverage industry data governance best practice to manage data quality
• Utilize technology to improve and manage data quality
• Establish a value-based roadmap aligned with Analytic Governance Council strategy

Create Funding for Data Governance and Master Data Management — Call for Action

• Highlight pain points
• Voice of the Customer
• Determine the Real ROI
• Vision and Goals
• Road map
• Funded Program

Call for Action — Reason for Action

Data governance is a set of processes that ensures that important data assets are formally managed throughout the enterprise.

1. Need for more timely, accessible, consistent, high-quality information for:
• Improving reimbursement and cost efficiency
• Improving quality of care and patient safety (e.g., reduce readmissions)
• Supporting at-risk payment models/ACO participation

2. Business Case:
• Support strategic initiatives – foundational IS component to support Portals, Big Data/Population Health, EMR regional efforts, HIE
• Reduce time spent extracting, merging, validating, and cleaning data and refocus those FTEs on analyzing and using data
• Improve timeliness and quality of information for strategic & operational decision making and performance improvement

3. In Scope:
• Identify data governance needs (e.g., data quality, master data management, metadata management) to support Information Sophistication initiatives
• Inpatient, Ambulatory, Home Health, Community Health Ministries
• Propose tactical recommendations and implementation roadmap

Out of Scope:
• IT Governance; implementation (this will be a subsequent phase)

Call for Action — Voice of the Customer

Key Themes:

• Significant barriers to accessing information
• Lack of data integration – need to access multiple systems
• Significant manual effort required to get needed data
• Lack of trust in the data
• Need for more timely data
• Need for standardization
• Lack of accountability
• Difficult to turn data into actionable information

Call for Action — Create Vision and Goals

Vision:
• Establish processes that ensure data assets are effectively managed across the enterprise
• Develop standards, policies, processes, guidelines, framework and technology for management, use, distribution and protection of enterprise data, and compliance with standards and policies
• Standardize and oversee how data is processed, prioritized, and continually verified
• Establish a foundation for Analytic Services and future information-based initiatives and innovations

Goals:
• Provide data that is:
  – Consistent – one enterprise-wide definition
  – Accurate – correctly represents the underlying process
  – Timely – available to authorized users when needed
  – Accessible – easy to find, understand and use
  – Actionable – relevant to business decisions
• Provide knowledge, skills, and tools to enable data-driven decision making at multiple levels of the organization

Call for Action — Pain Points

Projects the team listed as not completable, but could say "Yes" to if they had governance tools:

EIM Module: Data Integration
• Shorten development/testing of extract/integration programs so the DW and other teams can spend less time developing, more time analyzing
• EIM is to Batch as HL7 is to Clinical
Example: PICIS integration to NHSN; integration to EPSI and other 3PA systems

EIM Module: Master Data Management
• Integrate med staff and community provider data from IP & Amb credentialing systems, MEDITECH instances, Allscripts, and Allscripts Home Health to Portal and Explorys
• Integrate the list of payors/payor types across contracting, MEDITECH instances, IDX, and Allscripts Home Health
Example: Managing IPCs is a manual process where an Excel list is sent to each Org, HH, IDX, etc., hoping they will publish a "golden record" to be integrated

EIM Module: Data Quality
• Publish a dashboard of data quality metrics to proactively identify and resolve problems
• Set up business rules to identify when the number of encounters changes by X% at any location and alert the data steward of the potential issue
Example: 1) CPT, POA, complication rates – providers' documentation? 2) Admission types, charge cost ratios – did staff complete charges? 3) Reduction in RAD reports – interface breaks

Call for Action — Pain Points (continued)

EIM Module: Metadata Management
• Provide a glossary to business users with metric/field definitions and lineage showing the source
• Do an impact analysis of the downstream impact to Amalga and reports before making a change to a MEDITECH field
Example: Measures: Readmit, CPOE, MU, Central Line Days, NHSN, Pressure Ulcer Progression, Census, Surgical Procedures; Algorithms: LACE, Adjusted Pt Days; DIAG Groups: CHF, SEPSIS, AMI, Chief Complaint, Surgery Pts, Obs Pts

EIM Module: Terminology/Ontology Services
• Translate ICD-9 to ICD-10 codes to plan for the transition and for retrospective reporting of trends after the changeover
• Translate between LOINC, SNOMED, CPT, and ICD codes for regulatory & quality reporting
Example: Glucose tests, Critical Care Day, Lactate, Antibiotics, pressure ulcers, Social Risk Factors, Positive Culture, OR Charges, OBS ADMIT Order
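The ICD-9-to-ICD-10 translation above boils down to normalizing every encounter onto one code axis so trends span the cutover. A minimal sketch, assuming a tiny crosswalk (a real service would use the full CMS GEM mapping files):

```python
# Two commonly cited crosswalk entries, for illustration only.
ICD9_TO_ICD10 = {
    "250.00": "E11.9",   # type 2 diabetes without complications
    "995.91": "A41.9",   # sepsis, unspecified organism
}

def normalize_code(code: str) -> str:
    """Report all encounters on the ICD-10 axis; ICD-10 codes pass through."""
    return ICD9_TO_ICD10.get(code, code)

encounters = ["250.00", "E11.9", "995.91"]
print([normalize_code(c) for c in encounters])  # ['E11.9', 'E11.9', 'A41.9']
```

The same pattern extends to LOINC/SNOMED/CPT translation: a governed lookup table owned centrally, referenced by every report.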

Call for Action — Road Map

Most Common Road Map: Oversight Committee → MDM/Reference Data → KPI Standardization/Data Glossary

ROI-Driven Road Map: KPI Standardization for Analytics → Data Glossary/Skills Assessment for Reporting → Data Literacy → Prioritize Reporting Demand → Master Data (Prioritize Domains, Expand Reach) → Oversight Committee

Road Map

Phase 0 (FY13–14): Define scope, strategy, funding
• Interviews
• Strategy
• Tool set selection
• Defined requirements for first two projects
• Proposed org structure and process
• Documented current-state information architecture
• Data Literacy

Phase 1 (FY15): Scope of data problems to be solved is narrow; DG can't be solely a "complaint forum"
• Focused pilots (JIT involvement of business)
• Narrow scope: Provider/EMPI
• Technical infrastructure
• Lean exec oversight – IT/Strategy owner
• Global codes in EDWs
• Data Literacy

Phase 2 (FY16): As scope broadens, exec oversight is required to make prioritization decisions and enforce accountability
• Operational rollout
• Streamlined Global Codes (Reference Data)
• Broader scope within the same data domains; downstream consumption
• Business assumes data stewardship
• Slightly broader exec oversight – IT and Data Owners
• Data Literacy
• Data Glossary

Phase 3 (Mature): Data governance becomes embedded into operational processes
• DG is part of business operations
• Reference Data across all analytics/reporting
• Additional data domains and problems
• Mature DG organization (with Analytics Governing Council, Data Standards workgroup)
• Data Literacy

Data Literacy

Analytics 30 Program: Domain Training and Documentation
• 2x per month, 30-minute education webinars
• Each analyst required to complete 4 per year – part of performance review
  – Documentation
  – Engagement with the knowledge-worker community
  – Increased adoption
  – Expanding knowledge base
  – Career growth (speaking, data literacy, and storytelling)

Various subjects (100+ to date):
• New Solution Review (Domain Solution Lists)
• Lens of the Consumer
• New data sources, dashboards
• Data domains & data use
• Special topics: complicated report review (bundle upgrades, ICD-10, MU, etc. compliance)

Reference Data "Global Codes"

Report Request Process: Report Request → Spec Design → Report Build

• Report Request: "I would like a report of Sepsis patients and timeliness of Admit Orders"
• Spec Design: What do you mean by Sepsis? What Orders do you want to include?
• Report Build: Hard-code Sepsis ICD-10 codes; hard-code Admit Order codes


Why do we ask the end user to define data cohorts? Most users:
• Don't know the answer
• Don't want to review 1000s of codes to determine their version
• Are looking for our expertise

Reference Data "Global Codes"

Report Request Process (with reference data): Report Request → Spec Design → Report Build

• Report Request: "I would like a report of Sepsis patients and timeliness of Admit Orders"
• Spec Design: Review the standard Sepsis cohort; reference the standard Admit Order set
• Report Build: Reference the standard cohort and order set

Start with Standard Data Cohorts
• This can start from current known best practice:
  – National and state standard code sets
  – Commonly used definitions from local knowledge workers
• Create visibility and transition ownership of definitions to the business

Reference data can be created without data governance tools
• Started by adding fields in the EDW, with a .NET web page for data owners to map new data values
• Later, move reference data to a data governance platform and send it to downstream applications
• Expanding adoption and reach matters more than which platform is used

Goal: create trust, reduce variation, speed delivery
Measure: number of analytics solutions using referenced standards
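The global-codes idea above can be sketched as a governed, named cohort that every report references instead of hard-coding its own list. The code lists below are truncated examples, not the governed definitions (the Admit Order codes in particular are hypothetical).

```python
# "Global codes": one centrally maintained code set per named cohort.
GLOBAL_CODES = {
    "SEPSIS_COHORT": {"A41.9", "A41.51", "R65.20", "R65.21"},  # sample ICD-10 sepsis codes
    "ADMIT_ORDERS": {"ADT01", "ADT04"},                        # hypothetical order codes
}

def in_cohort(diagnosis_codes, cohort_name):
    """True when any of the encounter's codes falls in the named cohort."""
    return bool(set(diagnosis_codes) & GLOBAL_CODES[cohort_name])

print(in_cohort(["R65.21", "I10"], "SEPSIS_COHORT"))  # True
```

When the governed definition changes, every downstream report picks up the new cohort automatically — the "reduce variation, speed delivery" goal stated above.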

Identify the "Call for Action/ROI" — Patient Identity

Clean Up ($100 per account):
• 1.2M duplicates – ETA 36 weeks to clean up duplicates, linkages and wrong SSNs
• ~7.5M total / ~250K per year

Clean-Up $ – Delayed/Denied Claims:
• $6.4 million in accrual to place unknown orphaned payments

TAT – Duplicate/Delayed Tests:
• 19% of accounts are duplicated
• 2% have orders
• 31% are completed

Clinical – Medical Errors:
• Metric TBD
• Overlays can result in procedures on the wrong patient
• Missed allergies can cause medication reactions

[Diagram: registration process and its downstream impact across source systems]

Patient Identity – Close the Loop: Data Quality Scorecard & Operational Dashboard

Inputs: patient information entered by Registration staff in source systems (e.g., name, address, SSN)
Processing: the Enterprise Master Patient Index (EMPI) tool looks up patient records to determine potential matches; the tool automatically merges like records and queues potential issues
Outputs: potential overlays (potentially deadly); potential duplicates (costly to clean up)

Data Quality Scorecard: provides data quality health for completeness, conformance, and validity of key patient attributes (First Name, Last Name, SSN, etc.)
Operational Dashboard: provides info on overlays created, duplicates created/resolved, aging of unresolved items, and other Registration staff operational metrics
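As a rough illustration of the EMPI match-and-triage step described above — not the actual vendor algorithm, which is typically probabilistic — a deterministic score-and-threshold sketch with made-up weights:

```python
# Hypothetical matching weights and thresholds, for illustration only.
def match_score(a, b):
    score = 0
    if a["ssn"] and a["ssn"] == b["ssn"]:
        score += 50
    if a["last_name"].lower() == b["last_name"].lower():
        score += 25
    if a["first_name"].lower() == b["first_name"].lower():
        score += 15
    if a["dob"] == b["dob"]:
        score += 10
    return score

def triage(new_rec, index, auto=90, review=60):
    """Return (action, record) for the best candidate match."""
    best = max(index, key=lambda r: match_score(new_rec, r), default=None)
    s = match_score(new_rec, best) if best else 0
    if s >= auto:
        return ("merge", best)   # confident match: auto-merge records
    if s >= review:
        return ("queue", best)   # potential duplicate: steward review
    return ("create", None)      # treat as a new patient record

index = [{"ssn": "123-45-6789", "last_name": "Smith",
          "first_name": "Jan", "dob": "1970-01-01"}]
new_rec = {"ssn": "123-45-6789", "last_name": "Smith",
           "first_name": "Janet", "dob": "1970-01-01"}
print(triage(new_rec, index)[0])  # "queue" — near match goes to review
```

The auto-merge and review queues feed the duplicate counts on the Operational Dashboard.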

Goal: improve patient outcomes, increase patient safety
Measure of Success: reduce the number of potential overlays and duplicates created by Registration staff

Accomplishment: Patient Identity – Inputs

Accomplishment – Patient Identity

Average clean-up cost per duplicate account is $100 !!!

 eMPI currently integrated with 18+ sources across ministries, with 10 million records
 Enterprise rollout of the duplicate dashboard to HIM, PAS and other critical business areas
 86% decrease in duplicates; stable duplicate reduction over the last 2 years
 Enterprise policies implemented
 Consensus reached on a 10% duplicate-reduction target
 Goal is to reach industry best-in-class: < 1%

Identify the "Call for Action/ROI" — Provider

Clean Up: Duplicates Compliance: Failure to Workflow: Orders and Attribution: Wrong MD Attached to Decreased Trust in Data Sign Orders Results go to wrong MD Pt (Quality and Break the Glass)

21% of providers have at 10% H&P not Signed .9% MD’s had 1.3% orders >2700 monthly visits with least one error > 2600 in 6 months invalid Attending MD

Credentialing Downstream P / Dictionary Impact A P P S R T R A U E I O Y P E F V O P N E T I R L D Y R I E\ D E D N E R O C N M H C T A A E I I I T Y N N 25 Provider Data Quality Error Volumes

Provider Quality Score

Best Version of Truth (BVT) Score:
If at least one attribute value passes the DQ test, the attribute gets a 100% score. Not all attribute values are required to pass.

Golden Record (GR) Score:
ALL attributes, and every value, must pass to attain a 100% attribute score.

[Dashboard gauges: scores of 95%, target 90+]
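The two scoring rules above can be expressed directly. This sketch assumes a per-attribute list of source-system values and a boolean DQ test; the NPI check used as the example test is a simplified stand-in (length/digits only, no check digit).

```python
def bvt_score(attribute_values, passes):
    """Best Version of Truth: 100 if ANY value passes the DQ test."""
    return 100 if any(passes(v) for v in attribute_values) else 0

def gr_score(attribute_values, passes):
    """Golden Record: 100 only when EVERY value passes the DQ test."""
    return 100 if all(passes(v) for v in attribute_values) else 0

# Example: three source systems supply an NPI for the same provider;
# the simplified DQ test checks for a 10-digit number.
npi_values = ["1234567893", "12345", "1234567893"]
valid_npi = lambda v: v.isdigit() and len(v) == 10

print(bvt_score(npi_values, valid_npi))  # 100 — one good value suffices
print(gr_score(npi_values, valid_npi))   # 0 — "12345" fails the test
```

The gap between the two scores shows why a Golden Record target is the stricter bar: one bad source value drags the GR score down even when a best version of truth exists.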

KPI Standardization

• Standard Metrics – assembled from governed variable pieces:
  – One piece: screening …
  – One piece: ACO population …
  – One piece: age group …
  – Remove: diagnosis …
  – Remove: hospice patients …
  = Cancer Screening Metric
• Prepackage Standard Metrics – Operating Commitments, MSSP

Prior work in Reference Data enables and jump-starts KPI Standardization

Business Glossary / Search Engine

Common Pitfalls:
• Starting too early: knowledge workers spend valuable time entering data definitions that are rarely looked at
• Expanding too fast: when staff enter too much metadata manually, knowledge workers get fatigued and disengage when adoption is low
• Heavy dependence on manual data entry: a manual workflow, even for reference data, is not sustainable

ROI Approach:
• Create a window into reference data: a view of reference data lets users understand cohorts without creating additional work; by ensuring reference data is used in operational reporting, system build changes are sure to be incorporated
• Document strategic and broadly used KPIs only – those that have been standardized
• Ensure the tool is linked to analytics solutions
• Automate the report inventory list/elements and audit logs: most reporting platforms have metadata feeds that enable report search functions

Accountability and Customer Responsibilities

Goal: create methods to determine customer buy-in
Measure of Success: reduction in the number of reports run 5 or fewer times

Tools:
• Search Tool – allows users and the tech team to determine:
  – Use of analytics
  – Availability – menus
  – Metadata – fields, formulas, limited lineage
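The "reports run 5 or fewer times" measure can be computed directly from report audit logs like those the search tool exposes. A minimal sketch; the log record format is a hypothetical assumption.

```python
from collections import Counter

def low_use_reports(audit_log, max_runs=5):
    """Reports run max_runs or fewer times: candidates for retirement."""
    runs = Counter(entry["report"] for entry in audit_log)
    return sorted(r for r, n in runs.items() if n <= max_runs)

audit_log = [
    {"report": "Daily Census", "user": "a"},
    {"report": "Daily Census", "user": "b"},
    {"report": "Sepsis Timeliness", "user": "a"},
] + [{"report": "Daily Census", "user": "c"}] * 6

print(low_use_reports(audit_log))  # ['Sepsis Timeliness']
```

Tracking this list over time gives the reduction measure stated above, and flags content to consolidate before any new request is built.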

On our intranet: by user, by analytic, last run date
• Adding more all the time; programmed vs. user-maintained
• Before any new request is completed, a search of current content and usage is required

myHiway….. for a unified experience

Components: Report List and Content • Catalog • Training • User Audit Logs • Meta Data • Metric Definitions • Apps

Data Governance Overview and Functional Description

OVERSIGHT
• Analytics Governance Council – senior-level group that can prioritize across DG projects; new Exec Sponsor to come from the Business; DGC serves as the forum for escalation

OWN
• Data Owners – overall accountability for data in his or her subject area; accountable for data quality, adherence to standards, and access to information
• Data Steward/Business Liaison – knowledgeable and respected in the business domain; originally reports to the DGM, could eventually report to the business

CONSULT
• Work groups (Policy, Standards, etc.) – focus groups or collaboratives; subgroups and pilots formed as needed; form a network of data, application, and analytics stewards
• Application Stewards and Analytics Stewards – Analytics Stewards know the rules surrounding consumption; Application Stewards understand the source systems

DATA SUPPORT
• Data Governance Manager/EIM IT team – policy and process support and the technical work required to support data governance objectives; a dedicated team of analysts and developers using Informatica tools

DG Steering Committee and Collaborative Owners

Data Governance – EIM Program
• Analytics Governing Council
• Data Governance IT support team: DQ | MDM | ETL | Metadata | Policy
• Data Domain Owners + Data Advisors
• Four collaboratives, each with policy, standards, and analytics workgroups plus data, application, and analytics stewards:
  – Patient Collaborative – Data Owner: HIM/RCS
  – Provider Collaborative – Data Owner: CMIO
  – Payer Collaborative – Data Owner: VP RCS
  – Reference Data Collaborative – Data Owner: Manager, KPI

Analytics Governing Council Charter

• Partner with all analytics teams across the enterprise to prioritize and align analytics initiatives with strategic and operational priorities
  – Guide decisions on the Analytics Portfolio (projects and solutions)
  – Prioritize initiatives for the Analytics Roadmap
  – Scope: analytics initiatives with cross-disciplinary uses and/or cross-regional scope, and/or that support PSJH strategic and operational goals
  – Initial focus: initiatives requiring HI &/or CCPH resources
• Consult on and provide direction for key analytics initiatives
  – Provide visibility into analytics initiatives across the enterprise organization – what is available, status of in-flight initiatives
  – Connect resources from various initiatives when there are opportunities for collaboration or efficiencies
  – Provide an escalation point for resource-contention concerns
  – Advise on addressing organizational barriers to position initiatives for success
• Provide direction and oversight of a comprehensive Information Governance Program
  – Pending merger, guide data & analytics strategy and structure
  – Advocate for data quality and use of capabilities in the analytics information management portfolio
  – Oversight of the Data Standards Workgroup (DSW), tasked with definition of data owners and information lifecycle management
  – Oversight of Master and Reference Data Management & Data Quality Management
  – Connect analytics with DSW for support on enterprise data standards work, including dissemination/adoption of established data standards
  – Provide updates on current Master Data Management programs and their expansion
• Communicate with peers and staff about data and analytics
• Communicate the AGC's Charter and progress to various governance bodies
• Define and track metrics of success (KPIs) for this Council

Lessons Learned

MDM:
• Access to source systems was hard to get (e.g., provider credentialing)
• Education in business terms took time and various methods
• The gap between executive leadership goals and commitment vs. middle-management tactical priorities varied from large to small
• SMEs were missing in the payor domain in particular

Governance:
• Keep information at the forefront for all front-line areas (registration, Med Staff office)
• Continual education is required for all data stewards and front-line staff
• Keep key stakeholders involved and check in on any perceived issues; track continual adoption and effectiveness
• This is not a one-time effort; this is ongoing operations

KPI Standardization:
• Metric governance requires resilience
• Leadership support is needed for resistant staff
• Sleuthing skills are needed to find community requirements and remove duplicate processes
• Usage tracking and the catalog made a huge difference in leadership support

Questions

Kim Jackson | Vice President, Strategy, Products and Governance
Providence St. Joseph Health
1515 E. Orangewood Avenue, Anaheim, CA 92805-6824
[email protected]
