Opportunity Identification / Project Selection
▪ Project Charter Preparation
▪ Team Selection
▪ Project Leadership Roles & Responsibilities
▪ High-Level Process Mapping
▪ Voice of the Customer Identification and Analysis
▪ Cost of Poor Quality Analysis

Champion Define Phase:
◦ Value Stream Map
◦ Project Selection Matrix
◦ Project Charter
◦ Stakeholder Analysis
◦ High-level process map
◦ Project plan

Team Define Phase:
◦ SIPOC
◦ Voice of the Customer Plan
◦ SWOT Analysis
◦ Critical to Quality Tree
◦ Cost of Poor Quality Analysis

Six Sigma Process Improvement Road Map
(Phase | Objectives | Key Activities | Possible Tools and Techniques | Key Deliverables)

1.0 Define
Objective: Document the problem statement and establish the charter. Demonstrate alignment with the business organizational metrics and strategies. Determine customer requirements and performance standards.
Key Activities:
▪ Select team with Champion
▪ Develop problem statement
▪ Develop charter
▪ Create SIPOC
▪ Address gap between VOC and process
▪ Estimate financial benefits
▪ Obtain customer requirements
Key Deliverables:
▪ Problem statement
▪ Project Charter
▪ SIPOC map
▪ COPQ or CODND analysis
▪ Communication plan
▪ Customer migration survey results

Example – e-Pro Migration Process Improvement Project Charter
Project Description: Error corrections and clarification of benefits are generating rework throughout the migration and case installation processes, accounting for 20% of the total number of e-Pro change transactions. It is estimated that the volume of error and rework will grow proportionally as the number of accounts migrating by 1/1/2004 increases, driving a proportionate increase in cost and potentially dissatisfying customers.
Start Date: April 1, 2003
Completion Date: Scheduled to be completed by September 5, 2003
Baseline Metrics (for 1/1/03 migrated accounts):
◦ National Accounts – average number of change transactions: 14.3, of which 2.9 are due to error and rework; average hours of rework: 309
◦ Regional Accounts – average number of change transactions: 8.0, of which 1.6 are due to error and rework; average hours of rework: 137
Primary Metrics:
1. Total e-Pro change transactions
2. Percentage of change transactions due to error and benefits clarification
3. Average rework hours per error and benefits clarification
Secondary Metrics: none
Goal: Reduce error and rework in the migration process by 50%, starting with 1/1/04 migrating accounts.

SIPOC Map – Migration Process
Suppliers: Sales; Client / Policy Holder; HR Benefits Coordinator; Third Party Benefits Vendor; Client Consultant; Account Providers
Inputs: GO decision; client/policy holder account data; policy renewal date; summary of benefits; member and dependent eligibility requirements; account organizational structure; benefits detail
Process: 1. Conduct migration analysis; 2. Complete account profile; 3. Load account structure in system; 4. Set up and validate account benefits in system; 5. Produce account eligibility record; 6. Load account data in product claim engines
Outputs: policy loaded in system; member and dependent information loaded in system; account eligibility record; member ID card
Customers: Client / Policy Holder; Member; Claim; Call (customer service)

Current-State Process Map (implementation structure, benefits, and eligibility set-up): the as-is map traces the migration run from client ID set-up and structure review in the TS tool through drafting the e-PRO structure, class codes, and benefits guide, loading and testing data in e-PRO, matching and merging enrollment data against legacy eligibility, and releasing member data for implementation. Rework loops are highlighted in red.
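The charter's 20% figure follows directly from the baseline metrics. A quick check (a minimal sketch in Python, using only numbers stated in the charter):

```python
# Baseline metrics for 1/1/03 migrated accounts (from the project charter).
national_total, national_rework = 14.3, 2.9   # avg change transactions per account
regional_total, regional_rework = 8.0, 1.6

# Share of change transactions caused by error and benefits clarification.
national_share = national_rework / national_total   # ~0.203, i.e. about 20%
regional_share = regional_rework / regional_total   # 0.20 exactly

# Goal: cut error and rework 50% for accounts migrating 1/1/04.
target_national_rework = national_rework * 0.5      # 1.45 transactions per account

print(round(national_share, 3), round(regional_share, 2), target_national_rework)
```

Both account segments sit right at the stated 20%, which is why the goal is expressed as a 50% reduction in that error-and-rework share.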
Process map notes: rework loops (inspection/verification, error-fix, and redo decision points) are highlighted in red; processes shaded in green are specific to migration; Employer Services functional-area processes are out of scope but critical to the success of the migration. Released member data flows to claim engines and downstream systems and vendors (e.g., ATC, CAIP, DocGen).

CODND (Cost of Doing Nothing Differently):
◦ 4th Qtr 2003: $500K
◦ Year 2004: $2.5M
◦ Internal productivity: estimated cycle-time reduction of 18,868 hours (assuming 195 accounts migrating 1/1/04)

Control chart – Total e-Pro Change Transactions by Account, Sep 2002 through Mar 2003 (individuals and moving-range chart): Mean = 10.98, UCL = 26.81, LCL = -4.854; moving range: R-bar = 5.952, UCL = 19.45, LCL = 0.

2.0 Measure
Objective: Develop a reliable and valid measurement system of the business process to effectively evaluate the metrics' success in meeting customer requirements.
Key Activities:
▪ Create overall project plan
▪ Develop measurement and collection plan
▪ Consider Lean tools
▪ Determine defect tracking
▪ Assess baseline performance – estimate process capability
Possible Tools and Techniques:
▪ Project plan & timeline
▪ Metrics
▪ Process model – 'as is'
▪ Process capability analysis
▪ Lean tools assessment
▪ Skewness
▪ Measurement Systems Analysis (Gage R&R, Average & Range Method; see http://www.aiag.org/ and http://www.qimacros.com/free-lean-six-sigma-tips/aiag-msa-gage-r&r.html)
Key Deliverables:
▪ Baseline performance results

Project Plan Milestones:
◦ Define: April 1 – April 21, 2003
◦ Plan projects & metrics: April 14 – April 18, 2003
◦ Baseline: April 21 – May 2, 2003
◦ Consider Lean tools: May 12 – May 16, 2003
◦ MSA: May 19 – June 2, 2003
◦ Wisdom of the organization: June 2 – June 6, 2003
◦ Passive analysis: June 9 – June 20, 2003
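The control-chart limits quoted for the e-Pro transaction data are consistent with a standard individuals/moving-range (XmR) chart. A sketch, assuming the usual subgroup-of-2 constants (2.66 for the individuals limits, 3.267 for the moving-range UCL), reproduces them from the process mean and average moving range:

```python
# Individuals & moving-range (XmR) chart limits, using the standard
# d2-derived constants for a moving range of 2 (E2 = 2.66, D4 = 3.267).
mean = 10.98    # average e-Pro change transactions per account (Sep 02 - Mar 03)
mr_bar = 5.952  # average moving range (R-bar on the chart)

ucl_x = mean + 2.66 * mr_bar   # ~26.81, matches the chart's UCL
lcl_x = mean - 2.66 * mr_bar   # ~-4.85, matches the chart's LCL
ucl_mr = 3.267 * mr_bar        # ~19.45, moving-range chart UCL
lcl_mr = 0.0                   # moving-range LCL is bounded at zero

print(round(ucl_x, 2), round(lcl_x, 2), round(ucl_mr, 2))
```

Note the negative LCL is an artifact of the formula; since transaction counts cannot be negative, the lower limit is effectively zero in practice.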
Gage R&R – Average & Range Method (example worksheet: 2 appraisers, 2 trials, data entered for 2 parts of a 10-part layout):
◦ Appraiser 1 – Trial 1: 0.65, 1.00; Trial 2: 0.60, 1.00; averages: 0.625, 1.00; ranges: 0.05, 0.00
◦ Appraiser 2 – Trial 1: 0.55, 1.05; Trial 2: 0.55, 0.95; averages: 0.55, 1.00; ranges: 0.00, 0.10
Results (% of total variation, with the worksheet's second percentage column in parentheses):
◦ EV (Equipment Variation) = 0.0332; %EV = 11.3% (39.9%)
◦ AV (Appraiser Variation) = 0.02066; %AV = 7.0% (24.8%)
◦ R&R (Gage Capability) = 0.0391; %R&R = 13.3% (47.0%)
◦ PV (Part Variation) = 0.2917; %PV = 99.1% (350%)
◦ NDC (number of distinct categories) = 11

3.0 Analyze
Objective: Utilize data analysis techniques to gain insight into the process. Divide data into groups based on key characteristics and assess the root causes of errors and poor performance. Determine where to focus efforts for improvement.
Key Activities:
▪ Examine data relationships
▪ Prioritize sources of variation
▪ Describe findings – identify potential root causes
▪ Identify & communicate potential root causes
Possible Tools and Techniques:
▪ FMEA
▪ Pareto chart
▪ Correlation/regression
▪ Fishbone diagram
▪ Box plot
▪ Hypothesis testing
▪ Statistical tests / tools
Key Deliverables:
▪ Validated key process input variables (KPIVs) and key process output variables (KPOVs)
▪ Root causes (RCA)

Fishbone Diagram – Sources of Claim Error and Rework
Technology:
▪ System error during processing
▪ Data fallout
▪ Authorization mis-match
▪ System restrictions – LPI manual calc
▪ Data set-up issues (eligibility, provider, benefits)
▪ Timeliness of batch processing
▪ Bank account set-up delays
▪ Customer touchpoint delays
▪ Provider mis-match
▪ Transaction limitations on data collected at gateway
▪ Transaction/codeset data excluded at gateway
▪ iTrack – drives usage of paper reports to sort older claims
People:
▪ Skill level of processor
▪ Accessibility of site coach/training staff
▪ Aggressive productivity goals conflict with quality requirements
▪ Rushed training schedule
▪ Lack of up-training / reinforcement training
▪ Best practice / skill training not conducted
▪ OJT training on SOP usage
▪ Inappropriate assignment or missing hold codes
Process:
▪ Standard operating procedures (SOPs)
▪ Claim audit process >$5K
▪ Second/third party internal review (Medical Management, Claim Benefit Build)
Information missing/incomplete/incorrect:
▪ Auth / referral info
▪ OI info
▪ Member eligibility info
▪ Benefit info
▪ Provider fee schedule info
▪ Provider/vendor TIN/SSN info
▪ Additional information necessary to process claim

A Pareto chart of PMHS overpayment root causes (e.g., Contracting) accompanies the fishbone diagram to prioritize improvement efforts.
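The Gage R&R worksheet's summary statistics hang together under the standard Average & Range formulas. A sketch reproducing them from the EV, AV, and PV values above (note the worksheet reports NDC = 11, which matches rounding; strict AIAG practice truncates, giving 10):

```python
import math

# Gage R&R summary values from the Average & Range worksheet above.
ev, av = 0.0332, 0.02066          # equipment & appraiser variation
grr = math.hypot(ev, av)          # gage R&R = sqrt(EV^2 + AV^2), ~0.0391
pv = 0.2917                       # part variation
tv = math.hypot(grr, pv)          # total variation = sqrt(R&R^2 + PV^2)

pct_rr = 100 * grr / tv           # ~13.3% of total variation
pct_pv = 100 * pv / tv            # ~99.1% of total variation
ndc = 1.41 * pv / grr             # ~10.5 distinct categories

print(round(pct_rr, 1), round(pct_pv, 1), round(ndc, 1))
```

A %R&R of 13.3% of total variation sits in the conditionally acceptable 10–30% band, which is why the measurement system was deemed adequate for baselining the rework metrics.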