MODEL RISK MANAGEMENT | SEPTEMBER 27-28, 2018 | BOSTON

© 2018 Darling Consulting Group, Inc. • 260 Merrimac Street • Newburyport, MA 01950 • Tel: 978.463.0400 • DarlingConsulting.com

The Faculty

Drew H. Boecher, Managing Director, Darling Consulting Group, Inc.
Sam Chen, Quantitative Consultant, Darling Consulting Group, Inc.
Jon Hill, former Global Head of Model Risk Governance, Credit Suisse
Brandon Blanchard, VP, Operational Risk Management, Commerce Bank

Joe Montalbano, Quantitative Consultant, Darling Consulting Group, Inc.
Michael R. Guglielmo, Managing Director, Darling Consulting Group, Inc.
Ray Brastow, Senior Financial Economist, Federal Reserve Bank of Richmond
Liming Brotcke, Quantitative Manager, Federal Reserve Bank of Chicago

Agenda – Day One

Time Topic

9:00 Next Level MRM [Drew Boecher]

9:45 Regulatory MRM Perspective [Panel: Mike Guglielmo, Ray Brastow, Liming Brotcke]

10:30 Break

10:45 Establishing a Model Risk Management Culture [Brandon Blanchard and Mike Guglielmo]

12:00 Lunch

1:00 Lifecycle of a Model [Brandon Blanchard and Mike Guglielmo]

2:00 Break

2:15 Model Inventory Management [Mike Guglielmo & Jonathan Hill]

3:15 Break

3:30 Managing Inventory Risk: “Should a Model ‘Know’ Its Own ID?” [Jonathan Hill]

4:30 Assessing Model Risk In The Aggregate [Ray Brastow & Liming Brotcke]

5:15 Q&A and Open Discussion

Agenda – Day Two

Time Topic

9:00 New Era of Data Management [Joe Montalbano]

10:00 Break

10:15 Case Study: Validation of Statistical Models [Joe Montalbano]

11:15 Case Study: Validation of Non-Statistical / Non-Complex Models [Sam Chen]

12:00 Lunch

1:00 Case Study: Validation of Compliance / Data-Driven Models [Brandon Blanchard & Mike Guglielmo]

1:30 Case Study: Validation of Vendor / “Black-Box” Models [Sam Chen]

2:00 Break

2:15 Follow-Up to Validations: The Validation Is Complete – Now What? [Sam Chen & Drew Boecher]

3:00 Break

3:15 The Future State of MRM - Adaptable, Efficient And Effective [Mike Guglielmo & Jonathan Hill]

4:00 Q&A and Open Discussion

Next Level Model Risk Management CFP Masterclass

Day 1 | 9:00 am

Drew Boecher, Managing Director, DCG

Model Risk Management Guides Hudson River Rafting Company…

Next Level MRM

A Industry Trends & the Regulatory Pendulum: US Banking Industry Themes, Regulatory Pendulum, Model Proliferation

B Contemplating the Future of Model Risk Management: SWOT, Regulatory Approaches, Possibilities, Challenges, 3 Lines of Defense, Validator Perspective

C Workshop Preview & Motivation: MRM Culture, Model Lifecycle, Model Inventory, Aggregate Risk, Data, Validations, Future of MRM

US Banking Industry Themes

Intensely Competitive Industry

u Consolidating Industry

                 2018     2008     1998     1990
  Commercial    4,880    7,076    8,774   12,343
  Savings         726    1,229    1,690    2,815
  Total Banks   5,606    8,305   10,464   15,158

u Concentrated Industry

                            3/31/2018   12/31/2008   12/31/2001
  FDIC Insured Banks            5,606        8,305        9,613
  Total Assets (Billions)    $ 17,531     $ 13,847      $ 7,868

u Cyclical Profitability

                           3/31/2018   12/31/2008   12/31/2001
  Average ROA                  1.28%       -0.94%        1.14%
  Average ROE                 11.44%       -9.88%       12.73%
  NIM                          3.32%        3.34%        4.03%
  Noncurrent/Total Loans       1.15%        2.93%        1.31%
  Coverage Ratio                110%          75%      127.56%

US Banking Industry Themes

2008-2009 Financial Crisis

u Model Proliferation…
u …After Model Failures
  Ø Correlations rise in crises
u Prevent Bank Failures

2009-2016 Regulatory Pendulum

Good Times:
• Free Enterprise ("new economy")
• Optimism
• Decreased Regulation
• "Geniuses" & "Heroes"
• Example: 1999 boom led to the Gramm-Leach-Bliley Act

Bad Times:
• Question Capitalism ("end of capitalism")
• Pessimism
• Increased Regulation
• "Idiots" & "Villains"
• Example: Great Depression led to the Glass-Steagall Act; 2009-2016: increased emphasis on Stress Testing, MRM, & Capital Planning

2011 Model Risk Management Guidance

Supervisory Guidance on Model Risk Management (OCC 2011-12 / Fed SR 11-7 / FDIC FIL-22-2017)
u Governance
  Ø Rests with the Board and senior management
u Guiding Principle of "Effective Challenge"
  Ø Critical analysis by objective, informed parties
u Model Risk Management Policy
  Ø Formalize model risk management activities with policies
u Project Plan
  Ø Statement of model purpose
u Documentation
  Ø Include technical documentation
u Model Inventory
  Ø Lists models, locations, owner, developer, etc.
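The inventory attributes the guidance lists (model, location, owner, developer, etc.) lend themselves to structured records. A minimal sketch in Python; every field name, the sample "M-001" entry, and "Vendor X" are illustrative assumptions, not prescribed by the guidance:

```python
from dataclasses import dataclass

@dataclass
class ModelInventoryRecord:
    """One entry in a model inventory (illustrative fields only)."""
    model_id: str
    name: str
    purpose: str
    owner: str
    developer: str
    location: str        # system or business unit where the model runs
    risk_tier: str       # e.g. "High", "Moderate", "Low"
    last_validated: str  # ISO date of most recent validation

inventory = [
    ModelInventoryRecord("M-001", "ALM IRR Model", "Interest rate risk",
                         "Treasury", "Vendor X", "ALM system", "High",
                         "2018-03-31"),
]

# A complete, regularly refreshed inventory is a common examiner focus;
# ISO dates compare correctly as strings, so a staleness check is simple.
stale = [m.model_id for m in inventory if m.last_validated < "2017-01-01"]
print(stale)  # → []
```

Keeping the inventory as data rather than prose makes periodic refresh checks (stale validations, missing owners) trivially scriptable.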

2017-2018 Regulatory Pendulum

Good Times:
• Free Enterprise ("new economy")
• Optimism
• Decreased Regulation
• "Geniuses" & "Heroes"
• Example: 1999 boom led to the Gramm-Leach-Bliley Act; 2017-2018: increased emphasis on Deregulation and Free Markets

Bad Times:
• Question Capitalism ("end of capitalism")
• Pessimism
• Increased Regulation
• "Idiots" & "Villains"
• Example: Great Depression led to the Glass-Steagall Act

US Banking Industry Themes
Even More Model Proliferation…

u More Robust Statistical Models
u MRM Guidance Continues as Dodd-Frank rolled back

Model types: IRR, ALLL/CECL, AML/BSA, Stress Testing, Funds Transfer Pricing, Credit Scoring, MSRs, Loan Pipeline, Loan Pricing, Deposit Pricing, Capital Planning, Operational Risks, Derivative Pricing, Profitability, Loss Migration, Hedging, Budget Modeling, Financial Planning, Securities Valuation, Compensation Incentives, Loan Valuation, Deposit Sensitivity, Economic Capital, Cash Flow, VaR, Prepayments, Fraud, Behavioral Models, Salaries & Benefits, Cost Allocation, Insurance, Taxes, Fee Income

Next Level MRM

A Industry Trends & the Regulatory Pendulum: US Banking Industry Themes, Regulatory Pendulum, Model Proliferation

B Contemplating the Future of Model Risk Management: SWOT, Regulatory Approaches, Possibilities, Challenges, 3 Lines of Defense, Validator Perspective

C Workshop Preview & Motivation: MRM Culture, Model Lifecycle, Model Inventory, Aggregate Risk, Data, Validations, Future of MRM

Model Risk Management in SWOT

SWOT axes: Helpful vs. Harmful to achieving objectives; Internal vs. External origin.

Where does MRM fit in SWOT analysis?
How do senior executives and the CEO view MRM?
§ Cost center?
§ Profit center?

Model Risk Management in SWOT

EOM (Opportunity Mgmt.): strategy to implement; helpful to achieving objectives.
ERM (Risk Mgmt.): risks to mitigate; harmful to achieving objectives.

Enterprise Risk Management is part of a wider value creation and preservation strategy.
If using models strategically, then Model Risk Management is strategic: Enterprise Strategy Management ("ESM")!

Two Approaches to Regulation

Prescriptive:
† "Tell me what to do"
† For example, 5% Tier 1 leverage is required to be well-capitalized

Principles-based:
† "Tell me the goal"
† For example, 1996 Interest Rate Risk policy statement (FIL-52-96)

MRM Possibilities – Clarity through Contrast


Model Risk Management Possibilities

At its worst…                              At its best…
• Fail to distinguish model usefulness     • Identify useful models
• Spread model confusion                   • Promote confidence in models
• Miss quantification errors               • Improve quantitative accuracy
• Permit opaque qualitative judgment       • Provide transparency
• Generate executive uneasiness            • Support executive decision-making
• Prompt regulatory concerns               • Instill regulatory confidence

MRM - Horizontal Challenges

Implementation phases (each cycling Design → Develop → Deliver → Improve):

Phase 1: MRM Setup
  Prior State A: No Policy, Limited Governance
  Prior State B: Solid Governance, Improved validation

Phase 2: Regulatory Compliant (December 2018)
  Current State C: Regulator/Validator Satisfactory; Stakeholders see some benefit

Phase 3: Strategically Useful (December 2020)
  Future State E: Excellent regulatory and validator ratings; Unusually effective communication
  Future State X: MRM process improving models for financial benefit

Horizontal challenges across all phases: Leadership, Communication, Talent
Timeline: Now → December 2020

Model Governance & Three Lines of Defense

Governance: Board (Vision & Tone from the Top)
Align risk appetite, business strategies, and the budget

1st Line: Business Lines
§ Manage the business and model development
§ Involved in day-to-day risk management
§ Follow a risk process
§ Apply internal controls and risk responses

2nd Line: Model Risk Management
§ Oversee and challenge business line risk management
§ Provide guidance, direction and a different perspective
§ Develop risk management framework

3rd Line: Internal Audit
§ Review the 1st and 2nd lines
§ Challenge current processes independently
§ Objective evaluation and assurance on effectiveness of risk mgmt (design & implementation)

Vision, Communication, & Talent

u Clear Strategy (Vision)
  Ø Worst – top doesn't care
  Ø Best – clear "tone from the top"
u Coordination across 3 lines
  Ø Worst – excessive confrontation
  Ø Best – collaboration & respect
u Business Lines begin Defense!
  Ø Worst – not my job
  Ø Best – effective limit identification


MRM Horizontal Possibilities

At its worst…                          At its best…
• Lack of MRM Program Vision           • Clarity of Strategic MRM Vision
• Terrible Communication (3 lines)     • Unusually Effective Communication
• Lack Staff and Talent Required       • Fully Staffed with Proper Expertise

Insights from an Independent Validator

Observed recent trends:
• "Key man" developer risk at multiple institutions
• Documentation quality varies immensely
  § Often proportionate to model quality
• Communication challenges frequent
• Big difference between firms envisioning strategic benefit versus those primarily focused upon regulatory compliance
• Effective challenge is not always effective!

Insights from an Independent Validator

“Management then provided effective challenge…”

Validation as a Process

May Improve ↔ May Deteriorate
• Principles-Based approach contributes to mean reversion
  § Best practices change over time
  § Regulators inform those falling behind
  § Ratings don't matter as much as you might think

• Focus upon constant and continuous improvement
  § Never arrive at "perfect" models
  § Validation feedback is part of a wider business conversation


Modeling Challenges – Predicting The Next Financial Crisis

1973: OPEC Oil Crisis
1981: Early '80s Recession
1986-95: S&L Crisis
2000: Dot-com Bubble
2008: The Great Recession
?: Trade War, Currency Devaluation, Credit & Debt Bubble, Student Loan Bubble, Auto Loan Bubble, Green Bubble

Next Level MRM

A Industry Trends & the Regulatory Pendulum: US Banking Industry Themes, Regulatory Pendulum, Model Proliferation

B Contemplating the Future of Model Risk Management: SWOT, Regulatory Approaches, Possibilities, Challenges, 3 Lines of Defense, Validator Perspective

C Workshop Preview & Motivation: MRM Culture, Model Lifecycle, Model Inventory, Aggregate Risk, Data, Validations, Future of MRM

Recipe for Successful Model Risk Management

Ingredients: Workshop Agenda – Day One
ü 2 cups Regulatory Perspective – Ray & Liming
ü 1 lb. MRM Culture (Gov.) – Brandon & Mike
ü 3 tsps. Lifecycle of a Model – Brandon & Mike
ü 4 oz. Model Inventory Mgmt – Mike & Jonathan
ü 5 cups Managing Inventory Risk – Jonathan
ü 6 oz. Aggregate Risk – Ray & Liming

Recipe for Successful Model Risk Management

Ingredients: Workshop Agenda – Day Two
ü 1 lb. Data Management – Joe
ü 5 cups Validating Statistical Models – Joe
ü 3 tsps. Validating Other Models – Sam
ü 4 oz. BSA/AML Validations – Brandon & Mike
ü 5 cups After the Validation – Sam & Drew
ü 6 oz. Future State of MRM – Mike & Jonathan

Regulatory MRM Perspective
Panelist Discussion

Day 1 | 9:45 am

Ray Brastow, Sr. Financial Economist, Federal Reserve Bank of Richmond Liming Brotcke, Quantitative Manager, Federal Reserve Bank of Chicago

Evolving MRM Expectations and Practices

u MRM has expanded in practice since 2011 guidance
  Ø Examiner expectations and industry practices continue to evolve
  Ø More advanced approaches migrating downward into mid-size and even community banking space
u Broader risk management initiatives have been adding additional emphasis
  Ø Enterprise Risk Management (ERM)
  Ø Operational risk management initiatives
u Significant changes are occurring in data, technology, model use and complexity
u Organizations are seeking ways to streamline and improve MRM effectiveness and contribution

Most Common Regulatory Criticisms

û An incomplete model inventory
û Lack of a robust process to update the model inventory on a regular basis
û Models have not been validated prior to implementation and on an ongoing basis
û Lack of independence of the person/group performing model validation
û Model validation/quality control failing to demonstrate "effective challenge"
û Lack of data used in the model development process to support review/validation

Most Common Regulatory Criticisms

û Undocumented and informal model enhancements
û Lack of developmental evidence to substantiate model assumptions
û Lack of explanation to support the application of expert judgment and model overrides
û Lack of a process to generate important inputs for a model
û Failure to maintain comprehensive and up-to-date model documentation
û Failure to complete ongoing model risk management due diligence/monitoring

Key Areas of Focus

u MRM framework evolution
u Ensuring effective challenge is occurring
u Model inventory management
u Coverage of "non-model" models and tools
u Ongoing monitoring
u Data management and validation
u Addressing expanding technology, data, and automation
u Evolving roles and responsibilities

Establishing a Model Risk Management Culture CFP Masterclass

Day 1 | 10:45 am

Brandon Blanchard, VP Operational Risk Management, Commerce Bank Mike Guglielmo, Managing Director, DCG

Aligning Risk Mgmt Strategy & Organizational Culture

Credit: Dawson McDonald Consulting

Secrets to MRM Success & Corporate Benefit

u We are all "cast members"
u Perform with an experience mindset, not a task mindset!
u Make personal accountability a part of the organization's culture
u Little wins add up!

Establishing a Model Risk Management Culture

† A Roles and Responsibilities: The "Three Lines of Defense"
B Governance Organization: Organizational Structure
C Governance and Validation Staff: Staffing, Compensation/Incentives, Outsourcing, Independence
D Policies and Procedures: Model Lifecycle Management, Documentation, Validation Process, Communication
E Establishing Effective Relationships, Maintaining Independence: Relationships, Presence
F Running Model Risk as a Risk Function Beyond Validations: Communication, Stature, Strategic Value

Roles and Responsibilities

Regulator
Board / Audit Committee; Senior Management
External Audit: Assurance of MRM Controls for Key Financials
3rd Line of Defense (Internal Audit): Assurance of MRM Processes/Controls
2nd Line of Defense (Model Risk Management): Model Validation, Reviews & Governance
1st Line of Defense (Model Owner): Model Development, Documentation, Model Monitoring, Implementation & Usage Controls

Detailed Roles and Responsibilities

First Line of Defense: Model Owners, Developers, Users, and Business Area Managers
• Controls for individual model risk mitigation
• Formalized control framework
• Model identification
• Model, data, methodology and development process
• Model implementation
• Model use
• Conduct model performance monitoring
• Model validation finding remediation
• Resolve validation finding disputes (with the second line)

Second Line of Defense: Model Risk Management Group
• Model risk governance, identification and reporting
• Committee oversight
• Model risk appetite
• Independent model validation
• Model Risk Management Policy
• Annual model review
• Model risk reporting and model risk score review
• Oversight of model performance monitoring
• Model usage review and risk governance review
• Periodic model risk reporting and escalation

Third Line of Defense (Internal Audit): Audits the contents of and compliance with MRM policies, procedures and guidelines within the first and second lines of defense.

Establishing a Model Risk Management Culture

A Roles and Responsibilities: The "Three Lines of Defense"
† B Governance Organization: Organizational Structure
C Governance and Validation Staff: Staffing, Compensation/Incentives, Outsourcing, Independence
D Policies and Procedures: Model Lifecycle Management, Documentation, Validation Process, Communication
E Establishing Effective Relationships, Maintaining Independence: Relationships, Presence
F Running Model Risk as a Risk Function Beyond Validations: Communication, Stature, Strategic Value

Organizational Structure

u Varies by institution
u Key factors
  Ø Size of organization
  Ø Type of institution
  Ø Existing infrastructure
  Ø Corporate vision
  Ø Independence
u Evolves from strategy and efficiency

Common Structures

Example reporting lines:
• CRO → Operational Risk → Performance Monitoring and Model Validation
• ERM → Operational Risk → Model Risk
• Audit → Model Risk Audit

• No single "right" way
• Considerations
  – Independence
  – Competence
  – Authority and influence

Structure and Impact on Effective Challenge

“[C]ritical analysis by objective, informed parties that can identify model limitations and produce appropriate changes. Effective challenge depends on a combination of incentives, competence, and influence.”

u Incentives to provide effective challenge to models are stronger when:
  Ø Greater separation of challenge from model development
  Ø Well-designed compensation practices
  Ø Corporate culture

* Source: Guidance on Model Risk Management (OCC 2011-12 / Fed SR 11-7)

Structure and Impact on Effective Challenge (cont'd)

u Competence is a key to effectiveness since technical knowledge and modeling skills are necessary to conduct appropriate analysis and critique.

u Challenge may fail to be effective without the influence to ensure that actions are taken to address model issues.
  Ø Explicit authority
  Ø Stature within the organization
  Ø Commitment and support from higher levels of management

* Source: Guidance on Model Risk Management (OCC 2011-12 / Fed SR 11-7)

Establishing a Model Risk Management Culture

A Roles and Responsibilities: The "Three Lines of Defense"
B Governance Organization: Organizational Structure
† C Governance and Validation Staff: Staffing, Compensation/Incentives, Outsourcing, Independence
D Policies and Procedures: Model Lifecycle Management, Documentation, Validation Process, Communication
E Establishing Effective Relationships, Maintaining Independence: Relationships, Presence
F Running Model Risk as a Risk Function Beyond Validations: Communication, Stature, Strategic Value

Staffing Perspective

CCAR (>$50B) 10-100+

DFAST ($10B-$50B) 1-5

Community Banks (<$10B) 0-1

Model Risk Management Staff Backgrounds

Leadership:
• CROs
• Operational Risk executives
• Former examiners
• Former external auditors or consultants

Model Validation:
• Quantitative analysts
• Data scientists
• Model developers
• Consultants
• Entry: math, actuary, science, economics or

Staffing Challenges

u Under-estimated budgets
u Competing talent pool
u Skillset gaps
u Over-reliance on external resources
u Retention
u Communication
u Training

Trends in Risk Management Staffing

u Growth and development

u Validation distribution and co-sourcing

u Qualitative modeling specialists

u Data specialists

Resource Sharing Framework

Model developers are shared across teams (Team A, Team B, and Team C).

External Validators

u Later session to discuss in more detail

u Requisite skill is a must

u Willingness to build a relationship and transfer knowledge is key

u Cultural and stylistic differences matter!

Use of External Model Validators: Benefits

u Domain expertise
  Ø Quantitative skills
  Ø Vendor models
u Horizontal perspective
u Fill in skill or resource gaps
u Source of training

Use of External Model Validators: Challenges

u Cost
u Selection
u Vendor quality and management
u Ability to implement recommendations
u Independence/conflict of interest with other internal or external parties

Establishing a Model Risk Management Culture

A Roles and Responsibilities: The "Three Lines of Defense"
B Governance Organization: Organizational Structure
C Governance and Validation Staff: Staffing, Compensation/Incentives, Outsourcing, Independence
† D Policies and Procedures: Model Lifecycle Management, Documentation, Validation Process, Communication
E Establishing Effective Relationships, Maintaining Independence: Relationships, Presence
F Running Model Risk as a Risk Function Beyond Validations: Communication, Stature, Strategic Value

Risk Management Policies and Procedures

u The Model Risk Management function goes beyond conducting validations on individual models!

• Maintenance of a models inventory and model metadata
• Definitions and identification of risks
• Model Risk Management Reporting
• Policies, procedures and templates

Formalized Control Framework Requirements for Validations, Ongoing Performance Monitoring and other Model Testing

Organization – 1st, 2nd and 3rd line

u A holistic process with a "top-down" approach and executive/Board sponsorship to support execution of policies and procedures

Model Risk Management Policies and Procedures

Model Risk Policy sets the "risk tone"
u Enterprise model risk tolerance, appetite or framework
u Definition of "model" for the organization
u Roles and responsibilities for LOD
u Lines of authority – committee and board
u Model tier or risk rank
u Validation ratings and findings typology (or in procedures)
u Exceptions to model policy

Model Risk Management Policies and Procedures

Procedures should tell the organization "how" to do it.
u Model validation process
u Ownership or stakeholder responsibilities
u Lifecycle of a model
u Model validation engagement processes

Establishing a Model Risk Management Culture

A Roles and Responsibilities: The "Three Lines of Defense"
B Governance Organization: Organizational Structure
C Governance and Validation Staff: Staffing, Compensation/Incentives, Outsourcing, Independence
D Policies and Procedures: Model Lifecycle Management, Documentation, Validation Process, Communication
† E Establishing Effective Relationships, Maintaining Independence: Relationships, Presence
F Running Model Risk as a Risk Function Beyond Validations: Communication, Stature, Strategic Value

Establishing Effective Relationships while Maintaining Independence: Relationships

u Establish liaison responsibilities within MRM; maintain independence while providing:
  Ø Consultative support to clarify expectations
  Ø General practices / training on model methods
u When specific guidance is required, make it part of a finding with clear, actionable and impactful recommendations
u Promote self-reporting as helpful (and required); escalate non-reporting of models or model changes

Establishing Effective Relationships while Maintaining Independence: Presence

u Member of key risk management committees (either attendee or voting member)
u Embedded in other risk management processes: risk-control self-assessments, new initiative reviews, attending meetings for major model development (e.g., CECL, new risk grade or portfolio performance tools), data governance, etc.
u Regular presentations to Board and executive committees

Establishing Effective Relationships while Maintaining Independence: Independence

u Primary goals are effective model challenge, credibility and transparency
u Governance structure and policy can help formalize:
  Ø Reporting structure and approval authority
  Ø Roles and responsibilities for three lines of defense
  Ø MRM standards and requirements of model owners/developers
  Ø Minimum expectations for model quality, by risk level
  Ø Process for dispute resolution and escalation
u Model validation practices can be periodically confirmed with external consultant, audit, or peer review

Establishing a Model Risk Management Culture

A Roles and Responsibilities: The "Three Lines of Defense"
B Governance Organization: Organizational Structure
C Governance and Validation Staff: Staffing, Compensation/Incentives, Outsourcing, Independence
D Policies and Procedures: Model Lifecycle Management, Documentation, Validation Process, Communication
E Establishing Effective Relationships, Maintaining Independence: Relationships, Presence
† F Running Model Risk as a Risk Function Beyond Validations: Communication, Stature, Strategic Value

Running Model Risk as a Risk Management Function Beyond Validations

Communication with business managers
u Training on MRM requirements / providing clear guidelines for meeting MRM requirements
u Instilling sense of responsibility/accountability for understanding and managing model risks
u Creating confidence in MRM value and credibility

Running Model Risk as a Risk Management Function Beyond Validations

Establishing Stature
u Enforcing Model Risk requirements
u Rejecting or requiring changes to models as necessary
u Identifying, reporting and escalating material model risks
u Preserving / protecting the organization

Running Model Risk as a Risk Management Function Beyond Validations

u Challenge, not audit: look to avoid creating a "gotcha" mindset
u Value-add: evaluate MRM's contribution to improvements, and support ongoing improvements by model owners
u Evaluate: development and use of performance measures should be effective yet achievable
u Track: key MRM risk factors against enterprise risk tolerance or risk appetite
u Report: change over time to the C-Suite and MRM Committee

Running Model Risk as a Risk Management Function Beyond Validations

Model Risk Management Possibilities

At its worst…                              At its best…
• Fail to distinguish model usefulness     • Identify useful models
• Spread model confusion                   • Promote confidence in models
• Miss quantification errors               • Improve quantitative accuracy
• Permit opaque qualitative judgment       • Provide transparency
• Generate executive uneasiness            • Support executive decision-making
• Prompt regulatory concerns               • Instill regulatory confidence

Running Model Risk as a Risk Management Function Beyond Validations

Model Risk Management in SWOT

EOM (Opportunity Mgmt.): strategy to implement; helpful to achieving objectives.
ERM (Risk Mgmt.): risks to mitigate; harmful to achieving objectives.
Enterprise Risk Management is part of a wider value creation and preservation strategy.
If using models strategically, then Model Risk Management is strategic: Enterprise Strategy Management ("ESM")!

Session Takeaways

u Clear and complete roles and responsibilities are a necessity for a strong MRM function.
u Establish an organizational structure with independence and stature.
u Build a strong, flexible staffing model to secure the necessary qualifications and expertise.
u Policies and procedures and other Model Risk guidelines should communicate requirements and processes clearly.

QUESTIONS & ANSWERS

Lifecycle of a Model CFP Masterclass

Day 1 | 1:00 pm

Brandon Blanchard, VP Operational Risk Management, Commerce Bank Mike Guglielmo, Managing Director, DCG

Agenda

u Building or Buying the Model
u Documentation: Acceptable Standards
u Ongoing Model Performance Monitoring
u Model Tiers and Grading Models for Queuing Validations
u Validation: Standards and Scheduling
u Model Validation Findings
u Resolution of Disputes over Model Issues
u Revalidation of Models
u The "Annual Touch"

Building or Buying the Model

u Start with the business case: why do we need/use this model?
u Define the requirements: data, frequency, approach, accessibility, scalability
u For internally developed models:
  Ø Assess for adequate level of skill and experience
  Ø Assess for awareness of all risks (e.g., CRA/Fair Lending)
  Ø Availability and adequacy of data
u For externally developed models:
  Ø Important to evaluate competency and reputation of the vendor too (and willingness to work with MRM)!

© 2018 Darling Consulting Group, Inc. Page 76 Building or Buying the Model

Gather: define purpose & objectives; assign roles; notify MRM of new model; documentation.
Development: methods & variables selection; data sets; testing requirements; pre-implementation validation; issue remediation; establish tolerances; documentation.
Implementation: procedures; training; ongoing monitoring; challenger models; change controls; documentation.
Model Governance: risk-based independent validations; issue remediation; re-development; retirement; documentation.

© 2018 Darling Consulting Group, Inc. Page 77 Lifecycle of a Model

ü Effective Model Lifecycle Management (cycle):
Model Development & Documentation → Independent Model Validation → Ongoing Model Performance Monitoring → Model Review & Validation → Model Retirement

© 2018 Darling Consulting Group, Inc. Page 78 Documentation – Acceptable Standards

Documentation sections: Executive Summary; Model Theory & Design; Data; Assumptions; Output; Ongoing Performance Monitoring; Procedures; Governance; Appendices

© 2018 Darling Consulting Group, Inc. Page 79 Documentation – Acceptable Standards

u Purpose / objective
u All assumptions and limitations
u Description of management decisions using the model results
u Design, theory, and logic
u Mathematical specifications
u Comparisons should be made with alternative theories
u Rigorous assessment of data quality and relevance
u Testing: demonstrate the model is accurate, robust, and stable
u Interpretation of outputs with limitations delineated
u Back-testing or out-of-sample analysis and sensitivity analysis
Ø Consider the use of standardized templates.

© 2018 Darling Consulting Group, Inc. Page 80 Model Risk Ratings – Evolving Practices

Risk rating scale: High / Moderate / Low

Complexity (more ↔ less): statistical models; multi-model constructs
Business Impact (more ↔ less): Financial (IRR, Liquidity, Credit, Capital/Earnings); Operational (Compliance, Fraud)
Review Frequency (more ↔ less): 1/2/3 year cycle (or 2/3/5?)

© 2018 Darling Consulting Group, Inc. Page 81 Model Tiers and Validation Prioritization

Example: tier determines validation intensity and documentation requirements.

Tier 1: Full validation; semi-annual ongoing monitoring; re-validation every 1 year (or upon significant changes)
Tier 2: Limited validation; annual ongoing monitoring; re-validation every 2 years
Tier 3: Minimal validation; annual ongoing monitoring; re-validation every 3 years

MRM performs a review in years a validation does not occur.
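The tier table above can be sketched as a small scheduling helper. The tier names and cycles come from the example; the function and dictionary names are illustrative assumptions:

```python
# Sketch: tier-driven validation scheduling (cycles from the example table;
# names are illustrative).
from datetime import date

REVALIDATION_YEARS = {"Tier 1": 1, "Tier 2": 2, "Tier 3": 3}
MONITORING = {"Tier 1": "Semi-annual", "Tier 2": "Annual", "Tier 3": "Annual"}

def next_revalidation(tier: str, last_validated: date) -> date:
    """Next full re-validation; in-between years get an MRM review instead."""
    return last_validated.replace(year=last_validated.year + REVALIDATION_YEARS[tier])

print(next_revalidation("Tier 2", date(2018, 9, 27)))  # 2020-09-27
```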

© 2018 Darling Consulting Group, Inc. Page 82 Ongoing Performance Monitoring Identified as one of three core elements of MRM in guidance

u “Confirms that the model is appropriately implemented and is being used and is performing as intended”
u “Essential to evaluate whether changes in products, exposures, activities, clients, or market conditions necessitate adjustment, redevelopment, or replacement of the model”
u “Monitoring begins when a model is first implemented… [and] should continue periodically over time, with a frequency appropriate to the nature of the model, the availability of new data or modeling approaches, and the magnitude of the risk involved”
✗ Failure to complete ongoing model risk management due diligence/monitoring

© 2018 Darling Consulting Group, Inc. Page 83 Common Questions/Challenges

u What tests should be performed?
u How frequently should testing occur?
u Who is responsible for testing?
u How do you set thresholds?
u How should results be documented and communicated?
u How are exceptions handled?
u Can the process be automated?

© 2018 Darling Consulting Group, Inc. Page 84 Testing Approaches and Considerations Considerations when developing a testing plan

u Testing ideally resides with model owners
u Frequency is a function of the model and its relative performance
u Testing will vary by model
Ø Model type and complexity (statistical, stress, cash flow, etc.)
Ø Source (in-house vs. 3rd party/black box)
Ø Ongoing performance and strategic significance
u Types of testing
Ø Data and statistical tests (stability testing, Gini coefficient, k-fold, walk-forward, etc.)
Ø Benchmarking
Ø Backtesting
Ø Sensitivity testing
u Tip: start with what was done for testing during development
u Note: tests should reassess model limitations identified during development
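Of the statistical tests named above, stability testing is the simplest to illustrate. A minimal sketch of the Population Stability Index (PSI), one common stability measure; the bin proportions and the 0.10/0.25 bands are a common rule of thumb, not from the slides:

```python
# Sketch: Population Stability Index (PSI) between development-sample and
# current bin proportions. Bands are an assumed rule of thumb, not policy.
import math

def psi(expected, actual):
    """Sum over bins of (actual - expected) * ln(actual / expected)."""
    return sum((a - e) * math.log(a / e) for e, a in zip(expected, actual))

score = psi([0.2, 0.3, 0.5], [0.25, 0.3, 0.45])
band = "stable" if score < 0.10 else "watch" if score < 0.25 else "shifted"
print(round(score, 4), band)  # 0.0164 stable
```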

© 2018 Darling Consulting Group, Inc. Page 85 Who Is Responsible for Ongoing Monitoring?

u In many instances, model owners should be responsible
Ø They have the most intimate understanding of the model mechanics
Ø They will generally have access to the data and tools
u MRM can contribute if resourcing is an issue
u Vendors can contribute with black-box models
u Often a source of contention between LOBs and MRM
u Create clarity and “rules of engagement” in MRM policy
u Specify testing expectations, thresholds, and actions related to exceptions in model documentation

© 2018 Darling Consulting Group, Inc. Page 86 Setting Thresholds and Handling Exceptions

u Thresholds are a function of the model, the types of tests, and the variability of results
u For benchmarking or back-testing, 5% is a common starting point, but try to evaluate historical trends and develop limits from that exercise
u Thresholds should be defined in OPM documentation
u Triggers and action plans or steps should be well described
u Ideally, results are reported regularly to MRM, risk, audit, etc.
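As a sketch of the threshold-and-trigger idea: the 5% tolerance is the starting point mentioned above, while the two-breach escalation rule and all names are illustrative assumptions:

```python
# Sketch: flag back-test exceptions against a tolerance and decide whether
# to escalate (5% tolerance per the text; escalation rule is illustrative).
def breaches(actuals, forecasts, tolerance=0.05):
    """True for each period where relative error exceeds the tolerance."""
    return [abs(a - f) / abs(a) > tolerance for a, f in zip(actuals, forecasts)]

flags = breaches([100.0, 110.0, 120.0], [104.0, 109.0, 131.0])
escalate = sum(flags) >= 2  # e.g., report to MRM after two or more exceptions
print(flags, escalate)  # [False, False, True] False
```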

© 2018 Darling Consulting Group, Inc. Page 87 Validation: Standards and Scheduling

Assign → Request → Analyze → Report → Finalize

Assign: Model Risk Director assigns model validation to MR personnel.
Request: MR requests and receives all model information/documents; model owner, developer, and user provide information.
Analyze: MR analyzes model information.
Report: MR creates validation report with findings, key risks, analysis, and risk ratings, communicated to model owner, developer, and user.
Finalize: Model owner, developer, and user provide action plan to address findings; MRM enters all into the system of record.

© 2018 Darling Consulting Group, Inc. Page 88 Validation: Standards and Scheduling

• Ensure data adequacy and accuracy; assess data limitations; evaluate inputs & assumptions
• Verify model design and methodology (incl. applicable guidance); verify and test model assumptions; discuss alternative model designs
• Replicate model results; confirm the accuracy and reasonability of model output; build challenger models as needed
• Sensitivity analysis; model accuracy and stability testing; model stress testing as needed (non-regulatory)
• Ensure ongoing governance; assess control environment; documentation adequacy; review ongoing monitoring

© 2018 Darling Consulting Group, Inc. Page 89 Model Validation Findings

u Treat these just like Audit Findings!
u Align severity with Internal Audit or ERM scale
u Categorize findings to identify trends (“Emerging Issues” to report to ERM)
u Ensure findings and recommendations clearly define the risks and that the Model Owner has a well-defined plan to remediate
u These require MRM retesting time and effort to close!

© 2018 Darling Consulting Group, Inc. Page 90 Model Validation Findings

Tracking and Resolving Model Validation Findings

Initiation → Resolution → Closure (tracked in a central database)

Select persons who are:
• Responsible for closing the finding
• Accountable for the finding
• Consulted for the finding
• Informed (staff who need to know progress)

Verification/testing efforts support closure. Validation bodies (Audit, Regulator, Validation) set and enforce timelines and deliverables to close out the issue.

© 2018 Darling Consulting Group, Inc. Page 91 Resolution of Disputes

u Challenging the draft report
u Exit meeting to review final report and conclusions
u Accountability for the finding(s)
u Policy provisions for escalation or appeals process
u Risk Oversight group

© 2018 Darling Consulting Group, Inc. Page 92 Revalidation of Models

u Limited scope vs. full scope?
u Common elements include:
Ø Review of prior validation work papers and testing plan
Ø Review of recent audits or compliance reviews of the model owner department to evaluate the overall modeling environment
Ø Review of remediation steps from prior findings
Ø Evaluation of changes to the model, documentation, data, expanded use, or upgrades to the vendor system
Ø Assessment of change control process and log
Ø Review trends of ongoing performance monitoring/testing results performed by the Model Owner

© 2018 Darling Consulting Group, Inc. Page 93 The “Annual Touch” (Periodic Review)

FOCUS: overall model function and possible systemic issues
MEASURE: review KRIs and KPIs captured during ongoing monitoring
VERIFY: document that current change management procedures are followed
AFFIRM: validation schedule affirmed

© 2018 Darling Consulting Group, Inc. Page 94 Executive and Board Reporting

u Validation progress or key metrics
u Model risk includes:
Ø Adverse consequences from decisions based on models that are incorrect or misused
Ø Financial loss, poor decisions, damage to reputation

© 2018 Darling Consulting Group, Inc. Page 95 Executive and Board Reporting (cont’d)

uHow do you end up with Model Risk?
§ Fundamental errors in methodology, design, assumptions, input data, or implementation
§ Using a model for the wrong purpose or without understanding the model’s limitations

© 2018 Darling Consulting Group, Inc. Page 96 Executive and Board Reporting (cont’d)

uModel risk never goes away
§ Model controls
§ Risk acceptance
uCommon snapshot for enterprise-wide model risk
§ Model risk dashboard
§ Model risk tolerance
§ Input to enterprise risk appetite or operational risk

© 2018 Darling Consulting Group, Inc. Page 97 Executive and Board Reporting (cont’d)

uBuild a Dashboard
§ Decide with executives on metrics or “factors” to gauge enterprise-wide model risk.
§ Ex: inventory vs. schedule status, model ratings by tier, severity and status of findings, exceptions to model policy, audit or regulatory issues
§ Set up management reporting from inventory, findings, GRC, and other sources.
§ Build a visual snapshot of the status of model risk factors for routine reporting.

© 2018 Darling Consulting Group, Inc. Page 98 MODEL RISK MANAGEMENT | SEPTEMBER 27-28, 2018 | BOSTON Model Inventory Management CFP Masterclass

Day 1 | 2:15 pm

Mike Guglielmo, Managing Director, DCG Jonathan Hill, PhD, former Global Head of Model Risk Governance, Credit Suisse

© 2018 Darling Consulting Group, Inc. Page 99 Agenda

• Definition of a “Model” and “Model Risk”
• Examples of Risk Models
• Establishing and Maintaining an Inventory
• Data to Determine Model Materiality
• Using the Inventory to Manage the Program
• Technology to Maintain the Models Inventory

© 2018 Darling Consulting Group, Inc. Page 100 Regulatory Definition of “Model”

Supervisory Guidance on Model Risk Management

[T]he term model refers to a quantitative method, system, or approach that applies statistical, economic, financial, or mathematical theories, techniques, and assumptions to process input data into quantitative estimates…

A model consists of three components: • an information input component, which delivers assumptions and data to the model; • a processing component, which transforms inputs into estimates; and • a reporting component, which translates the estimates into useful business information…

FRB SR11-7/OCC 2011-12 © 2018 Darling Consulting Group, Inc. Page 101 The Challenge

u The definition in the Supervisory Guidance is accurate but purposely non-prescriptive

u It does not readily distinguish between a model and a non-model tool, ignoring key discriminators: Ø Judgmental assumptions Ø Uncertainty of outputs

u Many approaches in practice

© 2018 Darling Consulting Group, Inc. Page 102 Definition of “Model” – Cases

uCalculators, critical spreadsheets, end-user computing (“EUCs”)
“[Q]ualitative approaches … – i.e., those not defined as models according to this guidance – should also be subject to a rigorous control process.”*
uJudgmental processes that produce quantitative results
“[T]he term model … also covers quantitative approaches whose inputs are partially or wholly qualitative or based on expert judgment, provided that the output is quantitative in nature.”*
uFeeder models
uSystems of models
uModels used for multiple purposes
uBenchmark and challenger models

*Supervisory Guidance on Model Risk Management

© 2018 Darling Consulting Group, Inc. Page 103 Definition of “Model Risk”

u Model risk is “the potential for adverse consequences from decisions based on incorrect or misused model outputs and reports. Model risk can also lead to financial loss, poor business and strategic decision making, or damage to a bank’s reputation.” u Two primary reasons: Ø Fundamental errors with the model Ø Incorrect or inappropriate use of the model

© 2018 Darling Consulting Group, Inc. Page 104 Establishing Materiality and Priority

Materiality: P&L or income statement ($M); EPS ($0.xx); capital ratios (x percent); balance sheet movement ($M); SOX threshold (x percent)
Complexity: number of assumptions or data transformations; number of input or feeder models; sophistication of type or platform (i.e., Monte Carlo vs. linear regression?)
Importance: new strategic line of business; regulatory scrutiny (e.g., DFAST, CCAR, BSA, CECL); reputational risk; upcoming changes to the Bank (e.g., M&A)

• Assess controls: documentation, oversight and governance, reporting, data rules.

© 2018 Darling Consulting Group, Inc. Page 105 Definition of “Model”

“Model” is defined as a quantitative method, system or approach that applies statistical, economic, financial, or mathematical theories, techniques, and assumptions to process input data into quantitative estimates. Quantitative approaches whose inputs are partially or wholly qualitative or based on expert judgment, provided that the output is quantitative in nature, also fit this definition.

[Diagram] INPUT (data, variables, assumptions, techniques) → model (theory, analytics, statistics) → OUTPUT (estimates, projections, inferences) → business decisions

© 2018 Darling Consulting Group, Inc. Page 106 Example #2: Model Definition

If YES to 3 or more, it is a model.

1) Does the tool or spreadsheet use mathematical techniques? (e.g., linear regression, neural network, scoring or scorecard)
2) Does it produce an estimate, forecast, or score?
3) Does it use assumptions or sub-models as inputs? (e.g., FICO scores, payment speed, or decay rate)
4) Is it used for detection or decision-making? (e.g., automated underwriting, fraud identification, investment valuation, or loss estimation)
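The four-question screen above reduces to a simple scoring rule (“YES to 3 or more”). A minimal sketch; question wording is condensed and all names are illustrative:

```python
# Sketch of the screen above: answering YES to 3 or more of the four
# questions classifies the tool as a model (wording condensed).
QUESTIONS = (
    "Uses mathematical techniques (e.g., regression, neural network, scorecard)?",
    "Produces an estimate, forecast, or score?",
    "Uses assumptions or sub-models as inputs?",
    "Used for detection or decision-making?",
)

def is_model(answers):
    """answers: one YES/NO (True/False) per question, in order."""
    return sum(answers) >= 3

print(is_model([True, True, True, False]))   # True
print(is_model([True, False, True, False]))  # False
```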

© 2018 Darling Consulting Group, Inc. Page 107 Example #3: An Alternative Approach

“Is it a model?” is a contentious question.

Consider asking instead: Should it be subject to MRM governance and rigor?

This question facilitates compliance with both the spirit and letter of regulatory expectations, while simultaneously streamlining model governance requirements, thereby freeing up resources for other priorities.

If a tool is not subject to model governance, it is still inventoried, tracked and subject to LoB governance.

KEY: MRM needs to own the final decision!

© 2018 Darling Consulting Group, Inc. Page 108 Model “Inventory” Development

u Model definition
Ø Models vs. “tools”
u Model risk ratings
u Formalized inventory development
u Ongoing assessment frequencies
u Spreadsheet and database controls

[Grid of example model categories – reproduced in full on the next slide]

© 2018 Darling Consulting Group, Inc. Page 109 Examples of Risk Models

Rate Risk, Liquidity Risk, Credit Risk, ALLL/CECL, AML/BSA, Stress Testing, Credit Scoring, MSRs, Loan Pipeline, Loan Pricing, Deposit Pricing, Transfer Pricing, Profitability, Capital Planning, Operational Risks, Loss Migration, Derivative Pricing, Hedging, Budgeting, Financial Planning, Incentives, Loan Valuation, Securities Valuation, Cash Flow, Value at Risk, Prepayments, Deposit Sensitivity, Economic Capital, Behavior Models, Salaries & Benefits, Fee Income, Cost Allocation, Fraud, Insurance, Taxes

© 2018 Darling Consulting Group, Inc. Page 110 Establishing a Models Inventory

uAsk the Business Lines to list all their models? No!

uMeet with key Business Lines and their Risk Management partners.
§ Road show to generate awareness and educate groups
§ Articulate the definition and Rule of Thumb with examples.

uInsert model risk assessment into Business Line processes.
§ Partner with risk groups – Enterprise Risk, Third Party Risk, Audit, Compliance, Business Continuity
§ Work with groups – Strategic Planning, Vendors, IT, EDW

© 2018 Darling Consulting Group, Inc. Page 111 Maintaining the Model Inventory

uFirst item on auditors’ and examiners’ request list. Why?
uUpdate at regular risk assessments and validation conclusions.
uK.I.S.S. – focus on low-effort / high-return activities.
§ Quarterly management surveys/attestations – to confirm existing models, report new ones, and notify of changes or retirements.
§ Leverage the periodic review.
§ Monitor new product committees and business unit strategies.
§ Establish regular program reporting.
§ Include model risk in the RCSA process.

© 2018 Darling Consulting Group, Inc. Page 112 Process is Key!

uPartner with Vendor Management and Legal.
→ If a vendor offers a model as part of a service, include language in the Statement of Work that the service must comply with the Supervisory Guidance and Model Risk Policy.

uInternal Audit, Planning, IT and Data Governance can help. • Be proactive before Audit hits Model Risk with findings.

© 2018 Darling Consulting Group, Inc. Page 113 Should be in the Model Inventory

u All active and retired models and models in development

u Listing of multiple applications for any model

u Computational tools deemed to be non-models?

u Information to assess model prioritization or risk ranking

© 2018 Darling Consulting Group, Inc. Page 114 Data to Record for Each Model

• Name, ID, version, status (active, retired, in development), classification (e.g., champion, challenger), type
• Materiality, complexity, uses, and controls
• Implementation platform (e.g., Excel, SAS, vendor platform)
• Exceptions to Model Policy, grant date, deadline, and reason
• List of stakeholders: developer, vendor (if applicable), owner, user(s), oversight committee, validator, etc.
• Dates of implementation and approval
• Risk tier or ranking (e.g., High, Medium, Low)
• Availability and iteration of model performance report
• Date and outcome of last annual validation and rating
• Outstanding validation findings and their status, due dates, remediation action owner, etc. (recommend a separate findings database)
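The fields above map naturally onto a structured inventory record. A minimal sketch with illustrative field names; findings would live in the separate database the slide recommends:

```python
# Sketch: one inventory record per model, carrying the fields listed above.
# Field names are illustrative; validation findings sit in a separate store.
from dataclasses import dataclass, field
from datetime import date
from typing import List, Optional

@dataclass
class ModelRecord:
    model_id: str
    name: str
    version: str
    status: str            # active / retired / in development
    tier: str              # e.g., High / Medium / Low
    platform: str          # e.g., Excel, SAS, vendor platform
    owner: str
    stakeholders: List[str] = field(default_factory=list)
    implemented: Optional[date] = None
    last_validated: Optional[date] = None
    policy_exceptions: List[str] = field(default_factory=list)

rec = ModelRecord("MOD-001", "CECL Loss Model", "2.1", "active",
                  "High", "SAS", "Credit Risk")
print(rec.model_id, rec.tier)  # MOD-001 High
```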

© 2018 Darling Consulting Group, Inc. Page 115 Using the Inventory to Manage the Program

u Prioritization of limited resources
u Scheduling activities
u Program reporting
u Upcoming deadlines
u Aggregation of risk throughout the institution
u Visualizing relationships (feeder models, data sources, etc.)
u Tracking issues
u Knowledge base of activities for Model Owners to reference

© 2018 Darling Consulting Group, Inc. Page 116 Role of Technology

uIf the bank has up to 100 models:
§ An Excel or Access database or homegrown solution may suffice, depending on the number of data fields.
§ Many Model Risk processes can be handled manually (e.g., model approval, change management, inventory confirmation).
uIf the bank has more than 100 models:
§ MS Access or Excel is likely not going to cut it.
§ Model risk technology should be used by all Business Units and leveraged across more phases of the model life cycle.

© 2018 Darling Consulting Group, Inc. Page 117 Technology Considerations

u Still fairly new, so implementation teams are still learning.
u Every implementation requires heavy customization.
u Few resources, internal or external, understand Model Risk requirements for a technology solution.
u Who will maintain the system?
u The system can be complicated for model owners/users who learn it for one model and interact with it semi-annually.
u Consider integration with established GRC technology.

© 2018 Darling Consulting Group, Inc. Page 118 Three Lines of Defense

• Business Lines must ensure that all computational processes are considered against the “model” definition.
• Model Risk has final say on model vs. non-model.
→ Collaboration is needed between the business lines and Model Risk.
→ Model Risk, as experts on the Supervisory Guidance, makes the final decision (with the Model Risk Committee).
→ Supervisors want to see a list of rejections, and the reasons why.
• Audit does clean-up during process audits.
→ Ensures that all end-to-end processes were considered.

© 2018 Darling Consulting Group, Inc. Page 119 The End Game: Effective Inventory Management Process

u Initial identification of models
u Ongoing process to identify new models
u Model risk weighting assignment
u Validation/review cycle
u Overall model lifecycle management
Ø Inception
Ø Use
Ø Replacement
Ø Retirement

© 2018 Darling Consulting Group, Inc. Page 120 Section Takeaways

• Adopt the definition of “model” for your Bank, recognizing the Supervisory Guidance language.
• Ensure critical buy-in and confirmation from C-Suite and Business Line executives.
• Develop an ongoing, robust inventory identification and maintenance process – a must!
• Carefully consider technology needs vs. cost.
• Adapt the model inventory system to program maturity, organization size, and the need for enterprise-level integration.

© 2018 Darling Consulting Group, Inc. Page 121 Emerging Practices

u Inventory of non-models / tools
Ø Tools can evolve into models
u Ongoing reassessment
u Expanded attestation process
u Use of automated tool management
u Establishment of qualitative validation sub-team

© 2018 Darling Consulting Group, Inc. Page 122 QUESTIONS & ANSWERS

© 2018 Darling Consulting Group, Inc. Page 123 MODEL RISK MANAGEMENT | SEPTEMBER 27-28, 2018 | BOSTON Managing Inventory Risk: Should a Model “Know” Its Own ID?

Day 1 | 3:30 pm

Jonathan Hill, PhD, former Global Head of Model Risk Governance, Credit Suisse

© 2018 Darling Consulting Group, Inc. Page 124 Disclaimer

All of the ideas, opinions, suggestions, notions or asides offered in this presentation are entirely the opinions of the speaker and should not be construed to represent in any way those of Credit Suisse, Morgan Stanley, Citigroup or any other previous employers.

© 2018 Darling Consulting Group, Inc. Page 125 Outline

• What is inventory risk and why is it important?
• What are the liabilities of unmitigated inventory risk to financial firms?
• What are the 8 most terrifying model inventory questions that can be posed to a model risk manager during a bank exam?
• The single piece of model-embedded information that would enable firms to answer those 8 questions readily and accurately.
• Why are financial firms so far behind leading tech firms (Tesla, Uber, Amazon, eBay, Google) in their ability to track usage?
• The presentation will include a video demonstration of a prototype employing simulated model usage data.
• The major pros and cons of the proposed solution to inventory management.
• Discussion of some potential objections.

© 2018 Darling Consulting Group, Inc. Page 126 SOME THOUGHTS ON My Definition of Inventory Risk (adapted from SR11-7): MITIGATING INVENTORY Inventory risk is the risk resulting RISK BY IMPROVING MODEL from incomplete or inaccurate quantitative model inventories, the USAGE TRANSPARENCY use of models that have previously been retired or remain unvalidated or the use of models that have not been entered into inventory.

© 2018 Darling Consulting Group, Inc. Page 127 Let’s Be Honest And Confront Our Fears The Eight Most Terrifying Questions a Model Risk Manager Could Be Asked During a Regulatory Bank Exam1

1) What is the exact number of different models that have been used over the last year?
2) How often has each model been executed, by day, by month, by year? Can you identify the most and least frequently executed models?
3) Where are the firm’s models being used? By business unit, legal entity, geographic region?
4) Can you provide a complete list of the models used by each of the above entities over the last year?
5) Are there any models in your inventory with an active status that were not executed during the last year?
6) Are there any models that were executed on any of your firm’s computers that do not appear in inventory? Please provide a full listing.
7) Are you able to provide a full list of the IDs of models that exhibit significant seasonality? If so, what are the peaks and troughs of seasonal model usage?
8) Were there any instances of a retired model still being executed during the last year?
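Most of these questions reduce to simple aggregations once every execution is logged centrally. A toy sketch with an assumed (model_id, month, business_unit) log schema; all data and names are illustrative:

```python
# Sketch: with a central execution log, several of the questions above become
# set and counter operations (toy data; log schema is an assumption).
from collections import Counter

usage_log = [                       # (model_id, month, business_unit)
    ("MOD-001", "2018-01", "Treasury"),
    ("MOD-001", "2018-02", "Treasury"),
    ("MOD-007", "2018-02", "Retail"),
]
inventory_active = {"MOD-001", "MOD-007", "MOD-013"}

executed = {m for m, _, _ in usage_log}          # Q1: distinct models used
runs = Counter(m for m, _, _ in usage_log)       # Q2: execution frequency
never_run = inventory_active - executed          # Q5: active but unused
unregistered = executed - inventory_active       # Q6: used but not inventoried
print(len(executed), runs.most_common(1), never_run, unregistered)
```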

1 There are likely other types of questions regarding model inventory that are difficult to answer accurately. These eight are the most obvious questions I can think of. Perhaps readers can think of some others. © 2018 Darling Consulting Group, Inc. Page 128 Why Is it So Difficult For Firms to Give Accurate Quantitative Answers to These Types of Usage Questions?

© 2018 Darling Consulting Group, Inc. Page 129 This section will attempt to identify a single underlying reason why few if any firms can THE HEART OF THE answer the 8 questions MATTER presented on slide #4 with a high degree of confidence. This is the true source of most model usage opacity and inventory risk

© 2018 Darling Consulting Group, Inc. Page 130 The Simple Answer Is This: Models Do Not ‘know’ Their Own IDs!

Software implementations that are classified as models are assigned unique IDs as a convenient shorthand identifier. At most firms these IDs typically appear in 3 places: in the model documentation, in the validation documents, and in the inventory database as a lookup index. Where they do not appear is in the actual model source code. It is in this sense that models do not ‘know’ who they are.

The root cause of model usage opacity may be traced to this single surprising blind spot 1 in most firms’ model risk management framework. Adding this one piece of information to a model can create a path to mitigating or even eliminating inventory risk.

Model inventory databases typically house all of the relevant documentation for every model that is assigned an ID, such as development and validation documents, and in some rare cases, even source code. Yet the model IDs themselves are not embedded in the models’ source code as a standard practice. In the next section a proposed innovation will be described. If added to every model in a firm’s inventory, it can go a long way towards improving a firm’s overall model discipline and the transparency of model usage, and towards addressing the vexing questions posed in slide #4.

1 At first blush this model ‘blind spot’ may not seem to be a true root cause. This presentation will endeavor to convince any doubters that this is indeed the case.

© 2018 Darling Consulting Group, Inc. Page 131 Compare This Singular Blind Spot In Our Models to Some Other Familiar Technologies:

• My smart phone ‘knows’ its unique serial number (it’s embedded in the permanent onboard memory that stays with the phone for life).
• My washing machine knows its own serial number too, and so does my automobile. These are embedded in the onboard electronics that control these devices.
• Even before electronics, serial numbers were stamped on the frames of every automobile that Henry Ford produced, and somewhere on almost all manufactured products of any significance.
• The Uber ride service tracks the current geographic location of every one of its active vehicles and advises clients on both the location and estimated time of arrival of their ride.
• Today, Tesla has the ability to track every one of its vehicles in service at a given time: its location, travel speed, level of charge, and other useful indicatives.

© 2018 Darling Consulting Group, Inc. Page 132 We Might Call This Particular Form Of Opacity “Inventory Risk”

Inability to answer the previous questions regarding model usage is indicative of a form of model risk that is rarely identified or analyzed in its totality, because it belongs to a class of less familiar risks that reside outside of and between models. These are risks that arise not from any model but from within a Firm’s Model Ecosystem.1

What are the types of liabilities that may arise from unmitigated model inventory ecosystem risk? Here are a few:
• Regulatory risk arising from incomplete or inaccurate model inventories (i.e., CCAR bank exams)
• Financial and regulatory risks arising from the use of unvalidated or retired models
• Difficulty in identifying models still in inventory but no longer in use
• Inability to enforce model risk management practices uniformly across all models, asset classes, regions and legal entities
• Manual inventory attestation processes are error-prone and invariably result in errors of omission
• Incomplete understanding of upstream and downstream model dependencies
• Lack of transparency into firmwide model usage, regionality, seasonality, etc.

1 A tip of the hat to Martin Goldberg for his seminal 2017 paper entitled “Much of Model Risk Does Not Come From Any Model”, The Journal of Structured Finance, Spring 2017, pp. 32-37. Although not described in this paper, inventory risk is clearly from the class of less well-recognized model risks that are external to models. Martin is currently working at Bloomberg on credit risk models.

© 2018 Darling Consulting Group, Inc. Page 133 Model Inventory Is Still Primarily a 20th Century Manual Process! Why Is That?

o One of the more daunting challenges facing model risk managers at major financial firms is the task of ascertaining that the model inventory, however it is implemented and maintained, is complete and accurate. 1

o At almost every firm this is accomplished through a manual process called attestation: model managers or functional heads for every asset class and business unit are asked to sign off on the complete set of models that fall within their domain of ownership and responsibility according to inventory records.

o Such a manual process can be both clumsy and error-prone – some models may simply be overlooked in the process, some may be ‘orphans’ (models mis-assigned due to staff turnover or re-allocation of responsibilities and therefore without owners), while some orphans may no longer be in use.

1 None of the firms I have worked at (Salomon Smith-Barney, Citigroup, Morgan Stanley, Credit Suisse) or with as a consultant have any accurate quantitative way of ascertaining the accuracy and completeness of their model inventories other than to query model owners/developers or their downstream users and receive qualitative estimates. It is also an uncomfortable fact that model supervisors/owners/developers do not always know who all of their downstream users are.

© 2018 Darling Consulting Group, Inc. Page 134 Model Inventory Is Still Primarily a 20th Century Manual Process! Why Is That?

o Resolving any discrepancies can require numerous iterations of the attestation process to determine the current correct ownership of orphan models.

o Particularly problematic are upstream and downstream dependencies between models. MRM relies on model owners to identify all upstream models, but often the model owners will not have complete knowledge of all of the downstream models that receive their models’ output as input.

In an age of automation, machine learning and big data we really should ask ourselves if we cannot find better ways to make firmwide model usage more transparent and in doing so help to automate the model attestation process. Inventory risk is one of the few risk types that, unlike model risk, can potentially be completely eliminated.

1 Note: the role of model risk manager is relatively new and complements the role of model validator in mitigating model risk. 2 If they cannot, there are larger problems in model development management.

© 2018 Darling Consulting Group, Inc. Page 135 TWO STEPS TO MITIGATING OR ELIMINATING MODEL INVENTORY RISK

Creating models that ‘know’ their own IDs by embedding them in their source code is a simple yet necessary first step. But this alone is not sufficient. The benefits will follow from what we do with that embedded information.

© 2018 Darling Consulting Group, Inc. Page 136 First Step: Embed Model IDs Inside the Models

Embedding each model ID in the model’s source code as an active object is a simple yet necessary first step that will offer immediate benefits if the model IDs and (optionally) version numbers are attached to the model outputs – this could be used to identify version incompatibility issues for example. It may be useful to think of the embedded IDs as tokens that can be passed from upstream to downstream models (models that receive input from other models). A downstream model’s collection of tokens would uniquely identify all upstream contributors based on execution sequence rather than on attestation by model owners.
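The token-passing idea can be sketched in a few lines of Python (illustrative only; ModelResult, run_model, and the model IDs are invented names, not part of any deck prototype):

```python
# Hypothetical sketch: a model "knows" its own ID and stamps its output
# with that ID plus the tokens of every upstream model that fed it.

from dataclasses import dataclass, field


@dataclass
class ModelResult:
    value: float
    id_tokens: frozenset = field(default_factory=frozenset)  # lineage tokens


def run_model(model_id: str, inputs: list) -> ModelResult:
    """Compute a toy output and merge upstream ID tokens with our own."""
    upstream = frozenset().union(*(r.id_tokens for r in inputs)) if inputs else frozenset()
    value = sum(r.value for r in inputs) + 1.0  # stand-in for real model logic
    return ModelResult(value=value, id_tokens=upstream | {model_id})


# A small chain: MDL-001 and MDL-002 feed MDL-003.
a = run_model("MDL-001", [])
b = run_model("MDL-002", [])
c = run_model("MDL-003", [a, b])
print(sorted(c.id_tokens))  # every upstream contributor is identified
```

The downstream model’s token set is built from the execution sequence itself, which is exactly the property the slide argues for: lineage by construction rather than by attestation.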

This first step will not provide answers to all of the questions posed in slide #4, but it will create the basis for a scalable methodology that can enable firms to collect the data necessary to answer such inventory questions for all models used within a firm, including EUC (End User Computing) and spreadsheet models.

© 2018 Darling Consulting Group, Inc. Page 137 Second Step: Create a Model Transponder Function

Embedding Model IDs as tokens into source code is a necessary first step, but the second and final step will require more investment.

• Assuming models have their IDs embedded in their source code, the next obvious question is: what could we do with that information? The answer is: with a little thought, quite a lot.

• The second step will require the creation of a Model Transponder Function, inspired by the radio transponders that air traffic controllers rely on to track civilian and commercial aircraft.

• What sorts of data would we want a model transponder to ‘broadcast’ to a centralized model usage database via the firm’s intranet? Here are a few important indicative data fields to start with:

1) Model ID
2) Name of the model (as a text string) – model names may not be unique, so they cannot serve as an index
3) Timestamp at execution – date, hour, minute granularity
4) Type of model – pricing, risk, credit, forecasting, finance, HR, etc.
5) Implementation – production code (C++, JAVA, etc.), or EUC model
6) A MAC address 1 – uniquely identifies the processor executing the model
7) Vector of upstream model ID tokens – this information would be invaluable if the model ID is also embedded as a token in any results produced by the model. If deployed comprehensively across the firm, passing ID tokens between models could capture all upstream and downstream dependencies based on execution process rather than on attestation by model owners/developers
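The indicative fields might be collected in a record like the following Python sketch (field names are assumptions based on the list above; the MAC address is read from the operating system, as the footnote describes):

```python
# A minimal sketch of the indicative data record a transponder might
# broadcast. All names here are illustrative, not a real API.

import uuid
from dataclasses import dataclass, asdict
from datetime import datetime


@dataclass
class TransponderRecord:
    model_id: str            # 1) unique inventory ID
    model_name: str          # 2) names may collide, so not usable as an index
    timestamp: str           # 3) date/hour/minute at execution
    model_type: str          # 4) pricing, risk, credit, forecasting, ...
    implementation: str      # 5) production code or EUC model
    mac_address: str         # 6) identifies the executing processor
    upstream_ids: tuple      # 7) ID tokens received from upstream models


def make_record(model_id, model_name, model_type, implementation, upstream_ids=()):
    mac = format(uuid.getnode(), "012x")  # 48-bit MAC from the OS, 12 hex digits
    ts = datetime.now().strftime("%Y-%m-%d %H:%M")
    return TransponderRecord(model_id, model_name, ts, model_type,
                             implementation, mac, tuple(upstream_ids))


rec = make_record("1500", "BondPricer", "pricing", "production", ("1499",))
print(asdict(rec))
```

Note that `uuid.getnode()` may fall back to a random 48-bit number on hosts where the hardware address cannot be read, so a production implementation would want a more robust lookup.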

1 The Media Access Control, or MAC, address is the hardware equivalent of an IP address. It is a unique identifier embedded in every computer’s network interface card and can be used to identify not only the actual computer executing the software but, through a lookup function, its physical location. A computer’s unique MAC address can be obtained via a function call to the computer’s operating system.

© 2018 Darling Consulting Group, Inc. Page 138 How Would a Model Transponder Function Operate?

In order to use embedded Model IDs to track model usage globally, a firm’s developers would need to add a basic new functionality to each model in the form of a Transponder Function:

1) The Transponder Function would be called once each time the model code is executed.
2) The Transponder should have the ability to transmit indicative data about the model via the firm’s intranet to a central database. (These data fields are listed in the previous slide.)
3) Transmission permission must be strictly one-way, from model to database, in order to avoid opening a back door into the model.
4) As an option to #3, to avoid the risk of jamming the firm’s intranet, Transponder output could be written into local temporary file systems (or databases).
5) Since the usage data is not timely, a sweep of all temp files into a central database could be made on a regular basis during off-peak intranet hours (e.g., weekly at 3 AM on Sundays).
6) At the end of a year’s worth of data collection, a treasure trove of information about model usage could be available in the central database.1
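The local-spool-plus-sweep option (items 4 and 5) could look something like this Python sketch (the spool file name and event fields are invented; a production version would write to whatever pipe the firm’s IT staff choose):

```python
# Sketch of the temp-file option: the transponder appends one JSON line
# per execution to a local spool file; a scheduled off-peak sweep later
# loads the spool into the central database (stubbed here as a list).

import json
import os
import tempfile
from datetime import datetime

SPOOL = os.path.join(tempfile.gettempdir(), "model_usage_spool.jsonl")


def transponder(model_id: str, model_name: str) -> None:
    """Called once per model execution; strictly write-only (one-way)."""
    event = {
        "model_id": model_id,
        "model_name": model_name,
        "timestamp": datetime.now().isoformat(timespec="minutes"),
    }
    with open(SPOOL, "a", encoding="utf-8") as f:
        f.write(json.dumps(event) + "\n")


def sweep() -> list:
    """Off-peak job: read and clear the spool (stand-in for a DB load)."""
    if not os.path.exists(SPOOL):
        return []
    with open(SPOOL, encoding="utf-8") as f:
        events = [json.loads(line) for line in f]
    os.remove(SPOOL)
    return events


if os.path.exists(SPOOL):
    os.remove(SPOOL)          # start from a clean spool for the demo
transponder("1500", "BondPricer")
transponder("1501", "SwapPricer")
print(len(sweep()))           # -> 2
```

Because the transponder only ever appends locally, the one-way property of item 3 is preserved: nothing in the pipeline can reach back into the model.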

1 The resulting trove would constitute a voluminous audit trail of information about model usage, amenable to analysis using data mining and machine learning algorithms to find patterns of model usage not readily detectable by human inspection and analysis of the usage data.

© 2018 Darling Consulting Group, Inc. Page 139 Conceptually, Model Usage Tracking Is Really Rather Simple

The Most Important Goal is to Achieve Independence from Execution Platforms!

[Diagram: Dummy Model with embedded ID → Transponder Function → Intranet or temp file → Centralized Database, with model usage indicative data passed at each stage]

But the devil may be hiding in the details …

Note: It may not be necessary for the Transponder to send data to a centralized database via the firm’s intranet – this is really a placeholder for any type of communication pipe that a firm’s IT staff choose. For the purposes of this presentation it is not particularly important to specify how the communication is to be implemented, only that the final destination is a central database with a log of the model usage statistics from slide #12, indexed by model ID and collected over a significant length of time, e.g. at least one year.

© 2018 Darling Consulting Group, Inc. Page 140 As a First Step, Create a Simple Proof of Concept Simulation

A practical way to establish the value added by embedding IDs and installing a Transponder Function using ‘dummy’ models. A proof of concept could be demonstrated via a simulation that requires no modifications to production models and very little time or IT resources:

1) Create a set of one hundred ‘dummy’ or synthetic models that contain only an embedded test ID and a prototype Model Transponder Function.1
2) Develop a script that will execute the dummy models with randomly assigned frequencies – some very frequently, some infrequently, and others at random intermediate frequencies.
3) Use the script to simulate seasonality and regionality for subsets of the models.
4) Simulate several years’ worth of model usage.
5) Mine the resulting database information to create various types of analyses (frequency histograms, seasonality charts, distribution by regions, usage spikes, dead periods, etc.) and to identify patterns of usage.
6) Use the simulation to identify flaws in the Transponder Function, communication pipelines, and the centralized database. This can help to identify problems and refine the method before production.
7) Present results to management to make the case for authorizing formal production.

1 Note that the source code for the transponder does not have to be included in the model’s source code; in fact, it probably should not be. Rather, the Transponder Function code should be maintained separately from any model and compiled into a Dynamically Linked Library (DLL) that can be joined with the compiled model code during the build process. This will allow the Transponder Function to be modified without modifying the model code.

© 2018 Darling Consulting Group, Inc. Page 141 A Transponder Prototype Written in the R Language Might Look Something Like This:1

A Call to Execute the Transponder Function

postLog("1500", "BondPricer", "CashFlow", "R")

(This is the single line to be embedded in model source code.)

R source code for the transponder function used in the simulation:

    # Assumes 'url' (the central collection endpoint) and 'location'
    # are defined globally before the function is called.
    library(httr)

    postLog <- function(modelid, modelname, modeltype, language) {
      now <- gsub(" ", "_", Sys.time(), fixed = TRUE)
      p <- POST(paste0(url,
        'id=',          runif(1) * 10000000,        # random event id
        '&modelid=',    modelid,
        '&modelname=',  modelname,
        '&modeltype=',  modeltype,
        '&language=',   language,
        '&date=',       as.Date(substr(now, 1, 10)),
        '&time=',       substr(now, 12, 19),
        '&user=',       as.character(Sys.info()['user']),
        '&location=',   location,
        '&sysname=',    as.character(Sys.info()['sysname']),
        # Windows-specific: parse the IPv4 address out of ipconfig output
        '&ip_address=', gsub(".*? ([[:digit:]])", "\\1",
                             system("ipconfig", intern = TRUE)[
                               grep("IPv4", system("ipconfig", intern = TRUE))])))
    }

1 R code for this prototype Transponder Function was developed by David Leonard at FI Consulting, Arlington, VA.

© 2018 Darling Consulting Group, Inc. Page 142 What Would Transponder Simulation Results Look Like?

The graphs displayed in the following slide were produced by collecting 4 data fields for each of the 100,000 model execution events on 100 ‘dummy’, or synthetic, models with embedded model IDs and prototype Transponder Functions (slide #19) over a simulation horizon of 3 1/2 years. The four data fields required to produce the plots on slide #21 are: Model ID, Model Name, Time Stamp, MAC Address.
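A condensed Python sketch of the kind of simulation described in the proof-of-concept steps (the deck’s actual prototype is in R; all model counts, weights, and horizons here are invented):

```python
# Synthetic models with embedded IDs, random execution frequencies, and
# a simple mining pass over the collected usage log.

import random
from collections import Counter

random.seed(42)

# One hundred dummy models, each just an ID and a usage weight.
models = {f"DUMMY-{i:03d}": random.uniform(0.1, 10.0) for i in range(100)}

# Simulate ~3 years of weekly execution events; heavier weights fire
# more often (a crude stand-in for the frequency/seasonality scripts).
usage_log = []
weeks = 3 * 52
ids, weights = list(models), list(models.values())
for week in range(weeks):
    for model_id in random.choices(ids, weights=weights, k=100):
        usage_log.append({"model_id": model_id, "week": week})

# Mine the log: a usage-frequency histogram per model.
counts = Counter(e["model_id"] for e in usage_log)
top5 = counts.most_common(5)
never_run = [m for m in models if m not in counts]
print(f"{len(usage_log)} events; top model ran {top5[0][1]} times; "
      f"{len(never_run)} models never ran")
```

Even this toy version surfaces the analyses the slides describe: frequency histograms, dead models, and usage concentration, all without touching a production model.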

The graphical dashboard was implemented on an Amazon Web Services Cloud platform.

© 2018 Darling Consulting Group, Inc. Page 143 Simulation Dashboard for an Embedded Prototype Transponder 100 Synthetic Models and 100,000 Random Execution Events2

2 The prototype Transponder Function and dashboard display used for this simulator were developed in collaboration with the author by David Leonard at FI Consulting, Arlington, VA. The usage plots were produced by collecting only 4 data points for each simulated event: ID, model name, timestamp, and MAC address.

© 2018 Darling Consulting Group, Inc. Page 144 Summary of Pros and Cons

This presentation has described an innovative method for improving the transparency of model usage across an entire firm, but not without cost. Here are the pros and cons of this approach:

Pros:
• Inside the Model: The Model Transponder approach places the usage-tracking software inside each model rather than relying on an external execution platform to track and store usage statistics.1
• Scalable: This approach will scale up readily from a few dozen to many thousands of models. Execution platforms for production models at most firms have limited model scope and therefore will not scale up easily.
• Comprehensive Solution: Because it is platform independent, it is a global solution that will operate on any firm computer that has access to the firm’s intranet (or that can write results to a temporary file).
• Incremental: The proposed innovation can be implemented incrementally over time, beginning with limited sets of models such as those used for CCAR/DFAST stress testing or the set of pricing models in the high-risk tier. Changes could be included in the regular release cycles.
• Tracking Model Dependencies: Offers a direct token-based means of comprehensively identifying upstream and downstream dependencies based on execution processes rather than attestation by model developers.

Cons:
• Touches Every Model: Requires some minor additions to the source code of each model to be tracked.
• High bandwidth from heavily used models could bottleneck the firm’s intranet.
• Vendor models present a special challenge – it is doubtful vendors would agree to install Transponders in their models. But there may be workarounds through the in-house execution scripts or host programs that firms use to interface between the vendor code and their computers.
• Spreadsheet models could present challenges as well, but not insurmountable ones.

1 Most production models at banks are managed by host execution platforms, although most EUC models are not. It is possible for execution platforms to be designed or modified to track usage statistics, but large firms may have hundreds of different platforms and each would have to be customized to provide similar data. Any changes would have to be made to all such platforms.

© 2018 Darling Consulting Group, Inc. Page 145 Potential Objections to Implementing the Transponder Method

• Can’t model usage tracking be performed by the IT production environment or execution platform for important models?

• Why does the solution need to be inside the models? Can’t this function be performed by external monitoring?

• Pricing models may be called hundreds of times a day for full calculations. Other models may be called hundreds of times as sub-models to Monte Carlo simulators, etc. Won’t this obfuscate usage tracking and skew perceptions of importance?

• Can this methodology also be applied to non-model calculators or near-models that don’t yet have IDs?

• What about rogue models developed on trading desks, used for a few days, and then discarded?

• What if a firm migrates its models to a Cloud platform? Doesn’t that confound the MAC address?

• Would installation of a transponder function into a model affect the model’s performance and trigger a review by model validation?

© 2018 Darling Consulting Group, Inc. Page 146 Final Thoughts: Many Tech Industries Have Had this Functionality for Years!

There is nothing new about the concept of embedded one-way ‘transponder’ functions to track usage. Google, Amazon, eBay, Uber, and Tesla have had their equivalents in place for a decade or more.

One-way transponder functions have been used for years by online vendors like Amazon and eBay to track external clients’ behavior patterns. This information helps vendors target their online ads to potential customers, monitor consumer interests, and build profiles of each of millions or hundreds of millions of clients. This is why we see those popup ads mysteriously tailored to our individual purchasing patterns.

Somewhere at Tesla’s central headquarters there could be a very large screen displaying the location of every operational Tesla vehicle (they number in the hundreds of thousands), along with its current speed, direction, time since last charge, the driving patterns of the owner, and a host of tracking data that help Tesla understand usage, geographic concentrations, charging stations used, etc., so it can improve and expand its services and market share optimally. Tesla can do this because every vehicle has the equivalent of a two-way transponder embedded in its onboard computers.

The Uber ride-hailing service uses the GPS tracking functionality of smartphones to display the current locations of its drivers and clients and an expected time of arrival for pickups and drop-offs.

© 2018 Darling Consulting Group, Inc. Page 147 Final Thought

Many Tech Industries Have Had this Functionality for Years! Why Don’t We Have It In Finance?

Why is it that financial firms are so far behind and cannot manage to collect similar patterns of behavior for a few thousand internal clients (i.e. their quantitative models)?

Are we not as smart or forward-looking as the Techs?

© 2018 Darling Consulting Group, Inc. Page 148 QUESTIONS & ANSWERS

© 2018 Darling Consulting Group, Inc. Page 149 MODEL RISK MANAGEMENT | SEPTEMBER 27-28, 2018 | BOSTON Assessing Model Risk in the Aggregate CFP Masterclass

Day 1 | 4:30 pm

Ray Brastow, Sr. Financial Economist, Federal Reserve Bank of Richmond Liming Brotcke, Quantitative Manager, Federal Reserve Bank of Chicago

© 2018 Darling Consulting Group, Inc. • 260 Merrimac Street • Newburyport, MA 01950 • Tel: 978.463.0400 • DarlingConsulting.com Page 150 END OF DAY 1

QUESTIONS & ANSWERS

© 2018 Darling Consulting Group, Inc. Page 151 MODEL RISK MANAGEMENT | SEPTEMBER 27-28, 2018 | BOSTON

© 2018 Darling Consulting Group, Inc. Page 154 MODEL RISK MANAGEMENT | SEPTEMBER 27-28, 2018 | BOSTON The New Era of Data Management Facts vs. Hype

Day 2 | 9:00 am

Joe Montalbano, Quantitative Consultant, DCG

© 2018 Darling Consulting Group, Inc. • 260 Merrimac Street • Newburyport, MA 01950 • Tel: 978.463.0400 • DarlingConsulting.com Page 155 Agenda

† A Enhancing Data Collection Increased data volume, sophistication, and headaches over time

B Collection Challenges Ensuring we have the right data for the right job

C Facts vs. Hype: Segmentation Opportunities, requirements, and challenges of deeper modeling dives

D Facts vs. Hype: Supplemental Data Benefits and risks of adding more data to the solution

E Cautionary Tale: Economic Context Safety tips

© 2018 Darling Consulting Group, Inc. Page 156 Data Quality Over Time

1990s
u Quarterly aggregate data
u Maybe origination data for loans

2000s
u Quarterly aggregate data
u Annual account-level data

2008 and onward (“Game over, man!”)
u Quarterly or monthly account-level data
u New reporting systems for quick aggregation to portfolio level

[Timeline: 1990 – 2000 – 2010 – 2018]

© 2018 Darling Consulting Group, Inc. Page 157 Acquiring older data (of good quality) can prove a challenge.

Constraints that Lead to the Wrong Data: Limited Historical Data, Uncaptured Changes

Processes Bringing Data from Outside:
u System Consolidations
u Initial Data Entry
u Batch Feeds
u Real-time Data Interfaces
(all feeding the Enterprise Data Warehouse)

Processes Causing Data Decay:
u Data Cleaning & Purging
u System Upgrades
u Loss of Expertise
u Process Automation

© 2018 Darling Consulting Group, Inc. Page 158 Agenda

A Enhancing Data Collection Increased data volume, sophistication, and headaches over time

† B Collection Challenges Ensuring we have the right data for the right job

C Facts vs. Hype: Segmentation Opportunities, requirements, and challenges of deeper modeling dives

D Facts vs. Hype: Supplemental Data Benefits and risks of adding more data to the solution

E Cautionary Tale: Economic Context Safety tips

© 2018 Darling Consulting Group, Inc. Page 159 Example: Identifying the Correct Data to Analyze

Record: 13-3 Record: 3-13

© 2018 Darling Consulting Group, Inc. Page 160 Example: Identifying the Correct Data to Analyze

© 2018 Darling Consulting Group, Inc. Page 161 Survivorship Bias

u Abraham Wald’s WWII memo about losing bombers to enemy fire

u Survivorship bias arises because defaulted loans and failed or acquired portfolios are often ignored

© 2018 Darling Consulting Group, Inc. Page 162 Survivorship Bias for Borrowers

u A high data completion rate may be deceptive and misrepresent actual data quality

• 99% overall data completion among all borrowers
• 55% data completion among defaulted borrowers

u Missing data is often concentrated in a particular segment of the portfolio

[Charts: “Sample Data Field with 99% Completion” – healthy borrowers 99.8% complete, defaulted borrowers 55% complete; legend: Complete Data / Missing Data]
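A toy calculation (with an invented portfolio mix) showing how an aggregate completion rate can mask the segment that matters most:

```python
# A 99%+ overall completion rate can hide a 55% completion rate among
# defaulted borrowers; the counts below are invented for illustration.

healthy_total, healthy_complete = 9900, 9890    # ~99.9% complete
defaulted_total, defaulted_complete = 100, 55   # 55% complete

overall = (healthy_complete + defaulted_complete) / (healthy_total + defaulted_total)
defaulted = defaulted_complete / defaulted_total

print(f"Overall completion:   {overall:.1%}")    # looks fine in aggregate
print(f"Defaulted completion: {defaulted:.1%}")  # the segment that drives losses
```

Because defaulted borrowers are a small slice of the portfolio, their missing data barely moves the aggregate statistic; segment-level completion checks are what catch the problem.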

© 2018 Darling Consulting Group, Inc. Page 163 Agenda

A Enhancing Data Collection Increased data volume, sophistication, and headaches over time

B Collection Challenges Ensuring we have the right data for the right job † C Facts vs. Hype: Segmentation Opportunities, requirements, and challenges of deeper modeling dives

D Facts vs. Hype: Supplemental Data Benefits and risks of adding more data to the solution

E Cautionary Tale: Economic Context Safety tips

© 2018 Darling Consulting Group, Inc. Page 164 Data Segmentation

u Improved data collection has allowed for more model segmentation by: Ø Sub-portfolios Ø Vintages Ø Risk ratings Ø Product type Ø Age or seasoning effects

u Higher sample sizes, more fields, and more categories per field enable more granular analysis

© 2018 Darling Consulting Group, Inc. Page 165 Example 1: Default Rate Curves

u Improved loan tracking has allowed for more targeted default rate analysis

u We can observe seasoning effects in default rates

[Chart (idealized example): point-in-time default rate vs. loan age]

© 2018 Darling Consulting Group, Inc. Page 166 Example 1: Default Rate Curves

u Actual observations align with established research

u Although benchmarks should not be considered “correct,” they do provide useful grounding

[Chart: point-in-time default rates vs. loan age, observed and benchmark]

© 2018 Darling Consulting Group, Inc. Page 167 Example 1: Default Rate Curves

u Further segmentation can allow more targeted analysis, but only if sufficient sample sizes are present

u Some segments may be hindered by data collection, or new trends may be discovered

[Charts: point-in-time default rate vs. loan age for Segments 1, 2, and 3]

© 2018 Darling Consulting Group, Inc. Page 168 Example 1: Default Rate Curves

u Analysis can also benefit from our knowledge of prior research

u We may not yet have full data history for a new loan product

u Benchmarks and established trends can temporarily fill gaps

[Chart: point-in-time default rate vs. loan age]

© 2018 Darling Consulting Group, Inc. Page 169 Example 2: Migration Matrices

1970–2017 Average Migration Rates

From\To   Aaa      Aa       A        Baa      Ba       B        Caa      Ca-C     WR       Default
Aaa       87.71%   7.94%    0.58%    0.07%    0.02%    0.00%    0.00%    0.00%    3.67%    0.00%
Aa        0.82%    85.15%   8.51%    0.42%    0.06%    0.04%    0.02%    0.00%    4.95%    0.02%
A         0.05%    2.46%    86.78%   5.37%    0.48%    0.11%    0.04%    0.01%    4.64%    0.05%
Baa       0.03%    0.14%    4.12%    85.72%   3.79%    0.69%    0.15%    0.02%    5.17%    0.17%
Ba        0.01%    0.04%    0.42%    6.12%    76.32%   7.17%    0.71%    0.11%    8.22%    0.88%
B         0.01%    0.03%    0.14%    0.45%    4.78%    73.49%   6.62%    0.52%    10.70%   3.27%
Caa       0.00%    0.01%    0.02%    0.08%    0.34%    6.51%    67.87%   2.85%    14.35%   7.96%
Ca-C      0.00%    0.00%    0.05%    0.00%    0.56%    2.29%    8.94%    39.39%   22.12%   26.66%

2017 Migration Rates

From\To   Aaa      Aa       A        Baa      Ba       B        Caa      Ca-C     WR       Default
Aaa       100.00%  0.00%    0.00%    0.00%    0.00%    0.00%    0.00%    0.00%    0.00%    0.00%
Aa        0.00%    75.08%   20.20%   0.00%    0.00%    0.00%    0.00%    0.00%    4.71%    0.00%
A         0.00%    1.35%    90.08%   3.89%    0.24%    0.00%    0.08%    0.00%    4.37%    0.00%
Baa       0.00%    0.06%    2.88%    90.13%   1.29%    0.29%    0.00%    0.00%    5.35%    0.00%
Ba        0.00%    0.00%    0.00%    6.74%    82.07%   3.97%    0.12%    0.00%    6.86%    0.24%
B         0.00%    0.00%    0.00%    0.10%    7.24%    72.15%   5.45%    0.00%    14.77%   0.30%
Caa       0.00%    0.08%    0.00%    0.00%    0.08%    4.99%    70.38%   3.07%    17.65%   3.76%
Ca-C      0.00%    0.00%    0.00%    0.00%    0.00%    0.81%    19.51%   35.77%   16.26%   27.64%

u Portfolios with many accounts can be used in migration matrix analysis
u Data requirement: account-level tracking
u Segmentation can be as deep as sample size permits
u Example: Moody’s corporate ratings

Source: Moody’s Annual Corporate Default Studies

© 2018 Darling Consulting Group, Inc. Page 170 Example 2: Migration Matrices

u Quarterly matrices give more detailed views
u Aggregation to lifetime roll rates through matrix multiplication
u Macroeconomic forecasts can be layered into forecast migration rates

[Diagram: sequence of quarterly matrices, 2017Q3 through 2019Q1]
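The matrix-multiplication step can be sketched as follows (Python; the 3-state quarterly matrix below is invented for illustration, while real matrices would use the rating or delinquency states shown on these slides):

```python
# Chain quarterly migration matrices into a multi-quarter roll-rate
# matrix via matrix multiplication.

def matmul(a, b):
    """Plain-Python matrix product of two square row-major matrices."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# One quarter of migrations; rows sum to 1, Default is absorbing.
quarterly = [
    [0.95, 0.04, 0.01],   # Healthy    -> Healthy, Delinquent, Default
    [0.30, 0.60, 0.10],   # Delinquent -> Healthy, Delinquent, Default
    [0.00, 0.00, 1.00],   # Default    -> absorbing
]

# Compound four quarters into a one-year migration matrix.
annual = quarterly
for _ in range(3):
    annual = matmul(annual, quarterly)

print([round(p, 4) for p in annual[0]])  # one-year outcomes for a Healthy loan
```

Rows of the compounded matrix still sum to 1, and the one-year default probability for a Healthy loan is necessarily larger than the one-quarter probability; a forecast overlay would substitute a different quarterly matrix for each projected quarter before multiplying.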

© 2018 Darling Consulting Group, Inc. Page 171 Example 2: Migration Matrices

Migration matrices can be as granular as desired with sufficient sample sizes… …or they can include fewer categories for more statistical certainty. Analyses can be even more targeted with statistical models.

[Left: the 2017 Moody’s rating migration matrix from the previous slide, shown for comparison. Right: a delinquency-state example, 2017Q3–2018Q2:]

From\To   Healthy   30 DPD    60 DPD    90 DPD    Default
Healthy   92.03%    7.94%     0.00%     0.00%     0.02%
30 DPD    34.79%    0.00%     65.15%    0.00%     0.06%
60 DPD    14.35%    0.00%     0.00%     85.17%    0.48%
90 DPD    3.79%     0.00%     0.00%     0.00%     96.21%
Default   0.00%     0.00%     0.00%     0.00%     100.00%

© 2018 Darling Consulting Group, Inc. Page 172 Agenda

A Enhancing Data Collection Increased data volume, sophistication, and headaches over time

B Collection Challenges Ensuring we have the right data for the right job

C Facts vs. Hype: Segmentation Opportunities, requirements, and challenges of deeper modeling dives

† D Facts vs. Hype: Supplemental Data Benefits and risks of adding more data to the solution

E Cautionary Tale: Economic Context Safety tips

© 2018 Darling Consulting Group, Inc. Page 173 Example: Peers and Panels

u Insufficient data can be supplemented with peer or proxy data u Check applicability of data u Data can be aggregated or left separate and analyzed as panel data

© 2018 Darling Consulting Group, Inc. Page 174 Example: Peers and Panels

u A panel regression allows separate intercepts for each entity:

    y_it = a_i + Σ_{k=1..K} b_k · x_it,k + e_it

    (unique intercepts a_i; shared coefficients b_k)

[Chart: loss-rate series for several entities, 2001–2016, each with its own intercept a_1 … a_N]
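A rough sketch of a fixed-effects (entity-intercept) panel fit in plain Python; the two-bank dataset is invented, and the within-transformation used here is one standard way to estimate a shared slope with unique intercepts:

```python
# Fixed-effects sketch: estimate the shared slope after demeaning y and
# x within each entity; each entity then gets its own intercept
# a_i = mean(y_i) - b * mean(x_i).

from statistics import mean

# Two entities with the same true slope (0.5) but different intercepts.
panel = {
    "Bank A": ([1, 2, 3, 4], [2.5, 3.0, 3.5, 4.0]),   # intercept 2.0
    "Bank B": ([1, 2, 3, 4], [5.5, 6.0, 6.5, 7.0]),   # intercept 5.0
}

# Within-transformation: pool demeaned observations across entities.
num = den = 0.0
for x, y in panel.values():
    mx, my = mean(x), mean(y)
    num += sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den += sum((xi - mx) ** 2 for xi in x)
slope = num / den                                   # shared coefficient

intercepts = {name: mean(y) - slope * mean(x) for name, (x, y) in panel.items()}
print(round(slope, 6), {k: round(v, 6) for k, v in intercepts.items()})
```

In practice a library estimator (e.g., entity dummies in an OLS) gives the same point estimates plus standard errors; the sketch just makes the "unique intercepts, shared coefficients" structure explicit.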

© 2018 Darling Consulting Group, Inc. Page 175 Example: Peers and Panels

u Some series are more suitable for panel analysis than others

[Charts: two loss rate (bps) series shown both as time series (2001–2016) and as scatterplots against the unemployment rate (4%–10%)]

© 2018 Darling Consulting Group, Inc. Page 176 Example: Peers and Panels

u Supplemental data also requires more due diligence u Example: Growth rate calculations Ø Auto loan portfolio Ø Model to forecast aggregate loan balances Ø Panel dataset of peer banks Ø Panel regression fits historical balance growth rates u Starting point: How best to calculate growth?

© 2018 Darling Consulting Group, Inc. Page 177 Example: Peers and Panels

u Agg Growth Rate_i,t = (Bal_i,t − Bal_i,t−1) / Bal_i,t−1   (average of the individual banks’ growth rates)

u Net Agg Growth Rate = (Bal_i,T / Bal_i,0)^(1/n) − 1   (net quarterly growth over the full window)

u Agg Growth Rate_i,t = (Bal_i,t − Bal_i,t−1) / [(Bal_i,t + Bal_i,t−1) / 2]   (change over the average of the two balances)

[Chart: quarterly growth for Banks 1–10 under each definition; legend: Avg. of Growth Rates, Net Growth % (quarterly), Avg. of Denominator, Avg. Growth]
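A toy illustration of how much the choice of growth-rate definition matters (an invented two-bank quarter; these are common alternatives rather than necessarily the slide’s exact formulas):

```python
# Three ways to summarize one quarter of aggregate growth. A tiny
# starting balance (Bank 2) blows up the "average of growth rates"
# version; the numbers are invented.

prev = [1000.0, 1.0]    # balances at t-1 for Banks 1 and 2
curr = [1050.0, 30.0]   # balances at t

# 1) Average of individual growth rates: sensitive to small denominators.
avg_of_rates = sum((c - p) / p for c, p in zip(curr, prev)) / len(prev)

# 2) Growth of the aggregate balance (net growth).
net_growth = (sum(curr) - sum(prev)) / sum(prev)

# 3) Average-of-denominator variant: change over the mid-point balance.
mid_point = sum((c - p) / ((c + p) / 2) for c, p in zip(curr, prev)) / len(prev)

print(f"avg of rates: {avg_of_rates:.1%}, net: {net_growth:.1%}, mid: {mid_point:.1%}")
```

The first definition reports growth in the thousands of percent while the aggregate portfolio grew under ten percent, which is exactly the kind of distortion the Peer Bank #8 example on the next slides exhibits.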

© 2018 Darling Consulting Group, Inc. Page 178 Example: Peers and Panels

[Chart: Peer Bank #8 quarter-over-quarter portfolio growth, 2001–2016; the y-axis runs from −1,000% to 9,000%]

© 2018 Darling Consulting Group, Inc. Page 179 Example: Peers and Panels

[Chart: Peer Bank #8 portfolio balance, 2001–2016, in thousands of dollars, ranging from 0 to 30,000]

© 2018 Darling Consulting Group, Inc. Page 180 Example: Peers and Panels

u There can be structural breaks in a bank’s portfolio due to acquisitions

[Charts: farm portfolio balance ($000), 2001Q1–2015Q3 – raw bank data series vs. the same series controlled for acquisitions]

© 2018 Darling Consulting Group, Inc. Page 181 Agenda

A Enhancing Data Collection Increased data volume, sophistication, and headaches over time

B Collection Challenges Ensuring we have the right data for the right job

C Facts vs. Hype: Segmentation Opportunities, requirements, and challenges of deeper modeling dives

D Facts vs. Hype: Supplemental Data Benefits and risks of adding more data to the solution

† E Cautionary Tale: Economic Context Safety tips

© 2018 Darling Consulting Group, Inc. Page 182 Case Study: C&I NCO Rate Modeling

[Charts: C&I NCO rate, 1992Q1–2020Q1 – historical, fitted, baseline, adverse, and severely adverse paths; model drivers: Unemployment, CPI Inflation, BBB Corp. Yield, VIX]

Page 183 Case Study: C&I NCO Rate Modeling

[Chart: C&I NCO rate, 1992Q1 to 2020Q1; historical, fitted, and scenario projections; drivers: Unemployment, CPI Inflation, BBB Corp. Yield, VIX]

[Chart: same series refit with alternative drivers: Unemployment, 3M Treasury, Dow Jones Growth, HPI Growth]

Page 184 Case Study: C&I NCO Rate Modeling

[Chart: C&I NCO rate, 1992Q1 to 2020Q1; historical, fitted, and scenario projections; drivers: Unemployment, CPI Inflation, BBB Corp. Yield, VIX]

[Chart: same series refit with alternative drivers: Unemployment, CREPI Growth]
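The case study above swaps different macro driver sets into the same NCO regression. A minimal sketch of why in-sample fit statistics alone cannot arbitrate between candidate drivers, using entirely synthetic data (the driver names, coefficients, and seed below are illustrative, not the case-study model):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100  # quarters of hypothetical history

# Synthetic macro drivers (stand-ins for unemployment, a credit spread, etc.)
unemp = rng.normal(5.5, 1.5, n)
bbb = 0.6 * unemp + rng.normal(0, 0.8, n)   # proxy correlated with unemployment
noise_driver = rng.normal(0, 1, n)           # series unrelated to losses

# "True" NCO rate driven by unemployment only
nco = 0.001 + 0.0008 * unemp + rng.normal(0, 0.0005, n)

def adj_r2(X, y):
    """Fit OLS via least squares and return adjusted R-squared."""
    X1 = np.column_stack([np.ones(len(y)), X])
    beta, *_ = np.linalg.lstsq(X1, y, rcond=None)
    resid = y - X1 @ beta
    r2 = 1 - (resid @ resid) / (((y - y.mean()) ** 2).sum())
    k = X1.shape[1] - 1
    return 1 - (1 - r2) * (len(y) - 1) / (len(y) - k - 1)

# A correlated proxy can score respectably in-sample even though it is not
# the true driver, so fit statistics cannot settle the specification alone.
print(adj_r2(unemp, nco), adj_r2(bbb, nco), adj_r2(noise_driver, nco))
```

This is the point of showing the same portfolio under several driver sets: specifications with similar historical fit can imply very different scenario projections, so economic reasonableness has to break the tie.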

Page 185 Trust your instincts

Page 186 Statistical measures do not always tell the whole truth

Anscombe’s Quartet

Key Statistics

Mean of x = 9

Variance of x = 11

Mean of y = 7.50

Variance of y = 4.12

Correlation = 81.6%

Regression formula: y = 3 + 0.5x (x is the horizontal axis, y is the vertical axis)
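The key statistics above can be reproduced from Anscombe's published data. This sketch uses dataset I of the quartet; all four datasets share essentially these summary statistics while looking completely different when plotted:

```python
import numpy as np

# Anscombe's quartet, dataset I (Anscombe, 1973)
x = np.array([10, 8, 13, 9, 11, 14, 6, 4, 12, 7, 5], dtype=float)
y = np.array([8.04, 6.95, 7.58, 8.81, 8.33, 9.96,
              7.24, 4.26, 10.84, 4.82, 5.68])

print(x.mean())                            # mean of x = 9
print(x.var(ddof=1))                       # variance of x = 11
print(round(y.mean(), 2))                  # mean of y = 7.50
print(round(y.var(ddof=1), 2))             # variance of y (slide quotes 4.12)
print(round(np.corrcoef(x, y)[0, 1], 3))   # correlation ≈ 0.816
slope, intercept = np.polyfit(x, y, 1)
print(round(intercept, 2), round(slope, 2))  # regression line y ≈ 3 + 0.5x
```

Identical summaries, radically different scatterplots: the validation lesson is to plot the data, not just read the statistics.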

Page 187 Statistics Can Tell Whatever Story You Want

[Chart: Human Population (50,000 to 80,000) vs. Observed Storks (100 to 260); fitted regression y = 35488.8 + 150.7x, correlation 0.941, adjusted R² 0.863]

• Sometimes data can support a desired narrative, even one with far-fetched claims
• In this case, the analysis ignores lurking variables

Prevalence of storks and population of Oldenburg, Germany (1930-1936). Source: Ornithologische Monatsberichte, 44(2), 1936.
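The stork example is a classic lurking-variable story: both series simply trend upward over the period for unrelated reasons. A minimal synthetic sketch (the numbers below are invented, not the Oldenburg data) of how a shared trend manufactures correlation, and how controlling for it removes the effect:

```python
import numpy as np

rng = np.random.default_rng(1)
years = np.arange(50)

# Lurking variable: time. Both series drift upward for unrelated reasons.
storks = 100 + 3 * years + rng.normal(0, 10, 50)
population = 50_000 + 500 * years + rng.normal(0, 2_000, 50)

# The raw correlation looks impressive...
raw_corr = np.corrcoef(storks, population)[0, 1]

# ...but removing the common trend (controlling for the lurking variable)
# leaves essentially no relationship between the residuals.
storks_resid = storks - np.polyval(np.polyfit(years, storks, 1), years)
pop_resid = population - np.polyval(np.polyfit(years, population, 1), years)
partial_corr = np.corrcoef(storks_resid, pop_resid)[0, 1]

print(round(raw_corr, 2), round(partial_corr, 2))
```

The same mechanism shows up in model validation whenever two trending financial series are regressed on each other without asking what common driver sits behind both.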

Page 188 QUESTIONS & ANSWERS

Page 189 MODEL RISK MANAGEMENT | SEPTEMBER 27-28, 2018 | BOSTON Validation of Statistical Models Case Study

Day 2 | 10:15 am

Joe Montalbano, Quantitative Consultant, DCG

Page 190 Agenda

A Case Study 1: Statistical Soundness and Model Design Overview of common regression output, case study of time series regression

B Case Study 2: Sensitivity Analysis Credit union loss rate regression, case study of consumer default rates

C Common Validation Approaches and Tests In-sample and out-of-sample testing, ROC curves, benchmarking

Page 191 Statistical Soundness

• At the most fundamental level, validation should check that no red flags are thrown by basic statistical tests
• Linear regression example:
  - Independent variables and intercept
  - Coefficients
  - Standard errors
  - t-values and p-values
  - Level of statistical significance
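A minimal sketch of these red-flag checks for a simple linear regression, using `scipy.stats.linregress` on hypothetical data (the variable names and coefficients below are illustrative, not a real model):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)

# Hypothetical data: a loss rate driven by unemployment plus noise.
unemployment = rng.uniform(3, 10, 60)
loss_rate = 0.2 + 0.05 * unemployment + rng.normal(0, 0.05, 60)

res = stats.linregress(unemployment, loss_rate)

# The basic items a validator would scan in the regression output:
print(f"intercept: {res.intercept:.3f}")    # estimated coefficients
print(f"slope:     {res.slope:.3f}")
print(f"std error: {res.stderr:.4f}")       # standard error of the slope
print(f"p-value:   {res.pvalue:.2e}")       # significance of the slope
print(f"significant at 5%? {res.pvalue < 0.05}")
```

Red flags here would include a coefficient sign that contradicts economic intuition (e.g., losses falling as unemployment rises), a standard error large relative to the coefficient, or a p-value above the chosen significance level.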

Page 192