
Session 046PD: Post Model Conversion Challenges vs Emerging Technology Solutions

10/15/2018 1:45-3:00 p.m.

SOA Antitrust Compliance Guidelines
SOA Presentation Disclaimer

Session 46 PD Post model conversion challenges vs. emerging technology solutions

Moderator: Henry Chen, FSA, FCIA, MAAA

Presenters: Jeffrey Mu, FSA, FCIA; Vincent Xuan, FSA, MAAA, CFA; Marshall Lin, FSA, MAAA, CFA

October 15, 2018

SOCIETY OF ACTUARIES Antitrust Compliance Guidelines

Active participation in the Society of Actuaries is an important aspect of membership. While the positive contributions of professional societies and associations are well-recognized and encouraged, association activities are vulnerable to close antitrust scrutiny. By their very nature, associations bring together industry competitors and other market participants.

The antitrust laws aim to protect consumers by preserving the free economy and prohibiting anti-competitive business practices; they promote competition. There are both state and federal antitrust laws, although state antitrust laws closely follow federal law. The Sherman Act is the primary U.S. antitrust law pertaining to association activities. The Sherman Act prohibits every contract, combination or conspiracy that places an unreasonable restraint on trade. There are, however, some activities that are illegal under all circumstances, such as price fixing, market allocation and collusive bidding.

There is no safe harbor under the antitrust law for professional association activities. Therefore, association meeting participants should refrain from discussing any activity that could potentially be construed as having an anti-competitive effect. Discussions relating to product or service pricing, market allocations, membership restrictions, product standardization or other conditions on trade could arguably be perceived as a restraint on trade and may expose the SOA and its members to antitrust enforcement procedures.

While participating in all SOA in-person meetings, webinars, teleconferences or side discussions, you should avoid discussing competitively sensitive information with competitors and follow these guidelines:
• Do not discuss prices for services or products or anything else that might affect prices.
• Do not discuss what you or other entities plan to do in particular geographic or product markets or with particular customers.
• Do not speak on behalf of the SOA or any of its committees unless specifically authorized to do so.
• Do leave a meeting where any anticompetitive pricing or market allocation discussion occurs.
• Do alert SOA staff and/or legal counsel to any concerning discussions.
• Do consult with legal counsel before raising any matter or making a statement that may involve competitively sensitive information.

Adherence to these guidelines involves not only avoidance of antitrust violations, but also avoidance of behavior that might be so construed. These guidelines only provide an overview of prohibited activities. SOA legal counsel reviews meeting agendas and materials as deemed appropriate, and any discussion that departs from the formal agenda should be scrutinized carefully. Antitrust compliance is everyone's responsibility; however, please seek legal counsel if you have any questions or concerns.

Presentation Disclaimer

Presentations are intended for educational purposes only and do not replace independent professional judgment. Statements of fact and opinions expressed are those of the participants individually and, unless expressly stated to the contrary, are not the opinion or position of the Society of Actuaries, Oliver Wyman, Ernst & Young LLP, Prudential Financial, their cosponsors or their committees. The Society of Actuaries, Oliver Wyman, Ernst & Young LLP and Prudential Financial do not endorse or approve, and assume no responsibility for, the content, accuracy or completeness of the information presented. Attendees should note that the sessions are audio-recorded and may be published in various media, including print, audio and video formats, without further notice.

Agenda

I. Introductions
II. Topics for discussion:
   I. Why transform – and what can go wrong?
   II. Day 2 improvements – model run time
   III. Day 2 improvements – automation

Presenter Biographies

Jeffrey Mu, FSA, FCIA, Oliver Wyman
Jeffrey Mu is a consultant with the Actuarial Practice of Oliver Wyman and is based in Hartford. His primary responsibilities are to provide actuarial consulting services to various entities and organizations. He specializes in variable annuities and fixed indexed annuities. His industry experience includes model building, model validation and model transformation.

Vincent Xuan, FSA, MAAA, CFA, Vice President & Actuary, Prudential Financial
Vincent Xuan is currently a VP & Actuary in the Actuarial Modeling Center of Excellence group of Prudential Financial, leading model development for all annuity products. Prior to that, he was a VP & Actuary in the Enterprise Risk Management Group of Prudential Financial.

Marshall Lin, FSA, MAAA, CFA, Senior Manager, Ernst & Young
Marshall Lin is a Senior Manager in the advisory services practice of EY. He is based in the firm's office and specializes in automation and transformation. Prior to joining Ernst & Young in 2013, he held progressive corporate roles with two large insurance companies in various US and European offices for eight years.

Why transform – and what can go wrong?

Why did we transform our models?

Historical issues influenced the new model requirements:

Change in technology
• Legacy systems retired
• New and improved modeling platforms available

Model governance issues
• Controls
• Misuse of data
• Risk insufficiently modeled
• Non-modeled business or features
• Material simplifications
• Model validation difficulty
• Documentation

Resources and efficiency issues
• Labor intensive production
• Little time for analysis, documentation and model improvement
• Efficiency vs. precision
• Inefficient use of cores
• Production – Actuarial function or IT function?

Integration issues
• Too many platforms/models
• Manual processes and adjustments
• Ad-hoc runs
• Missing features
• Reporting changes

Increasing internal and external demands
• New, increasingly complex products
• New regulations
• Fulfilling additional disclosure requirements
• Evidence of risk management and strong governance
• Reduced time to close
• Developing richer "real time" information metrics

Data issues
• Inaccurate or incomplete data
• Incompatible format
• Data management process is manual and unstructured
• Excessive data massaging required
• Lack of support from IT and other areas

Where did we want our models to land?

Transformations and conversions begin with well-defined objectives:

Desired liability model scope
• Base product features
• Riders and guarantees
• Asset features and reinsurance
• Economic scenarios
• Investment, reinvestment and hedging
• Management actions
• Assumptions

Accounting/Regulatory bases
• Statutory/Tax
• State
• GAAP
• IFRS
• Capital
• Cash Flow Testing

Development
• Automation vs. human intervention
• Setup time and effort
• Computing requirement and constraints
• Run time
• Inputs and outputs
• Corporate changes

Model controls
• Access control to the model and results
• Requirements for users, testers, administrators etc.

Model output
• Income statement and balance sheet projections
• Granularity of output
• Integration with financial ledger and reporting process

What challenges were faced along the way?

Factors influencing successful transformations:
• Scope changes
• Shifting regulatory requirements
• Data constraints
• Disconnect between IT and finance
• Design changes
• Updated timelines
• Competing demands on staff

Competing priorities, constraints and changes to the external environment do not have to lead to transformation/conversion failures; however, they do create opportunities for day 2 improvements.

End-to-end process

The end-to-end process flows through four stages, surrounded by a governance process:
1. Raw source data – multiple source data feeds
2. Input warehousing – policy/liability data, product specifications, liability assumptions, asset data and economic assumptions
3. Calculation engine – valuation, forecasting and ALM
4. Output – model output, reporting/analytics and downstream tools

Model components:
1. Management of data into warehouses
2. Loading input data into calculation engine
3. Calculation engine
4. Outputs, post-valuation processes (PVPs) and analytics
5. Governance process (surrounds all)
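To make the component breakdown concrete, here is a minimal Python sketch of such a pipeline skeleton. The stages mirror the five components above; every name in it is a hypothetical placeholder rather than part of any actual vendor platform.

```python
# Illustrative skeleton of the end-to-end modeling pipeline described above.
# All names (ModelRun, warehouse_source_data, etc.) are hypothetical placeholders.

from dataclasses import dataclass, field


@dataclass
class ModelRun:
    """Carries data and an audit trail through the five model components."""
    warehouse: dict = field(default_factory=dict)   # 1. data warehoused from source feeds
    inputs: dict = field(default_factory=dict)      # 2. inputs staged for the calculation engine
    results: dict = field(default_factory=dict)     # 3./4. engine output and post-valuation results
    audit_log: list = field(default_factory=list)   # 5. governance trail surrounding all steps


def warehouse_source_data(run: ModelRun, feeds: dict) -> None:
    # Component 1: manage raw source feeds into the input warehouse.
    run.warehouse.update(feeds)
    run.audit_log.append(f"warehoused {len(feeds)} source feeds")


def load_inputs(run: ModelRun) -> None:
    # Component 2: stage policy, assumption and scenario data for the calculation engine.
    run.inputs = dict(run.warehouse)
    run.audit_log.append("loaded inputs into calculation engine")


def run_calculation_engine(run: ModelRun) -> None:
    # Component 3: valuation / forecasting / ALM calculations (placeholder logic only).
    run.results["total_reserve"] = sum(p["reserve"] for p in run.inputs.get("policies", []))
    run.audit_log.append("calculation engine completed")


def produce_outputs(run: ModelRun) -> dict:
    # Component 4: outputs, post-valuation processes (PVPs) and analytics.
    run.audit_log.append("outputs and analytics produced")
    return run.results


if __name__ == "__main__":
    run = ModelRun()
    warehouse_source_data(run, {"policies": [{"reserve": 100.0}, {"reserve": 250.0}]})
    load_inputs(run)
    run_calculation_engine(run)
    print(produce_outputs(run))
    print(run.audit_log)
```

The point of the sketch is simply that each component has a clear hand-off and that the governance trail (component 5) wraps every step, which is what makes downstream automation and audit possible.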

Two areas of focus today

1. Model run time
2. Automation

Day 2 improvement – model run time

Why does Model Run Time Matter?

• Financial Reporting Needs
  • Need to report financial results within X business days after quarter close or year end.
  • Finance needs the necessary analytics to explain financial results in a timely manner.

• Internal Management Analytics
  • Periodic capital, surplus and profitability analytics require quick turnaround to ensure timely management decisions.
  • Pricing analytics is time sensitive, especially for deal-type transactions.
  • Other ad-hoc analyses require quick responses to management.

• Model Development
  • Iterative model development and testing demands faster run time.
  • Model run time also impacts the development and maintenance of the input/output infrastructure.

• Computational Expense
  • Intensive model runs drive up the scale of expensive grid farms.
  • Cloud computing is an alternative for oscillating demand but is still costly.
  • Modeling and technology staffing needs are positively associated with run time.

Why is your model slow? And how to spool it up?

Size and complexity of the business
• Large size of in-force business. In-force population compression instead of seriatim.
• Various lines of business. One model versus separate models.
• Generations of products within each line of business. Product mapping by grouping similar products.
• Complex and exotic product features. Potential product feature simplification.

Multiple uses of the same model
• Multiple use cases, including external reporting, internal forecasting, capital management, capital market hedging, etc. One model versus separate models.
• Multiple external reporting bases, e.g. US Statutory, US GAAP, IFRS and Tax. Calculate once when possible, e.g. combining US Stat and Tax for certain products.
• Significant number of economic scenarios and assumption sensitivity tests. Scenario compression and reducing the number of sensitivity tests when possible.

Model structure inefficiencies
• The same logic is calculated multiple times due to model limitations. Calculate commonly shared components once (a minimal sketch follows this list).
• Suboptimal programming logic. Peer review or seek vendor experts' guidance on code efficiency.

Infrastructure, process and control
• Many manual intervention steps; input, output and reporting automation not in place. Seek ways to automate and productionalize the process, such as using a run inventory to batch model runs.
• Excessive control and approval steps. Balance control with agility.
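As a loose illustration of the "calculate commonly shared components once" point, the Python sketch below caches a scenario-level discount-factor calculation so that projections on two reporting bases sharing the same scenario compute it only once. All names and figures are hypothetical placeholders, not any platform's actual logic.

```python
# Illustrative only: caching a commonly shared calculation so it runs once
# per scenario rather than once per reporting basis. Names and figures are hypothetical.

from functools import lru_cache


@lru_cache(maxsize=None)
def discount_factors(scenario_id: int, horizon: int) -> tuple:
    """Expensive shared component: path of discount factors for one scenario."""
    rate = 0.02 + 0.001 * scenario_id          # placeholder rate path
    return tuple((1.0 + rate) ** -t for t in range(1, horizon + 1))


def project_reserve(cash_flows: list, scenario_id: int, basis: str) -> float:
    # Both the "Stat" and "Tax" projections reuse the cached discount factors.
    dfs = discount_factors(scenario_id, len(cash_flows))
    margin = 1.05 if basis == "Stat" else 1.00  # placeholder basis-specific margin
    return margin * sum(cf * df for cf, df in zip(cash_flows, dfs))


if __name__ == "__main__":
    cfs = [100.0] * 30
    for basis in ("Stat", "Tax"):
        print(basis, round(project_reserve(cfs, scenario_id=1, basis=basis), 2))
    print(discount_factors.cache_info())  # one miss, one hit: shared work done once
```

Inside a vendor platform the same idea typically takes the form of a shared module or a pre-calculated input table rather than an in-memory cache, but the run-time saving comes from the same place.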

In-force Compression Versus Seriatim

Seriatim
• Description: Run each policy.
• Run time: Longest.
• Implementation: Easiest.
• Ongoing maintenance: None.
• Results accuracy: Exact.
• Best fit situation: Small block; run time not an issue; regulation requires seriatim.

In-force compression – random selection
• Description: Run every K-th policy, either directly in the population file (pseudo-random) or by sorting assigned random numbers (true random), and scale the selected policies up.
• Run time: Short; time saving depends on depth of compression, and modelers have more control.
• Implementation: Easy; mostly calculation engine configuration.
• Ongoing maintenance: Periodically monitor results and consider revising the depth of compression upon results deviation.
• Results accuracy: Monitoring required; may not be accurate for smaller blocks.
• Best fit situation: Large block of business; run time is a major issue; tight timeline does not allow additional research time for a clustering method; expensive to change infrastructure; use compression for analytics and model development, even if not for reporting.

In-force compression – clustering algorithm
• Description: Define clusters based on a set of criteria (e.g. product type, gender, age group, in-the-moneyness), choose a subset of policies that best represent each cluster, and scale them up.
• Run time: Short; time saving depends on the chosen algorithm, and modelers have less control.
• Implementation: Fair to difficult; may require infrastructure change.
• Ongoing maintenance: Periodically monitor results and consider revamping the algorithm upon results deviation; the algorithm requires significant research effort.
• Results accuracy: Monitoring required.
• Best fit situation: Mid-size business not large enough to utilize the law of large numbers for compression; run time is an issue but the timeline still allows time for clustering research and testing; use compression for analytics and model development, even if not for reporting.
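As a rough illustration of the random-selection approach described above, the Python sketch below assigns random numbers, keeps every K-th policy and scales each selected policy so that the compressed in-force preserves the total face amount. The field names and the proportional scaling rule are simplified assumptions for the sketch, not a prescribed method.

```python
# Simplified sketch of "true random" in-force compression: assign random numbers,
# sort, keep every K-th policy, and scale so total face amount is preserved.
# Field names and the proportional scaling rule are illustrative assumptions.

import random


def compress_inforce(policies: list, k: int, seed: int = 2018) -> list:
    """Return a compressed in-force of roughly len(policies) / k model points."""
    rng = random.Random(seed)
    shuffled = sorted(policies, key=lambda _: rng.random())  # true-random ordering
    selected = shuffled[::k]                                  # every K-th policy

    # Scale the selected policies so the compressed block keeps the original total.
    total_face = sum(p["face_amount"] for p in policies)
    selected_face = sum(p["face_amount"] for p in selected)
    scale = total_face / selected_face if selected_face else 0.0

    compressed = []
    for p in selected:
        q = dict(p)
        q["face_amount"] = p["face_amount"] * scale
        q["scale_factor"] = scale
        compressed.append(q)
    return compressed


if __name__ == "__main__":
    inforce = [{"policy_id": i, "face_amount": 1000.0 + 10 * i} for i in range(10_000)]
    model_points = compress_inforce(inforce, k=20)
    print(len(model_points), round(sum(p["face_amount"] for p in model_points), 2))
```

In practice a compression routine would typically preserve several exposure measures (account value, reserves, guarantee bases) rather than a single face-amount total, and the compressed results would be validated against seriatim output, consistent with the monitoring points above.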

One Model Versus Separate Models

One model
• Description: One consolidated model for multiple purposes and/or multiple lines of business.
• Run speed: Depends on the extent of shared components across purposes and business lines. The more similarity, the more efficient one model can be; e.g., one model serving multiple purposes such as valuation and forecasting could use the same valuation module.
• Implementation: More challenging, as modelers need to reconcile model structure needs among multiple purposes.
• Modeling expertise: Demands high modeling expertise across purposes or business lines; such talent is hard to acquire.
• Model reconciliation: Minimal to none, as the model structure is shared.
• Consolidated reporting: Easier, as the one model has a consolidated input and output infrastructure that allows consistent reporting capabilities.
• Best fit situation: Multiple purposes share the same calculation logic; talent identified with broad expertise in multiple areas; vendor software or programming language flexible enough to accommodate multiple needs; strong demand for reconciliation, such as pricing against valuation; high need for results consolidation.

Separate models
• Description: One model for each purpose and/or line of business.
• Run speed: Depends on the extent of shared components across purposes and business lines. The more divergence, the more efficient separate models can be; e.g., separate models for two wildly different products such as annuities and LTC can tailor model structures to their own run speed needs.
• Implementation: Easier, as each modeler only needs to consider his or her own needs.
• Modeling expertise: Easier, as one only needs to be the expert for his or her own modeling area.
• Model reconciliation: Heavy need for reconciliation, as implementations could be done in very different ways, or even in different vendor systems or programming languages, which makes attribution difficult.
• Consolidated reporting: Challenging to reconcile different reporting formats out of separate models into the same template.
• Best fit situation: Vastly different product features or methodologies; very few shared components across different uses; talent spread across a decentralized organization with very specific knowledge in each area; existing modeling platform does not have enough flexibility to do multiple things; results reported separately.

Day 2 improvement – automation

Why automate now?

• Businesses moving faster
• Focus on value-added activities
• Manage costs
• Improve controls

Automation and transformation/modernization

• Because business needs have become increasingly complex and highly model-, data- and technology-dependent, most life insurers are in some form of an actuarial transformation or modernization journey
• Automation is an important objective, supported by many of the components of the overall solution, and a specific component of the end-to-end solution
• Automation techniques can also be applied tactically, without requiring a larger, more complex transformation program

Components of the actuarial transformation include actuarial system(s), reporting, analytics, data, automation, computing, sourcing, and talent & organizational structure.

Automation tools

Various classes of tools are becoming available which make it easier to automate. The automation toolkit includes:
• Data visualization & reporting tools
• Robotics
• Data analytics tools
• Enterprise tools as part of actuarial modeling software
• BPM tools

Automation opportunities are abundant in modeling processes

Upstream processes
• Current period data
• Historical data
• Experience studies
• Early indications

Model production
• Input preparations
• Model runs
• Output processing
• Review of results

Model validation (an illustrative scripted check follows below)
• Larger test samples
• More frequent validations
• Additional tests
• Better selections

Downstream processes
• Reporting
• Analytics
• Documentation
• Archiving
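One concrete way to automate the "larger test samples / more frequent validations" idea is a scripted regression check that compares current model output against a prior run and flags movements beyond a tolerance. The sketch below is illustrative only; the file layout, field names and the 5% tolerance are assumptions, not a prescribed validation standard.

```python
# Illustrative automated validation step: compare current vs. prior model output
# and flag any metric that moved more than a tolerance. Layout and tolerance are
# assumptions for this sketch.

import csv


def load_results(path: str) -> dict:
    """Read a CSV of (metric, value) pairs into a dict of floats."""
    with open(path, newline="") as f:
        return {row["metric"]: float(row["value"]) for row in csv.DictReader(f)}


def regression_check(prior_path: str, current_path: str, tolerance: float = 0.05) -> list:
    """Return a list of exceptions where |current/prior - 1| exceeds the tolerance."""
    prior, current = load_results(prior_path), load_results(current_path)
    exceptions = []
    for metric, prior_value in prior.items():
        current_value = current.get(metric)
        if current_value is None:
            exceptions.append((metric, "missing in current run"))
        elif prior_value != 0 and abs(current_value / prior_value - 1) > tolerance:
            exceptions.append((metric, f"moved {current_value / prior_value - 1:+.1%}"))
    return exceptions


if __name__ == "__main__":
    for metric, note in regression_check("prior_output.csv", "current_output.csv"):
        print(f"REVIEW: {metric} {note}")
```

Because the check is scripted, it can be run on every model change and on much larger samples than a manual review would allow, with only the flagged exceptions going to an actuary.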

Automation of production processes
Example 1: Robotics

Current manual work steps:
• Model input: Actuary runs SAS programs to format data extracts into model inputs and produce analytics; actuary documents work performed.
• Actuarial models: Actuary loads model inputs, deploys model runs and monitors model runs; actuary documents work performed.
• Model output: Actuary combines model output, post-model adjustments and non-modeled results; actuary documents work performed.
• Final results: Actuary sends out email notifications and loads results into reporting systems (ledgers, financial data marts).

RPA work steps:
• Model input: Robot runs SAS programs and sends email notifications when input data and analytics are ready for review.
• Actuarial models: After receiving the actuary's sign-off, robot loads model inputs, deploys model runs and sends notifications if there are issues.
• Model output: Robot processes model output and sends email notifications when results are ready for review.
• Final results: After receiving the actuary's sign-off, robot loads results into reporting systems and sends email notifications.

Each work step the robot performs is exactly as programmed and is automatically documented in system logs that can be provided as an audit trail.

Data flow: policy admin systems, assumptions and market data feed the actuarial models; model output is combined with adjustments and non-modeled results to produce the financial results. An illustrative sketch of this hand-off pattern follows.
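To illustrate the shape of the "wait for sign-off, then load, run and notify" hand-off described above (and not any specific RPA product's API), the Python sketch below strings the stages together; every function is a hypothetical placeholder for a bot action.

```python
# Illustrative orchestration of the RPA hand-off pattern described above.
# Every function is a hypothetical placeholder, not a real RPA product API.

import time


def sign_off_received(stage: str) -> bool:
    # Placeholder: in practice the robot would poll a workflow tool or shared drive.
    print(f"checking for actuary sign-off on {stage}...")
    return True


def notify(message: str) -> None:
    # Placeholder for sending an email notification.
    print(f"EMAIL: {message}")


def run_stage(stage: str, action) -> None:
    """Wait for sign-off on the stage, perform the bot action, then notify."""
    while not sign_off_received(stage):
        time.sleep(60)  # re-check periodically until sign-off arrives
    action()
    notify(f"{stage} complete; results ready for review")


if __name__ == "__main__":
    run_stage("model input", lambda: print("robot formats data extracts into model inputs"))
    run_stage("actuarial models", lambda: print("robot loads inputs and deploys model runs"))
    run_stage("model output", lambda: print("robot processes model output"))
    run_stage("final results", lambda: print("robot loads results into reporting systems"))
```

The key design point is that the actuary's sign-off gates each stage, so the robot only removes the repetitive mechanics while the review and approval steps stay with the actuary.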

Automation of production processes
Example 2: Next gen data tools

Automation of production processes
Example 3: A combination of tools

• Data preparations
• Data quality review
• Assumptions review
• Proxy modeling
• Results analysis
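As one small example of what the "data preparations" and "data quality review" steps can look like when scripted, the sketch below runs a few basic checks on a policy extract with pandas. The file name, column names and rules are assumptions for illustration only.

```python
# Illustrative scripted data quality review for a policy extract.
# File name, column names and rules are assumptions for this sketch.

import pandas as pd


def data_quality_review(path: str) -> list:
    """Return a list of human-readable findings for the extract at `path`."""
    df = pd.read_csv(path)
    findings = []

    # Completeness: flag columns with missing values.
    for column, n_missing in df.isna().sum().items():
        if n_missing > 0:
            findings.append(f"{column}: {n_missing} missing values")

    # Uniqueness: policy numbers should not repeat.
    dupes = df.duplicated(subset=["policy_number"]).sum()
    if dupes:
        findings.append(f"{dupes} duplicate policy numbers")

    # Reasonableness: simple range checks on key fields.
    if (df["issue_age"] < 0).any() or (df["issue_age"] > 120).any():
        findings.append("issue_age outside 0-120")
    if (df["account_value"] < 0).any():
        findings.append("negative account values")

    return findings


if __name__ == "__main__":
    for finding in data_quality_review("policy_extract.csv"):
        print("REVIEW:", finding)
```

A check like this would typically be combined with the other tools mentioned above, e.g. a visualization layer to trend the findings over time and a workflow tool to route exceptions for review.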


Automation of Modeling Processes

Marshall Lin, FSA, MAAA, CFA
October 15, 2018

Disclaimer

• The views expressed by the presenters are not necessarily those of Ernst & Young LLP or other members of the global EY organization.

• These slides are for educational purposes only and are not intended to be relied upon as accounting, tax, or other professional advice. Please refer to your advisors for specific advice.
