Cross-Media Audience Measurement Standards (Phase I Video)

Total pages: 16

File type: PDF, Size: 1,020 KB

MRC Cross-Media Audience Measurement Standards (Phase I Video)
September 2019 (Final)

Sponsoring associations: Media Rating Council (MRC), American Association of Advertising Agencies (4A's), Association of National Advertisers (ANA), Interactive Advertising Bureau (IAB), Video Advertising Bureau (VAB)

Table of Contents

1 Executive Summary
  1.1 Overview and Scope
  1.2 Standards Development Method
  1.3 Note on Privacy
2 General Top-Line Measurement
  2.1 Cross-Media Components
    2.1.1 Duration Weighting
    2.1.2 Cross-Media Metrics Definitions
    2.1.3 Household vs. Individual Metrics
    2.1.4 Segregation of Content/Advertising Vehicles and Media
    2.1.5 Audio Considerations
  2.2 Impression Counting
    2.2.1 Viewable Definition for Video Ads in Cross-Media
  2.3 Content Measurement
    2.3.1 Viewability for Content
    2.3.2 Content Duration Weighting
    2.3.3 Use of Content Metrics
  2.4 Duration
  2.5 Audience Assignment
    2.5.1 Qualifying for Audience
3 Cross-Media Universe Estimates – Basis for Projection
  3.1 Universe Estimates
  3.2 Coverage
    3.2.1 Device Identification
    3.2.2 IP-Enabled Television or OTT Devices
    3.2.3 Accounting for Duplication Across Media
4 Cross-Media Measurement Standards – Technical Details
  4.1 Tracking of Advertising and Content Access – Technical Details
    4.1.1 Client-Initiated (and viewable)
    4.1.2 Audience vs. Ad Measurement
    4.1.3 Script-based Tracking Method/Assets
    4.1.4 Encoding or Watermarking, Fingerprinting and Meter-based Tracking Method/Assets
    4.1.5 STB, RPD and Smart TV data
    4.1.6 Video Usage
    4.1.7 Measurement in Applications
    4.1.8 Repurposed TV Content
    4.1.9 Comparative Presentation
  4.2 Duration
    4.2.1 Inactivity
    4.2.2 Duration Editing
  4.3 Tracking of Users (Sources and Attribution) – Technical Details
    4.3.1 Adjustment of Uniques
    4.3.2 Identifying Users Across Devices
    4.3.3 Data Enrichment Source Selection
      4.3.3.1 Data Enrichment Quality Checking and Monitoring
    4.3.4 Registration
5 Data Preparation and Quality Checking
  5.1 Data Collection
    5.1.1 Validation Procedures
    5.1.2 Quality control and oversight of MVPD STB/RPD data
      5.1.2.1 Oversight and QC over audience data vendor processes (STB validation)
      5.1.2.2 Channel lineup considerations
      5.1.2.3 Program lineup considerations
    5.1.3 Server to Server or API integrations
  5.2 Data Editing
    5.2.1 Empirical Support
    5.2.2 Documentation and Consistent Application
  5.3 Quality Control Over Other Data Sources
  5.4 Data Aggregation Controls
    5.4.1 Quality control integrity checks throughout multiple data transfer points
    5.4.2 Tests of Significance for Missing Data
6 Computation of Audience Estimates
  6.1 Weighting, data adjustment and modeling

© Copyright Media Rating Council, Inc. All rights reserved.
Recommended publications
  • Hybrid Online Audience Measurement Calculations Are Only As Good As the Input Data Quality
    People First: A User-Centric Hybrid Online Audience Measurement Model, by John Brauer, Manager, Product Leadership, Nielsen.
    Overview: As with any media measurement methodology, hybrid online audience measurement calculations are only as good as the input data quality. Hybrid online audience measurement goes beyond a single metric to forecast online audiences more accurately by drawing on multiple data sources. The open question for online publishers and advertisers is, "Which hybrid audience measurement model provides the most accurate measure?"
    • User-centric online hybrid models rely exclusively on high-quality panel data for audience measurement calculations, then project activity to all sites informed by consumer behavior patterns on tagged sites.
    • Site-centric online hybrid models depend on cookies for audience calculations and are subject to the inherent weaknesses of cookie data, which cannot tie directly to unique individuals and their valuable demographic information.
    • Both hybrid models assume that panel members are representative of the universe being measured and comprise a base large enough to provide statistical reliability, measured in a way that accurately captures behavior.
    • The site-centric hybrid model suffers from inaccurately capturing unique users as a result of cookie deletion and related activities, which can produce audience estimates that differ from the actual number.
    • Rapid uptake of new online access devices further exacerbates site-centric cookie tracking issues (in some cases overstating unique browsers by a factor of five) by increasing the likelihood of one individual being represented by multiple cookies.
    Only the user-centric hybrid model enables a fair comparison of audiences across all web sites, avoids the pitfall of counting multiple cookies rather than unique users, and employs a framework that anticipates future changes such as additional data sets and access devices. (A minimal numerical sketch of this cookie-inflation adjustment appears below.)
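The cookie-inflation problem described in this excerpt can be made concrete with a small numeric sketch. The code below is illustrative only and is not Nielsen's hybrid model: the function names and the cookies-per-person ratio are assumptions chosen for the example, with the ratio standing in for a panel-derived calibration factor.

```python
# Minimal sketch (not Nielsen's actual model): contrast a raw site-centric
# unique-cookie count with a user-centric estimate that corrects for the
# fact that one person can generate several cookies across devices and
# after cookie deletion. All numbers and factor values are illustrative.

def site_centric_uniques(cookie_count: int) -> int:
    """Raw cookie count, taken at face value as 'unique users'."""
    return cookie_count

def user_centric_uniques(cookie_count: int, cookies_per_person: float) -> int:
    """Adjust the cookie count by an assumed cookies-per-person ratio,
    e.g. estimated from a calibration panel where people and their
    cookies can be observed together."""
    return round(cookie_count / cookies_per_person)

if __name__ == "__main__":
    observed_cookies = 5_000_000   # cookies seen by site tags in a month
    panel_ratio = 2.4              # hypothetical panel-derived cookies per person

    print("Site-centric uniques:", site_centric_uniques(observed_cookies))
    print("User-centric estimate:", user_centric_uniques(observed_cookies, panel_ratio))
```

With an assumed ratio of 2.4 cookies per person, 5 million observed cookies correspond to roughly 2.08 million people, which illustrates how a raw cookie count can substantially overstate the audience.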
  • EFAMRO / ESOMAR Position Statement on the Proposal for an ePrivacy Regulation (April 2017)
    EFAMRO/ESOMAR Position Statement on the Proposal for an ePrivacy Regulation, April 2017. Table of contents: 1. About EFAMRO and ESOMAR; 2. Key recommendations; 3. Overview; 4. Audience measurement research; 5. Telephone and online research; 6. GDPR framework for research purposes; 7. List of proposed amendments (a. Recitals; b. Articles).
    01. About EFAMRO and ESOMAR: This position statement is submitted on behalf of EFAMRO, the European Research Federation, and ESOMAR, the World Association for Data, Research and Insights. In Europe, we represent the market, opinion and social research and data analytics sectors, accounting for an annual turnover of €15.51 billion (ESOMAR Global Market Research 2016). In particular, our sector produces research outcomes that guide decisions of public authorities (e.g. the Eurobarometer), the non-profit sector including charities (e.g. political opinion polling), and business (e.g. satisfaction surveys, product improvement research). In a society increasingly driven by data, our profession ensures the application of appropriate methodologies, rigour and provenance controls, thus safeguarding access to quality, relevant, reliable, and aggregated data sets. These data sets lead to better decision making, inform targeted and cost-effective public policy, and support economic development, leading to growth and jobs.
    02. Key Recommendations: We support the proposal for an ePrivacy Regulation to replace the ePrivacy Directive, as this will help to create a level playing field in a true European Digital Single Market whilst increasing the legal certainty for organisations operating in different EU member states. Amendment of Article 8 and Recital 21 to enable research organisations that comply with Article 89 of the General Data Protection Regulation (GDPR) to continue conducting independent audience measurement research activities.
  • Pioneering Tool to Manage Media Industry's Digital Carbon Footprint (13 January 2020)
    [Image: The online tool will help the media industry manage its digital carbon footprint. Credit: Pixabay / University of Bristol]
    A collaboration between computer scientists at the University of Bristol and nine major media companies, including ITV and the BBC, will help the media industry understand and manage the significant carbon impacts of digital content. The 12-month collaboration, facilitated by sustainability experts Carnstone, will see University of Bristol researchers working with sustainability and technology teams at the BBC, Dentsu Aegis Network, Informa, ITV and Pearson, among others.
    Mapping the carbon footprint of digital services like advertising, publishing and broadcasting is difficult because the underlying technological systems are hugely complex and constantly shifting. Media content passes through content delivery networks, data centres, web infrastructure and user devices, to name just a few, with each element of the delivery chain having different owners. With climate change high on the agenda, DIMPACT will allow participating companies to understand their 'downstream' carbon impacts, right through to the end-user. This, in turn, will enable more informed decision-making to reduce the overall carbon footprint of digital services. (An illustrative sketch of summing per-segment impacts appears below.)
    "We know that more and more of our interactions happen online, and screens play an ever more important role in our lives. We can say with absolute certainty that the digital economy will continue to grow. What we don't know is how those modes of digital consumption translate into carbon impacts and where the 'hotspots' reside. DIMPACT will change that," said Christian Toennesen, Senior Partner at Carnstone and DIMPACT's initiator and product manager.
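To make the 'downstream' accounting idea in this excerpt concrete, here is a minimal sketch that sums emissions across delivery-chain segments. It is not DIMPACT's methodology; the segment list, energy intensities, and grid carbon intensity are placeholder assumptions used only for illustration.

```python
# Minimal illustration of the idea described above: a digital service's
# downstream footprint can be approximated by summing, over each segment
# of the delivery chain (data centres, CDN, network, end-user devices),
# the energy attributable to the service times the grid carbon intensity.
# This is NOT DIMPACT's methodology; every figure below is a placeholder.

GRID_INTENSITY_KG_PER_KWH = 0.23   # hypothetical average grid carbon intensity

# Hypothetical energy intensity of each delivery-chain segment, in kWh per
# hour of video streamed; each segment typically has a different owner.
SEGMENT_KWH_PER_HOUR = {
    "data_centre": 0.010,
    "cdn": 0.005,
    "fixed_network": 0.040,
    "end_user_device": 0.060,
}

def footprint_kg_co2e(hours_streamed: float) -> dict:
    """Per-segment and total emissions for a given volume of streaming."""
    per_segment = {
        segment: hours_streamed * kwh * GRID_INTENSITY_KG_PER_KWH
        for segment, kwh in SEGMENT_KWH_PER_HOUR.items()
    }
    per_segment["total"] = sum(per_segment.values())
    return per_segment

if __name__ == "__main__":
    # e.g. one million viewer-hours of streamed content in a month
    for segment, kg in footprint_kg_co2e(1_000_000).items():
        print(f"{segment}: {kg:,.0f} kg CO2e")
```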
  • Where the Viewability Standard Sits Today and Where the Industry Goes from Here
    SPECIAL SECTION: Where the viewability standard sits today and where the industry goes from here, by Alia Lamborghini.
    Viewability is 2015's buzzword, and for good reason. Last year, according to eMarketer, marketers spent more than $145 billion in online advertising and $42 billion in mobile, reflecting 122 percent year-over-year growth for mobile specifically. Consumers are spending more time on digital devices, and advertisers are allocating larger budgets to reach them. The massive shift to all things mobile drives a serious need for the medium to be more measurable. Increasingly, marketers and the agencies that represent them are calling for 100 percent viewability in ad buys, raising their expectations for their media partners and ad platforms alike. That pressure translates to an industry laser-focused on solving for 100 percent viewability in mobile, even though standards have not yet been set.
    So what is viewability? The Media Rating Council (MRC) established today's viewability standards and guidelines, determining that at least 50 percent of a display ad's pixels must be in view for at least one second. Mobile, however, is still uncharted territory, with no formal standard yet set; interim guidance largely mirrors the display standard. (A simplified sketch of this rule appears below.)
    What makes viewability important? Marketers expect that the right, real person sees an ad that appears within the appropriate content: no wasted impressions, no wasted ad budgets. According to a recent Millennial Media white paper on the topic of viewability, marketers would allocate even larger budgets if they could be guaranteed that their ads would actually be seen.
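The MRC display rule quoted above (at least 50 percent of an ad's pixels in view for at least one second) can be expressed as a simple check. This is a simplified geometric sketch, not any vendor's measurement implementation; real products rely on browser and SDK signals, and the rectangles and durations below are illustrative.

```python
# Simplified sketch of the MRC display viewability rule described above:
# an impression counts as viewable if at least 50% of the ad's pixels are
# in view for at least one continuous second. Thresholds and geometry here
# are only meant to make the rule concrete.

from dataclasses import dataclass

@dataclass
class Rect:
    left: float
    top: float
    width: float
    height: float

    @property
    def area(self) -> float:
        return self.width * self.height

def visible_fraction(ad: Rect, viewport: Rect) -> float:
    """Fraction of the ad's pixels that fall inside the viewport."""
    overlap_w = max(0.0, min(ad.left + ad.width, viewport.left + viewport.width) - max(ad.left, viewport.left))
    overlap_h = max(0.0, min(ad.top + ad.height, viewport.top + viewport.height) - max(ad.top, viewport.top))
    return (overlap_w * overlap_h) / ad.area if ad.area else 0.0

def is_viewable_display(ad: Rect, viewport: Rect, continuous_in_view_seconds: float,
                        pixel_threshold: float = 0.5, time_threshold: float = 1.0) -> bool:
    return (visible_fraction(ad, viewport) >= pixel_threshold
            and continuous_in_view_seconds >= time_threshold)

if __name__ == "__main__":
    viewport = Rect(0, 0, 1280, 800)
    partly_scrolled_ad = Rect(1100, 700, 300, 250)   # mostly below/right of the viewport
    fully_visible_ad = Rect(100, 100, 300, 250)

    print(visible_fraction(partly_scrolled_ad, viewport))          # 0.24 -> below the 50% threshold
    print(is_viewable_display(partly_scrolled_ad, viewport, 2.0))  # False
    print(is_viewable_display(fully_visible_ad, viewport, 1.5))    # True
```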
  • Rating the Audience: the Business of Media
    Balnaves, Mark, Tom O'Regan, and Ben Goldsmith. "Bibliography." Rating the Audience: The Business of Media. London: Bloomsbury Academic, 2011. 256–267. Bloomsbury Collections. Copyright © Mark Balnaves, Tom O'Regan and Ben Goldsmith 2011; shared under a Creative Commons non-commercial licence.
    Bibliography (excerpt):
    Adams, W.J. (1994), 'Changes in ratings patterns for prime time before, during, and after the introduction of the people meter', Journal of Media Economics, 7: 15–28.
    Advertising Research Foundation (2009), 'On the road to a new effectiveness model: Measuring emotional responses to television advertising', Advertising Research Foundation, http://www.thearf.org [accessed 5 July 2011].
    Andrejevic, M. (2002), 'The work of being watched: Interactive media and the exploitation of self-disclosure', Critical Studies in Media Communication, 19(2): 230–48.
    Andrejevic, M. (2003), 'Tracing space: Monitored mobility in the era of mass customization', Space and Culture, 6(2): 132–50.
    Andrejevic, M. (2005), 'The work of watching one another: Lateral surveillance, risk, and governance', Surveillance and Society, 2(4): 479–97.
    Andrejevic, M. (2006), 'The discipline of watching: Detection, risk, and lateral surveillance', Critical Studies in Media Communication, 23(5): 392–407.
    Andrejevic, M. (2007), iSpy: Surveillance and Power in the Interactive Era, Lawrence, KS: University Press of Kansas.
    Andrews, K. and Napoli, P.M. (2006), 'Changing market information regimes: a case study of the transition to the BookScan audience measurement system in the US book publishing industry', Journal of Media Economics, 19(1): 33–54.
  • Monthly Business & Tech-Enabled Services Sector Summary Report
    BUSINESS AND TECH-ENABLED SERVICES SECTOR REPORT, March 2018. Deal dashboard: $94.4 billion in M&A volume year-to-date across 788 transactions. [Charts and tables: quarterly M&A volume ($Bn) and deal count, Q1 2015 through Q1 2018; select M&A transactions with announced date, acquirer, target and enterprise value ($MM); last-12-months Business & Tech-Enabled Services share performance vs. the S&P 500. Source: Capital IQ and PitchBook.]
  • Big Data and Audience Measurement: a Marriage of Convenience? Lorie Dudoignon*, Fabienne Le Sager* Et Aurélie Vanheuverzwyn*
    Big Data and Audience Measurement: A Marriage of Convenience? Lorie Dudoignon, Fabienne Le Sager and Aurélie Vanheuverzwyn (Médiamétrie).
    Abstract – Digital convergence has gradually altered both the data and media worlds. The lines that separated media have become blurred, a phenomenon that is being amplified daily by the spread of new devices and new usages. At the same time, digital convergence has highlighted the power of Big Data, which is defined in terms of two connected parameters: volume and frequency of acquisition. Big Data can be voluminous to the point of being exhaustive, and its acquisition can be frequent enough to occur in real time. Big Data may be seen as risking a return to the paradigm of the census that prevailed until the end of the 19th century, whereas the 20th century belonged to sampling and surveys; Médiamétrie, however, has chosen to treat this digital revolution as a tremendous opportunity to advance its audience measurement systems.
    JEL classification: C18, C32, C33, C55, C80. Keywords: hybrid methods, Big Data, sample surveys, hidden Markov model.
    The opinions and analyses in this article are those of the authors and do not necessarily reflect their institution's or Insee's views. Received on 10 July 2017, accepted on 10 February 2019. Translated by the authors from their original version: "Big Data et mesure d'audience : un mariage de raison ?" To cite this article: Dudoignon, L., Le Sager, F. & Vanheuverzwyn, A. (2018). Big Data and Audience Measurement: A Marriage of Convenience? Economie et Statistique / Economics and Statistics, 505-506, 133–146.
  • CARAT (a Trading Division under Dentsu Aegis Network Communications India Private Limited) – Client: Manappuram Finance Ltd
    CONFIDENTIAL. Dentsu Aegis Network Services Agreement: Commercial Term Sheet. Parties – Agency: CARAT, a trading division under Dentsu Aegis Network Communications India Private Limited (CIN: U74300MH1986PTC039002), 601-B Wing, Poonam Chambers, Dr. A B Road, Worli, Mumbai 400018. Client: Manappuram Finance Ltd, W/470 (Old), W/638A (New), Manappuram House, Valapad, Thrissur, Kerala, India 680 567. Term: start date June 2019, with auto-renewal. Territory: all of India except Kerala. Services listed include media services, creative services and other services, with media categories including offline, out of home and digital services (including display, performance, SEM, social and mobile), as more particularly described in each applicable statement of work. This commercial term sheet must be read alongside the terms and conditions ("T&Cs") and schedules attached to it; the T&Cs and schedules, together with this commercial term sheet, constitute a binding agreement ("Agreement") between the parties, entered into on 08 April 2019, and will apply to any media schedule, work order and/or any services supplied to the Client by the Agency during the term. Executed for and on behalf of the Client.
  • Playbook Contents
    THE VIDEO ADVERTISER'S PLAYBOOK – CONTENTS.
    Part 1 – 10 things to look for in a video advertising provider: 1. Completed Views; 2. Viewability; 3. Reach; 4. Audience Targeting; 5. Ad Personalization; 6. Programmatic Campaign Scalability; 7. Strategies; 8. Data & Insights; 9. Transparent Pricing; 10. Service.
    Part 2 – 10 Steps to Killer Video Ad Campaigns: 1. Set Objectives; 2. Establish KPIs; 3. Strategize; 4. Assemble Assets & Information; 5. Get Measurements in Place; 6. Execute Strategies; 7. Monitor Data & Results; 8. Optimize Performance; 9. Analyze & Discover Insights; 10. Pay it Forward.
    Part 1: 10 things to look for in a video advertising partner. Completed views: first and foremost, you want a provider that charges you only for completed views. Face it, you're paying to get the right people to watch your message and engage with it. So, can your provider deliver on that? The devil's in the details: how does the provider define a completed view? Are the ads forced on viewers, or do viewers get to choose whether to watch? For YouTube's TrueView ads (the prime inventory we access through our TargetView™ platform), a completed view is defined as a viewer watching your complete ad or 30 seconds of it, whichever is shorter. Advertisers can run longer-form TrueView ads and still learn how many viewers watch the whole message, but it is considered a full or complete view at the 30-second mark. Viewers really dislike forced-view ads, and they often show their feelings by abandoning the video or tuning it out. Neither outcome is beneficial, is it? (A minimal sketch of this completed-view rule appears below.)
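The TrueView completed-view rule quoted above (the full ad or 30 seconds of it, whichever is shorter) reduces to a one-line comparison. The sketch below is illustrative only; the function names and the sample watch times are assumptions, not part of any platform's API.

```python
# Sketch of the completed-view rule quoted above for YouTube TrueView ads:
# a view is "complete" once the viewer has watched the full ad or 30
# seconds of it, whichever is shorter. All names and figures are
# illustrative, not any vendor's actual interface.

COMPLETED_VIEW_CAP_SECONDS = 30.0

def is_completed_view(watched_seconds: float, ad_length_seconds: float) -> bool:
    threshold = min(ad_length_seconds, COMPLETED_VIEW_CAP_SECONDS)
    return watched_seconds >= threshold

def billable_completed_views(watch_logs, ad_length_seconds: float) -> int:
    """Count completed views from a list of per-impression watch times."""
    return sum(1 for w in watch_logs if is_completed_view(w, ad_length_seconds))

if __name__ == "__main__":
    watch_times = [5.0, 18.0, 30.0, 42.0, 12.0]   # seconds watched per impression
    print(is_completed_view(15.0, ad_length_seconds=15.0))               # True: whole 15s ad watched
    print(is_completed_view(25.0, ad_length_seconds=45.0))               # False: under the 30s cap
    print(billable_completed_views(watch_times, ad_length_seconds=45.0)) # 2
```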
  • Independent Study on Indicators for Media Pluralism in the Member States – Towards a Risk-Based Approach
    Independent Study on Indicators for Media Pluralism in the Member States – Towards a Risk-Based Approach. Prepared for the European Commission, Directorate-General Information Society and Media (SMART 007A 2007-0002), by K.U.Leuven – ICRI (lead contractor), Jönköping International Business School – MMTC, Central European University – CMCS, and Ernst & Young Consultancy Belgium. Preliminary Final Report, Contract No. 30-CE-0154276/00-76, Leuven, April 2009.
    Editorial note: The findings reported here provide the basis for a public hearing to be held in Brussels in the spring of 2009. The Final Report will take account of comments and suggestions made by stakeholders at that meeting, or subsequently submitted in writing. The deadline for the submission of written comments will be announced at the hearing as well as on the website of the European Commission, from where this preliminary report is available. Prof. Dr. Peggy Valcke, project leader.
    Authors of the report: The study is carried out by a consortium of three academic institutes (K.U.Leuven – ICRI, Central European University – CMCS and Jönköping International Business School – MMTC) and a consultancy firm, Ernst & Young Consultancy Belgium. The consortium is supported by three categories of subcontractors: non-affiliated members of the research team, members of the Quality Control Team, and members of the network of local media experts ('Country Correspondents'). The following persons have contributed to the Second Interim Report. For K.U.Leuven – ICRI: Prof. Dr. Peggy Valcke (project leader; senior legal expert), Katrien Lefever, Robin Kerremans, Aleksandra Kuczerawy, Michael Dunstan and Jago Chanter (linguistic revision). For Central European University – CMCS: Prof.
  • Measuring the Audience: Why It Matters to Independent News Media and How It Can Contribute to Media Development
    Measuring the Audience: Why It Matters to Independent News Media and How It Can Contribute to Media Development, by Michelle J. Foster, October 2014.
    About CIMA: The Center for International Media Assistance (CIMA), at the National Endowment for Democracy, works to strengthen the support, raise the visibility, and improve the effectiveness of independent media development throughout the world. The center provides information, builds networks, conducts research, and highlights the indispensable role independent media play in the creation and development of sustainable democracies. An important aspect of CIMA's work is to research ways to attract additional U.S. private sector interest in and support for international media development. CIMA convenes working groups, discussions, and panels on a variety of topics in the field of media development and assistance. The center also issues reports and recommendations based on working group discussions and other investigations. These reports aim to provide policymakers, as well as donors and practitioners, with ideas for bolstering the effectiveness of media assistance.
    Contents: Introduction; The Exposure Model of Audience Research; Audience Data Move Directly into the Ad Buying Process; Business Development; Audience Development; Informing and Evaluating Media Development Efforts; Limits to Audience Research as a Tool for Media Development; Conclusions and Recommendations; Endnotes.
    About the author: Michelle Foster consults and works with media companies to develop strong management practices and business models. Center for International Media Assistance, National Endowment for Democracy, 1025 F Street, N.W., 8th Floor, Washington, DC 20004.
  • Insurtech Q4 2019 Earnings Call Synopsis
    PPI Quarterly Earnings Synopsis – Insurance Technology (InsurTech). Presented by 7 Mile Advisors & Paradigm Partners International, Q4 2019. Securities offered through 7M Securities, LLC, Member FINRA/SIPC.
    Contents: Team & Transaction Experience Summary; Public Basket & Valuation Trends; Earnings Call Overview; Transactions.
    7 Mile Advisors and Paradigm Partners International appreciate the opportunity to present this confidential information to the Company. This document is meant to be delivered only in conjunction with a verbal presentation, and is not authorized for distribution. Please see the Confidentiality Notice & Disclaimer at the end of the document. All data cited in this document was believed to be accurate at the time of authorship and came from publicly available sources. Neither 7 Mile Advisors nor 7M Securities make warranties or representations as to the accuracy or completeness of third-party data contained herein. This document should be treated as confidential and for the use of the intended recipient only. Please notify 7 Mile Advisors and Paradigm Partners if it was distributed in error.
    The team: Simon Baitler, Managing Director – a veteran of more than forty years in the insurance business. He became Chief Operations Officer for Transamerica Life Insurance and Annuity Company in 1978; he then served in several senior management and board positions, lastly as Executive Vice President and Chief Administrative Officer for the Transamerica Life Companies, and at SCOR Reinsurance. Jim Galli, Managing Director – over 30 years of domestic and international insurance industry experience, focused heavily on product development, underwriting, channel marketing and distribution expansion. Jim has served Legal & General America, MetLife, AIG, and MassMutual.