
BEST PRACTICES IN CROSS PLATFORM EFFECTIVENESS MEASUREMENT

By: Michele Madansky, Ph.D.

Kathryn Koegel

May 2011


EXECUTIVE SUMMARY

Cross Platform Advertising Effectiveness Measurement is critically important in today's media environment due to the rapid changes in consumer behavior with respect to all forms of media. Mobile devices – and the consumption of video anywhere – are particularly heightening interest in this type of analysis. While initially used to help demonstrate the effectiveness of digital media, these studies are now used to adjust media mix, understand the synergistic role of media and inform the planning process.

No constituents (marketers, agencies or media) are entirely satisfied with current solutions, which include opportunity-to-see (OTS) methodologies, lab testing and some syndicated solutions. But all are eager to improve upon current methodologies and to push beyond them by experimenting with new ones. Single source data for passive media measurement is the Holy Grail of Cross Platform Advertising Effectiveness, but it isn’t going to be available in the near term for executives who have product/inventory to sell, marketing budgets to manage and media plans to justify each quarter.

Until true single source becomes available, there is a desire to “build a better mousetrap” with what is available now, especially the various opportunity to see (OTS) methodologies. There is much innovation around solutions to well-known OTS challenges, which include apples-to-oranges comparisons between media, murky experimentation (especially with respect to control group development), the current low incidence of survey response (and potential bias among responders) and high drop-out rates due to survey complexity. Research vendors are applying sophisticated modeling techniques, using persistent data tags, mining new sources for survey samples (including Facebook and mobile recruitment) and going beyond mega panels to find exposed cells in “panels of panels.” Other vendors are applying a back-to-basics approach, simplifying survey design or simply synching up exposure windows for various media to improve the reliability and comparability of results.

The piqued interest in Cross Platform Advertising Effectiveness Measurement has created dramatic opportunity in the space that can best be explored through the following:

i. Support & Test Data Mash-Up Methodologies: By finding connections between panels that offer media and ad exposure data with purchase or attitudinal data, we can make smaller strides towards single source that work for individual product categories. What we learn from these tests may help with ultimate connections between data sets such as set top box and online ad exposure.

ii. Improve Current Methodologies: Work with the here and now and develop better confidence and better practices with studies being conducted today. We understand the limitations – let’s fix what can be fixed and validate these improved methodologies.

iii. Validate/Establish Methodological Best Practices: Techniques like weighting and Bayesian modeling may offer relatively simple solutions to problems with current methodologies. Test their validity (through research on research) and demonstrate best practices in using these techniques.

iv. Tackle Missing Data Issues: Work with new sources of data – especially from the mobile arena – to continually develop a greater understanding of how media is used now, with a goal of preparing ourselves for the media landscape of the next decade.

v. Foster Shared Learning and Best Practices: An organization like CIMM can help establish best practices by educating the community on the current state of Cross Platform Advertising Effectiveness methodologies as well as best practices for integrating these results with other research programs (including marketing mix modeling, tracking studies and single media impact studies). This can serve as an educational forum for all that Cross Platform Advertising Effectiveness Measurement can accomplish now and will accomplish in the future.


Table of Contents

EXECUTIVE SUMMARY
INTRODUCTION AND BACKGROUND
   The Role of Cross Platform Advertising Effectiveness Measurement in a World of Increasing Media Choices
   Goal of Cross Platform Advertising Effectiveness Measurement Project
I. METHODOLOGY AND LANDSCAPE OVERVIEW
   White Paper Methodology
   Landscape Overview
II. STRENGTHS, OPPORTUNITIES & CHALLENGES OF EACH METHODOLOGY
   OTS – Opportunity to See
      i. Apples to Oranges
      ii. Murky Experimentation
      iii. Freaks and Geeks
      iv. Dropouts
   Lab Testing
   Other Methodologies
III. FUTURE OPPORTUNITIES FOR CROSS PLATFORM ADVERTISING EFFECTIVENESS MEASUREMENT
   i. Support Data Mash-Up Methodologies
   ii. Improve Current Methodologies
   iii. Validate/Establish Methodological Best Practices
   iv. Tackle Missing Data Issues
   v. Foster Shared Learning and Best Practices
APPENDIX
   A. Cross Platform Advertising Effectiveness Measurement Vendor List
   B. Cross Platform Advertising Effectiveness Measurement Landscape
      OTS Capabilities
      Lab Testing Capabilities
      Other Research Capabilities
   C. Interviewee List
      Agencies
      Marketers
      Media
      Research Vendors
   D. The Authors


INTRODUCTION AND BACKGROUND:

The Role of Cross Platform Advertising Effectiveness Measurement in a World of Increasing Media Choices

The media landscape has changed more in the past 10 years than in the previous five decades altogether. The proliferation of devices and new digital media opportunities has created additional touchpoints for marketers not only to reach their audiences but also to engage with them in more meaningful ways. Digital media allows the consumer to interact with a brand – whether conducting an online search, visiting a corporate website, becoming a “fan” of the brand on Facebook, or actively choosing which video ad to watch before a program on Hulu. Additionally, much of this media activity is happening simultaneously. A teenager is likely viewing MTV’s “16 & Pregnant” while she is browsing the web, updating her status on Facebook and texting on her cell phone. One of her parents could be viewing video or e-reading on a tablet while also using a smartphone to program the DVR. While this is an exciting time to be a marketer, media planner, researcher or media provider, this proliferation of media choices and consumption modes presents obvious challenges.

This explosion of choice and interactivity has called into question how we look at ad effectiveness overall, and has fueled demand for a higher level of accountability for every media dollar. The longstanding question of “did I spend my advertising dollars effectively?” is now a query with a multiplicity of variables that are nowhere near being accounted for.

As an industry, we have attempted to provide best practices in Cross Platform Advertising Effectiveness Measurement in different ways at various points in the development of the advertising industry. [See chart 1 for a visualization of the history of Cross Platform Advertising Effectiveness Measurement created by InsightExpress.]

Chart 1


In the 1960s, the concept of reach and frequency was introduced as a way to account for television effectiveness: reach a consumer a certain number of times and you achieve the desired impact. This also provided a currency for television (the gross rating point) that is still in use today. Reach of given audiences as an effectiveness measurement was also applied to radio and print. In the 1970s, brand tracking gave marketers an idea of how advertising affected their brand perception over time. The 1980s brought a dramatic leap forward with the integration of actual sales data as retailers – especially grocery stores – began to scan barcodes and make purchase data available. Packaged goods marketers developed media mix models that were able to show the relative contribution of various media to incremental sales lift.
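For readers newer to that currency, the arithmetic is straightforward: GRPs = reach (%) × average frequency. A schedule reaching 60% of a target audience an average of four times, for example, delivers 240 GRPs (an illustrative figure, not one drawn from any study discussed in this report).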

All of these efforts have helped marketers optimize billions of dollars in media spending over the past fifty years. And, while some of these methods have evolved to include online search, display and video advertising, the complexities of today’s media landscape require additional tools to help marketers understand not only the relative contribution of their media investments, but also the synergistic and complementary aspects.

For the past ten years there have been various attempts to get at Cross Platform Advertising Effectiveness Measurement so that marketers can increase the efficiency of their media mix. Among the earliest of these studies was the XMOS research commissioned by the ARF and IAB and executed from 2001 to 2006. Even since those studies were released, the proliferation of social media, streaming video, new ad units and mobile devices has further added to the complexity. While it is easy to normalize GRPs against M25-34 for a :30 or :15 TV ad viewed on linear television, it is more challenging to compare a front page takeover on Yahoo! to a video ad on NBC.com, an overlay on a YouTube video to an iAd on an iPhone, or the “Like” of a Facebook post created by a marketer.

In order to help make sense of the myriad of marketing investments, marketers (and their agencies) are using various methodologies to understand the brand and sales impact of their cross media investments. Media companies are helping their clients better understand the ROI and cross platform synergy of their multi-platform campaigns. Cross Platform Advertising Effectiveness Measurement research serves several roles:

• Helping advertisers adjust media mix in the future (or preferably mid-campaign) to increase effectiveness
• Understanding the synergistic role of media
• Helping with cross platform media planning

While Cross Platform Advertising Effectiveness Measurement can also be used in pre-planning, the focus of this paper is on its usage for measuring post-campaign effectiveness. We will also reference brand tracking and media mix modeling in the report. However, the focus is on other research methodologies that isolate the branding and sales impact of cross platform media campaigns.

Goal Of Cross Platform Advertising Effectiveness Measurement Project:

In this rapidly evolving media environment, the goal of this project is to develop a deeper understanding of the current tools and techniques for Cross Platform Advertising Effectiveness Measurement as well as some emerging methodologies and opportunities for improving our understanding. The document is divided into three key sections:

I. Methodology and Landscape Overview: Current thoughts on the state of Cross Platform Advertising Effectiveness Measurement from an end user marketer, agency and media executive perspective.

II. Strengths, Opportunities & Challenges Of Each Methodology: Overview of current methodologies and the strengths and limitations of each, with a focus on some emerging best practices culled from key research vendors.

III. Future Opportunities: Some specific ideas to help CIMM members (and the industry) improve on existing methodologies and increase collective knowledge in this space.

With disparate methodologies and capabilities, this report seeks to shed light on the best practices currently employed and illuminate areas where there are potential breakthroughs. All of these developments will be viewed through the lens of the end users: marketers, agencies, and media executives who are eager to gain a more thorough understanding of how media works in what combinations to increase the effectiveness of ad campaigns.

The methodologies assessed fall into three basic forms: Opportunity to See (referred to as “OTS” throughout this report), Forced Exposure (often conducted through lab environments) and Syndicated Research/Tracking Studies that are based on panel data. Again, media mix modeling, brand tracking and media heavy up tests are mentioned, but are not the focus of this report.

I. METHODOLOGY AND LANDSCAPE OVERVIEW:

White Paper Methodology

The Coalition for Innovative Media Measurement sent a request for proposal in October 2010 to vendors known to be conducting cross media advertising effectiveness measurement. This RFP included a series of questions covering media platforms measured, methodologies, derivation of cross media panels, sampling approach, key metrics captured and granularity of usage data (e.g., incidence of television viewing and online activity). Responses were received from comScore, InsightExpress, Ipsos/OTX, Knowledge Networks, Marketing Evolution, Millward Brown and The Nielsen Company. Based on references from agency, media and marketer executives interviewed for the project, SymphonyAM, MetrixLab and 3DAccountability were also included. CIMM issued another RFP for a consultant(s) to conduct interviews with a range of marketer, agency and media executives who have used Cross Platform Advertising Effectiveness Measurement, assess all materials provided and produce an analysis that would include potential best practices. CIMM chose the team of Michele Madansky, Ph.D., a media researcher and media mix modeling specialist, and Kathryn Koegel, a media researcher and contributor to Advertising Age. Both have experience working with data from traditional (print and television) and interactive media [see bios, Appendix D].

From March through April 2011, Michele Madansky conducted interviews with 46 subject experts based on question guidelines developed by Madansky and Koegel and approved by CIMM. The interviews encompassed end users (marketer, agency and media executives) as well as research vendors and some industry consultants (e.g., Dr. Paul Lavrakas). A full list of interviewees is provided in Appendix C. This document is based on information provided in these interviews as well as materials supplied by vendors. This white paper describes some of the latest developments in each methodology. In order to provide an at-a-glance comparison of key methodologies across vendors, we have compiled a landscape document in Appendix B, which was filled in by the vendors at our request and to our specifications.

Landscape Overview

Based on our initial interviews with advertisers, agencies and publishers we found several key themes that helped us develop our line of questioning for the research vendors.

• Strong Interest in Cross Platform Advertising Effectiveness Measurement on the Part of All End Users:

All parties interviewed with experience in Cross Platform Advertising Effectiveness Measurement stated strong interest in this type of research, acknowledging the rapid changes in media usage brought about first by the Internet and now by wireless connectivity through mobile devices. Digital-focused personnel were the most experienced with current Cross Platform Advertising Effectiveness Measurement vendors, techniques and actual usage, having conducted these sorts of studies over the past 10 years (often to justify digital investments for advertisers and to prove the value of cross platform offerings from the publisher side). There is also increasing interest expressed by television executives (and “traditional” media researchers) due to the increasing consumption of “non-linear” TV. There is great desire to understand how media of all forms work in conjunction, and a recognition that many precepts of media effectiveness date to long-gone eras of how people actually used media.

“Models are based on media consumption at best 10 years ago”

• Marketer

“Market need is enormous – only 30% of CMOs think they are truly measuring ROI.”

• Research Vendor

“Cross media provides quicker and more efficient answers to questions – particularly when data availability is limited.”

• Agency Executive

“Legacy systems [media mix modeling and reach and frequency] do not measure the frequency or the isolated and synergistic effects of media and message impact at all levels of the purchase funnel.”

• Research Vendor


• Competing Needs Sometimes Cause Tension:

Advertisers, agencies and publishers often have competing needs and motivations for commissioning Cross Platform Advertising Effectiveness Measurement [see chart 2 for a summary of some key motivations expressed by end users].

Chart 2

Media companies tend to use Cross Platform Advertising Effectiveness Measurement to demonstrate the premium value of cross-platform media offerings or tentpole events. These might include major live television events that extend the advertising relationship to other media such as online or mobile, provide a case study to use for future sales initiatives or show the value of non-TV inventory.

“What is the effect of ad exposure? Is an exposure to online video the same as TV or vs. two TV ads? Digital is having to prove how well they do.”

• Media Executive

Agencies want actionable insights to help them improve future media planning and understand the best combinations of media vehicles for their clients. Some feel that oftentimes the main takeaway for clients is simply that the campaign “worked,” without insights into how to make adjustments in the future. Agencies also cited that many of their clients who were dabbling in newer media forms required evidence via small-scale tests before investing further in the media.

“I want to find the best combination of media for my clients. Not just ordered pairs, but triplets all the way to octuplets.”

• Agency Executive


“Give us added value and help us look smart to the client since we are spending so much money with you.”

• Agency Executive

“Our client wants the evidence of the small scale test before they invest more.”

• Agency Executive

Marketers displayed keen interest in understanding media impact on a neutral basis and cross media synergy. They too, however, strongly noted the reality of the market (and their bosses) as they used the research to justify spend (and in some cases increase spend in new directions). They were also more than happy to accept research as added value for a program.

“I need something to take back to management and Marketing Mix Modeling takes too long – management wants to hear: ‘We spent all this money and it works.’”

• Marketer

The competing research needs between media, agency and marketers result in tensions revealed when they are asked to discuss what they would change about Cross Platform Advertising Effectiveness Measurement research. Media companies feel trapped in a loop of providing research they sometimes have little faith in and are often being asked to pay for. Because they are in selling mode, they acknowledge that it’s not always in their best interests to do the research in the way they think it should be done. They would like to develop new learnings but cannot if they are unsure of the outcomes. They also feel the disconnect between agency and marketer partner needs. They can get stuck in the middle of agency/client conflicts and are frustrated with the perceptions that these studies are easy and inexpensive to execute, the reality of which is very different. Finally, online publishers are often asked to contribute ad impressions for control group development as added value. Given low response rates, they end up burning through valuable inventory and often don’t get to see the results of the research study.

“These cross media studies are done as convenience studies based on advertising needs, they are designed to support ads – the final goal is to show that this works vs. help learn how to be more effective. You are not getting collective, more structured learning.”

• Media Executive

“We aren’t learning anything new.”

• Media Executive

“It becomes a tchotchke – sometimes an ad deal is on the line, but mainly it is about building the relationship.”

• Media Executive

“CMOs assume the media part of the research is easy – but it is not.”

• Media Executive


“9.9 out of 10 of the publishers don't get to see results.”

• Media Executive

Agencies reveal the chronic challenges of maintaining client relationships and the role that producing research can play in that. They often must produce short-term tactical thinking on research to justify their jobs rather than contribute to long-term strategic impact. They also want to be able to produce new insights into cross media planning (especially in light of the new complexities of usage of television and interactive), not just deliver another piece of research.

“The decisions which we are trying to make are more granular – within TV which daypart, [within interactive] search or display – a survey won't allow for that level of granularity.”

• Agency Executive

“They [clients] are looking out for their own hide: ‘help me keep my job’ versus thinking strategically about long term growth.”

• Agency Executive

“Clients want their AOR to give them best thinking re: cross media planning – not necessarily Cross Platform Advertising Effectiveness Measurement.”

• Agency Executive

"Intercept studies are a necessary evil…there are so many issues with sampling design and experimental design, especially when other media are running concurrently."

• Agency Executive

“We checked the box. Neither the client nor us felt that there was much of an action plan.”

• Agency Executive

The marketers we spoke with were, overall, the most satisfied with the results. Particularly when these marketers worked collaboratively with their agencies and managed the research process and vendor selection, they had some great success.

“Cross Platform Advertising Effectiveness Measurement has helped us figure out how much to put into paid search vs. other digital activity. Also, we are starting to learn the impact of social media…We have had pretty substantive recommendations about how to optimize media mix.”

• Marketer

[Re: a long term branding campaign] “We wanted to know whether people understand the program, perception of brand over time…We started surveying before the campaign launched for ‘ghost awareness’ and were able to improve creative messaging and media mix throughout the campaign.”

• Marketer


However, some of the marketers we spoke with did not manage their own Cross Platform Advertising Effectiveness Measurement studies. They acknowledge that although they may not pay for the research through their own budgets, they would like more control over vendor selection and methodology. When not managed internally, marketers sometimes believe that data that comes back to them is not pure research but has been filtered to tell a story the media company and agency think they want to hear. They are also frustrated with the lag time to produce any results and believe that some of this is due to the filtering process of agencies and media companies.

“I don't know if I ever trusted the data.”

• Marketer

• Single Source Data Is The Holy Grail…But We Are Not There Yet

The interviews conducted for this report attest to growing levels of experimentation, more sources of available data and new tools and techniques being tried to get at the grail of a single source of data for media usage – whether through a true single source, by overlaying various panels or through modeling techniques. Interviewees expressed desire for access to new sources of data, including those controlled by Apple, the cable companies and ISPs (set top box data), Google and never-before-available shopper data from non-grocery retailers.

“Ideally we would have passive measures for everything.”

• Media Executive

“The holy grail for the industry is single source across media and purchases. The problem is you can’t do it.”

• Research Vendor

“Holy grail is single source measurement: Log level data, pulling together client databases – linking that to set top box data.”

• Agency Executive

While the industry is bullish on single source, there is still recognition that it has limitations for lower reach vehicles.

“Tracking with fragmentation is a problem – the minute you go away from prime it is unreadable.”

• Agency Executive

It should be noted that CIMM is embarking on two single source panel tests (with comScore and Arbitron) and that Nielsen is enhancing their cross-platform panel. There is general excitement about these tests and approaches, but also an acknowledgement that this is a longer-term solution and that more immediate solutions are needed.


Agencies are also starting to develop their own single source solutions for planning. Universal McCann recently announced a partnership with Nielsen to enhance econometric models with more granular data, and is in the process of reworking itself from a reach-and-frequency-based planning and buying agency to one where data insights (gleaned from both its partnership with Nielsen and purchasing data from sources like Catalina) drive decisions in real time – seeing its model as a Wall Street-like exchange. As reported in MediaPost on April 19, 2011: "[We can] observe rather than infer in real-time, and analyze the impact of extremely discrete media buys – even the daily contribution that TV advertising budgets are having on a marketer's sales for one show vs. another," according to UM’s Hari Abhyankar, senior vice president-business insights & analytics. (See http://www.mediapost.com/publications/?fa=Articles.showArticle&art_aid=148926&nid=125868.)

• Mobile is Pushing the Cross Platform Advertising Effectiveness Measurement Issue

For all constituents interviewed, 2011 is the long-promised “year of mobile.” While mobile devices have been around for over 15 years, their increasing usage for media access and the consumer excitement over them has fomented interest from both research vendors and end users of cross media research. The most common item on the “wish list” of all end users of Cross Platform Advertising Effectiveness Measurement was greater understanding and integration of mobile usage and advertising data. As noted previously, this is particularly pronounced among those looking at television viewership on tablets and smartphones, as this mobile activity goes uncounted in larger reach numbers. Research vendors are in turn excited about the potential of a new mode of capturing consumer media usage that promises to record in-the-moment media consumption and in-store activity. They also note that there are major players in the market (Apple) who have usage data that would be enormously helpful but have to date decided not to share it.

“3rd screen mobile – we need to get there, but we're not sure how.”

• Marketer

“We don't know much about mobile at all – we are going to be left behind – we think it skews towards teens.”

• Marketer

“Why are they using tablets?...I am watching XXX and then launch something on my tablet. What is that additional ad experience?”

• Media Executive

“Mobile is an enormous problem, and becoming more important – if we had the right sort of measurement we'd be on pace to surpass online. We are already halfway there.”

• Media Executive


“Apple is inhospitable to third party measurement.”

• Research Vendor

• Desire to Build a Better Mousetrap

While the shiny object of single source measurement sits on the horizon, there is a consensus that we need better tools today to help marketers effectively measure cross media campaigns. Agency and media constituents expressed weariness with “same old” approaches to cross media (especially those with no additive learnings) and eagerness to try new methodologies. Some also pointed to new vendors in the market, which were added to the interviewee list.

“I'm all for having more competitors – they all push each other”

• Marketer

Agency representatives were particularly eager to get at measurement that more closely reflects how people use media today and hoped that they could move their clients away from what one agency executive noted was “addiction to these studies.”

"I wish as an industry we would work together to come up with a solution and stop making publishers waste time...We don't expect perfection…We expect attempts.”

• Agency Executive

What is the ideal scenario for Cross Platform Advertising Effectiveness Measurement? Few of those interviewed held extremely high ideals of perfect research design within Cross Platform Advertising Effectiveness Measurement projects; most instead sought workable, affordable tools that create new insights. They acknowledged findings might not be strictly empirical, but could still be useful and directional.

“If it's done right, it's fine – it seems to be an achievable methodology.”

• Media Executive

This stands in marked contrast to the sorts of discussions conducted on a listserv popular among media researchers: “Wonks,” founded in 2006 by Rick Bruner, then Director of Research at Google. The 300+ member listserv currently operates as a no-holds-barred forum for information and debate on research issues, perhaps the longest-standing of which is the efficacy of passive cookie-based measurement vs. panel-based measurement in the online sector. A recent post on the issue generated 62 heated comments from those representing all constituencies involved.

Few of the executives interviewed for this report chose to get down to this sort of highly granular level of debate on research data capturing methodologies. They acknowledge the challenges to all methodologies and have moved on. They want to develop a solid level of confidence that the methodology will produce valuable insights, not continue to focus on discussions that they see as sidetracking the larger issue they are trying to solve for: Am I making the smartest decisions possible with advertising dollars?

Several interviewees expressed a distinct knowledge gap about what precisely is going on with the current state of Cross Platform Advertising Effectiveness Measurement given all of the vendors and evolving methodologies. Addressing the lack of standardization, along with the simple need for updated knowledge of current tools and techniques, was viewed as a way forward.

“There are no standards [to cross media]. Even after 10 years it hasn’t worked…Why is there such a disconnect between time spent with digital and spend? Partly because there are no standards across which cross media can be evaluated.”

• Research Vendor

“We don't know what we don't know – if we can package it up in a way that is helpful we may be halfway there. Can we simplify and educate?”

• Marketer

“Let’s try to understand what is out there. Let’s try to learn together.”

• Research Vendor

II. STRENGTHS, OPPORTUNITIES & CHALLENGES OF EACH METHODOLOGY:

In this section we outline some of the challenges and concerns with OTS, lab-based and other methodologies used for Cross Platform Advertising Effectiveness Measurement. Given the feedback from end users, we were aware of many of the limitations going into the interviews with research vendors. However, we were also pleasantly surprised at the level of innovation and thinking among all key Cross Platform Advertising Effectiveness Measurement vendors aimed at addressing these challenges.

OTS – Opportunity to See:

Traditional “opportunity to see” Cross Platform Advertising Effectiveness Measurement methodology relies on surveying users who have been exposed to digital advertising (as well as a control group) and linking media exposure to broadcast and print schedules in the survey design. The survey includes questions about media usage and specific programs watched/print publications read in order to measure opportunity to see offline media. For online media and all forms of digital advertising, passive ad exposure is obtained (based on tagging and cookie-based tracking). The corresponding control group should not have been exposed to the digital advertising.
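To make the comparison concrete, the following is a minimal sketch of the core exposed-versus-control lift calculation behind an OTS study, paired with a standard two-proportion z-test. The counts are invented for illustration; actual vendor designs are considerably more elaborate.

```python
# A minimal sketch (not any vendor's actual method) of the core OTS
# comparison: lift on a brand metric among exposed respondents relative to a
# matched control group, tested with a two-proportion z-test.
from statistics import NormalDist

def lift_with_significance(exposed_yes, exposed_n, control_yes, control_n):
    p_exposed = exposed_yes / exposed_n
    p_control = control_yes / control_n
    # Pooled standard error under the null hypothesis of no lift.
    p_pooled = (exposed_yes + control_yes) / (exposed_n + control_n)
    se = (p_pooled * (1 - p_pooled) * (1 / exposed_n + 1 / control_n)) ** 0.5
    z = (p_exposed - p_control) / se
    p_value = 2 * (1 - NormalDist().cdf(abs(z)))
    return p_exposed - p_control, p_value

# e.g., 26.0% aided awareness among exposed vs. 22.0% in control (invented)
lift, p = lift_with_significance(exposed_yes=260, exposed_n=1000,
                                 control_yes=220, control_n=1000)
print(f"Aided awareness lift: {lift:+.1%} (p = {p:.3f})")
```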

“The amount of advertising the consumer has seen is therefore measured in terms of Opportunity to See (OTS), the industry’s agreed-upon measure of audience. (Ephron, 2004) OTS is defined as the number of times a member of the target audience is exposed to the advertisement (campaign).”

• Economist Glossary of Press Research & Planning

Typical brand metrics produced from OTS studies align with a funnel perspective of advertising impact, as visualized in chart 3.

Chart 3

Unaided and aided awareness and recall are the first consumer impacts achieved; message association, brand attributes and favorability, along with general search activity, fall mid-funnel. Toward the bottom of the funnel come more focused search activity, preference development, the influence of word of mouth and, finally, purchase and brand advocacy.

There are some differences in the metrics that different vendors measure – specifically, some focus on ad recognition (more on that later), and those vendors with large panels are able to provide richer, passively-measured online behavioral data. [The output of these studies typically looks like chart 4.]

In some cases the focus is on percentage lift in brand metrics among different exposed cells. In the example provided by Millward Brown, they have translated percentage lifts into cost per brand metric lift.


Chart 4
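As a rough illustration of that cost-per-lift arithmetic, here is a sketch of our own simplification (not Millward Brown's actual model); every figure below is invented.

```python
# Cost-per-brand-metric-lift arithmetic of the kind described above: dollars
# spent per person moved on a brand metric. All spend, reach and lift figures
# are invented for illustration.
def cost_per_person_shifted(media_spend, people_reached, lift_points):
    """Dollars spent per person moved on a brand metric (e.g., awareness)."""
    return media_spend / (people_reached * lift_points)

tv = cost_per_person_shifted(media_spend=500_000,
                             people_reached=2_000_000,
                             lift_points=0.04)    # +4 pts awareness
online = cost_per_person_shifted(media_spend=100_000,
                                 people_reached=600_000,
                                 lift_points=0.06)  # +6 pts awareness
print(f"TV: ${tv:.2f} per person shifted; online: ${online:.2f}")
```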

OTS is the most common methodology currently used for Cross Platform Advertising Effectiveness Measurement. As one research vendor said, “the survey approach is great, because it truly is single source…also, single source behavioral data doesn’t get at the attitudinal impact of the advertising.”

Although OTS is the most common approach to Cross Platform Advertising Effectiveness Measurement, there are challenges with implementation and limitations in achieving true experimental design. Dr. Paul Lavrakas, who was interviewed for this white paper, authored a report for the IAB (An Evaluation of the Methods Used to Assess the Effectiveness of Advertising on the Internet – IAB, May 2010) that outlines several of the issues/considerations for online ad effectiveness studies; these same issues (and potentially many more) also apply in the Cross Platform Advertising Effectiveness Measurement space. Most vendors are aware of each of these limitations to the methodology and some have come up with distinct approaches for addressing them.

Below we list some of the challenges as well as what we consider emerging potential best practices for overcoming them. It’s important to note that we are highlighting some of the unique approaches that vendors are using for OTS studies. In order to validate some of these best practices, we later make recommendations on additional “research on research” to help move overall understanding forward. Nielsen has also outlined some of these challenges in chart 5.


Chart 5

Below we summarize some of the concerns (and opportunities) that fall into four major buckets:

i. Apples to Oranges – Digital media and offline media OTS are captured in different ways within different time frames, making it difficult to compare media on a level playing field.

ii. Murky Experimentation – Difficulty of getting clean control groups and exposed groups across different combinations of media exposure cells.

iii. Freaks and Geeks – Respondents to site-based survey recruitment may not be representative of those with an opportunity to see due to low response rates, survey length/complexity or other reasons.

iv. Dropouts – Respondent fatigue due to length of survey causes incompletes and respondent bias (related to iii – these may be “Freaks & Geeks”).

Fortunately, there is a lot of innovation to try to address each of these issues.

i. Apples to Oranges

Challenge: Different Data Gathering Techniques Based On Media – TV, Print, Radio Based On OTS, Online Passively Tracked Based On Cookies

One common critique of OTS is that offline media is measured in a different way than digital media exposure (which is passively tracked through cookies). Typically, survey questions are asked about whether respondents viewed specific programming or read specific issues of magazines or newspapers. In this case OTS can be swayed by memory biases or platform delivery (if programming is viewed on a DVR and the ads are avoided).


“OTS is not fair to digital. Offline media impact gets overstated.”

• Agency Executive

“When you have a tagged exposure for one, a probability exposure for another and an in-market exposure for a third, it is not ideal.”

• Agency Executive

• Potential Best Practice: Design Survey Questions To Best Get At Actual Viewing Of Ads

Ask questions about “recognition of ads” vs. opportunity to see. SymphonyAM shows creative executions from all media types in the survey to get closer to whether respondents actually saw the creative. By changing the key denominator from OTS to recognition, they believe they have created a level playing field, especially when OTS-based media exposure groupings are hard to establish. An OTS-based experimental design is hard to execute when a campaign has been running for a long time, when sufficient “control” impressions are not available or when a particular media buy (e.g., TV) is so large that almost everyone has an opportunity to see an ad within that medium. SymphonyAM has validated this approach across many studies and found statistical similarity between “non ad recognizers” and non-OTS groups across key brand metrics.

“To implement this approach, SymphonyAM shows the original commercials to all respondents and asks whether they recognize the ad in the real media context. To ensure unbiased responses on all key brand metrics questions (e.g., Awareness, Opinion, Purchase Intent, etc.), SAM shows these commercials only after respondents have finished answering those questions.”

Below we recommend further research to get at the best solution. As Marketing Evolution points out, sources of awareness can be skewed for higher reach vehicles like television [see chart 6].

Chart 6


Challenge: Varying Times from Exposure To Recruitment For Different Media

Online survey recruitment typically happens immediately after ad exposure (although it is possible to recruit at the end of a session on a specific website or at a later time through retargeting). TV exposure may have occurred a day earlier and print exposure up to a month earlier, creating a bias toward recall of the digital ad impressions.

• Potential Best Practice: For Online, Measure Some Time After Exposure And Ideally Build Response Curves To Look At Impact Over Time

By recruiting through a panel (or retargeting survey invitations with high reach online vehicles) you can control time delay since online exposure.

Marketing Evolution approaches the problem by modeling the decay effect of time since exposure and factors this into their outcomes.

In order to compare the impact of smaller media and non-taggable media like search advertising, MetrixLab has developed specific approaches. The basis of this approach is that people’s behavior is measured and controlled for, and that a time delay of four days is imposed between exposure and the post-measure, so that the longer-term effect of the activity is examined as it is for other media.
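To illustrate how such a time-delay or decay adjustment might work in practice, here is a minimal sketch; the exponential form and the seven-day half-life are our assumptions for illustration, not Marketing Evolution's or MetrixLab's actual models.

```python
# A minimal sketch of a time-decay adjustment for survey lift. The exponential
# form and seven-day half-life are illustrative assumptions only.
import math

HALF_LIFE_DAYS = 7.0                        # assumed memory half-life
DECAY_RATE = math.log(2) / HALF_LIFE_DAYS   # per-day decay constant

def lift_at_reference_delay(observed_lift, days_since_exposure,
                            reference_days=4.0):
    """Restate a lift measured `days_since_exposure` after exposure at a
    common reference delay, so media surveyed at different lags compare."""
    decay_observed = math.exp(-DECAY_RATE * days_since_exposure)
    decay_reference = math.exp(-DECAY_RATE * reference_days)
    return observed_lift * decay_reference / decay_observed

# Online lift measured right after exposure vs. TV lift measured a day later,
# both restated at a four-day window like the one described above.
print(f"Online: {lift_at_reference_delay(0.060, days_since_exposure=0):.3f}")
print(f"TV:     {lift_at_reference_delay(0.045, days_since_exposure=1):.3f}")
```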

ii. Murky Experimentation

Challenge: Control Group Development

There are a few challenges to developing accurate control groups, the most notable of which is the increasing reluctance of online users to participate in online surveys. Those interviewed who have used this methodology cite the radical decline over the past decade in acceptance into either control or exposed groups based upon “on-site recruitment.” Among our interviewees, we heard about on-site survey response rates ranging from as low as 0.01% to as high as 5%.

In addition, certain demographics are the most challenging to recruit online. Younger males are the hardest to reach through on-site survey recruitment. Respondents tend to skew older and more female, a demographic that has its own behavioral traits: according to comScore, women tend to show lower awareness lifts and higher purchase intent lifts. Thus, if the survey respondents skew more female, discrepancies in measured effectiveness can occur.

The development of control groups, and the need to devote costly ad impressions that could otherwise be sold, is perhaps the most onerous issue for media executives. This can lead to the creation of control groups that do not truly match the placement of the actual campaign and thus skew the results. We also heard that site owners do not want to compromise the user experience by having too many surveys on their sites.


“The publishers don't give enough impressions to get a full control group – they will run a campaign that is tightly targeted to allergy sufferers – then they give ROS for control impressions”

• Research Vendor

Within the research community, there is also concern over the impact and incidence of cookie deletion. Various studies over the past few years have noted cookie deletion rates of 24% to 40%+ [see chart 7], which can contaminate control groups and understate frequency of exposure.

Chart 7

Materials supplied by comScore showed that, among cookie deleters in their panel, there were an average of five cookies dropped from the same web site and seven different ad server cookies from the same campaign. Another challenge is that the demographic skew of cookie deleters is not precisely known (cookie deleters were once thought to be young, male and tech-savvy, but since cookie deletion is now a simple action in most browsers, that may no longer be the case).

“Cookie deletion causes contamination in both directions for the control and exposed groups.”

• Research Vendor


• Potential Best Practice: Beaconing/Usage of Flash Cookies

Some CIMM vendors are pre-seeding panelists with Flash cookies to determine exposed vs. non-exposed cells. InsightExpress, Ipsos/OTX and MetrixLab are using this technique (which they call “beaconing”) to ensure purity of control group development. Flash cookies cannot be deleted through browsers (a user has to go several layers into the Flash player on their computer to delete them) and have an estimated deletion rate of only 3%, according to InsightExpress. In addition to beaconing digital ads, vendors have the ability to beacon key pages on marketers’ websites, microsites or YouTube pages to see how they impact branding metrics.

“A [beacon] is generated for each of the online activities – then the cookie is scraped with questionnaire completion.”

• Research Vendor

• Potential Best Practice: Modeling for the Control Group

comScore is using Bayesian modeling techniques for control group development in online ad effectiveness studies (SmartControlTM). Essentially, they are building models to forecast recall, purchase intent, etc. at zero exposures (based on response curves at different frequencies of exposure for digital media). comScore has been testing and validating their SmartControl approach with several studies (including a paper published with Yahoo!). However, some of our respondents feel that further validation is needed. A similar approach could be applied to Cross Platform Advertising Effectiveness Measurement studies if we had access to frequency curves for offline media based on OTS measurement. This avenue is worth investigating with select vendors.
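The following sketch shows the general idea of deriving a zero-exposure baseline from a frequency-response curve. It is a plain least-squares curve fit, not comScore's proprietary Bayesian SmartControl, and every data point is invented.

```python
# Modeling a zero-exposure baseline from a frequency-response curve. This is
# a simple curve fit, not comScore's actual (Bayesian) method; the data are
# invented for illustration.
import numpy as np
from scipy.optimize import curve_fit

# Share of respondents giving a positive brand response, by ad frequency.
frequency = np.array([1, 2, 3, 5, 8, 13])
positive_rate = np.array([0.22, 0.25, 0.27, 0.30, 0.32, 0.33])

def saturating(f, baseline, ceiling, k):
    """Response rises from `baseline` toward `ceiling` as frequency f grows."""
    return baseline + (ceiling - baseline) * (1 - np.exp(-k * f))

params, _ = curve_fit(saturating, frequency, positive_rate,
                      p0=[0.2, 0.35, 0.3], bounds=(0, [1.0, 1.0, 5.0]))
baseline, ceiling, k = params

# The fitted intercept stands in for an unexposed ("synthetic") control cell.
print(f"Modeled zero-exposure baseline: {baseline:.3f}")
print(f"Implied lift at frequency 3:    {saturating(3, *params) - baseline:.3f}")
```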

iii. Freaks and Geeks

Challenge: Respondent Bias: Survey Not Representative Of Actual Demographic Reach of Campaign

All research vendors interviewed acknowledged the growing challenge of survey recruitment and a potential bias among those who do answer (most often noted as more female and older). [See chart 8 for an example of response bias and how it can skew results.]

This leads to the challenge that those who take the survey may not be representative of those who saw the campaign.

Chart 8


“You run 1000 survey invitations and get 16 weirdos.”

• Research Vendor

• Potential Best Practice: Weighting

Chart 9

Many vendors use weighting techniques based on their knowledge of the demography of those the campaign actually reached. [Chart 9 from comScore shows how the results of OTS change with and without weighting.]

Weighting is likely a viable best practice, but its usage should be clearly stated upfront; if more than “light weighting” is applied, the results could be erroneous. Research vendors also need to be transparent about how they are weighting their results and whether weighting could yield unreliable results. One agency researcher mentioned that she personally oversees all weighting of OTS studies on behalf of her client, and that a black box is not acceptable.
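A minimal sketch of post-stratification weighting of this kind follows; the demographic cells, delivery shares and responses are all invented for illustration.

```python
# Post-stratification weighting: respondents are weighted so their demographic
# mix matches the campaign's actual delivery (e.g., from ad server or panel
# data). All cells, shares and responses below are invented.
import pandas as pd

respondents = pd.DataFrame({
    "cell":  ["F35+", "F35+", "M18-34", "F18-34", "M35+", "F35+"],
    "aware": [1, 0, 1, 1, 0, 1],
})

# Share of campaign impressions actually delivered to each cell (assumed known).
delivered_share = {"M18-34": 0.35, "F18-34": 0.25, "M35+": 0.25, "F35+": 0.15}

sample_share = respondents["cell"].value_counts(normalize=True)
respondents["weight"] = respondents["cell"].map(
    lambda c: delivered_share[c] / sample_share[c])

weighted = (respondents["aware"] * respondents["weight"]).sum() \
           / respondents["weight"].sum()
print(f"Unweighted awareness: {respondents['aware'].mean():.2f}")
print(f"Weighted awareness:   {weighted:.2f}")
```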

“Sometimes weighting is so far off due to sample that the results are unreliable.”

• Agency Executive

"You need to threaten them and cajole them (vendors) for transparency [in respect to weighting]."

• Agency Executive


• Potential Best Practice: Hybrid of Digital Recruitment Post Exposure and Panel Recruitment

Companies like InsightExpress and Knowledge Networks are using a hybrid approach of recruiting based on digital activity but also supplementing the sample with people who were exposed to the ad and then recruited from panels [see chart 10 for a visual from InsightExpress on hybrid recruitment].

Chart 10

This hybrid approach allows you to survey people at intervals after ad exposure and supplement the sample with respondents who fit specific demographic groups.

“To the extent possible, we want to move towards panels for recruitment.”

• Agency Executive

• Potential Best Practice: Panels of Panels: Beyond the Mega Panel

In the ’00s, comScore and Nielsen moved to create mega panels (beyond 1 million participants) in order to get at the granularity of usage demanded by online. In addition to its own panel, InsightExpress is working with four different panel sources and has combined their capabilities into what it calls the Ignite Network. This is essentially a mega-panel comprised of multiple panels where users with various attributes are tagged with hard-to-delete Flash cookies. This not only provides a broader base of media exposure but can be enhanced with third-party database matches to find panelists within particularly hard-to-recruit demographics or with very specific media or purchasing behaviors. This ‘panel-network’ concept addresses the scale issues inherent in measuring ad effectiveness and cross platform research via a panel, and solves for the size requirements needed to match with small offline transaction databases. (It should be noted that in 2006, Ipsos pioneered this methodology and presented their findings at ESOMAR.)

“[Using this method] is much easier to find sample for OTS and exposed. You can cherry pick from a panel vs. having to run tons of pop ups from websites. You can also do third-party database matches - e.g., CRM.”

• Research Vendor

• Potential Best Practice: New Respondent Sources Such as Facebook

Originally developed to measure premium online branding campaigns on Facebook, Nielsen has expanded its service to recruit respondents who were exposed to advertising anywhere online and then survey them with a lightweight online survey on Facebook (typically eight questions in total, with each individual respondent answering one single or one two-part question). [See chart 11 for a visualization of how Brand Effect works.]

Chart 11

Given Facebook’s huge footprint of 600MM+ unique users worldwide, Nielsen is able to recruit both control and exposed groups of reliable size on one property alone. Response rates are much higher than for typical survey invitations (due to user engagement with Facebook and the simple survey format). In addition, Nielsen is able to passively provide information about demographics and geography based on users’ Facebook profiles, calibrated to verified Nielsen panels. While these studies are currently only being conducted for online ad effectiveness, Nielsen is exploring ways to model TV exposure as a proxy for OTS. In addition, Nielsen is hoping to make Brand Effect a more open platform so that respondents can be recruited on other large-scale websites (and married to site registration data or profiles). If we are able to eliminate questions about TV exposure in OTS surveys, we could still passively measure the combined impact of TV and digital.

Challenge: Respondent Bias – Online Samples Overly Representative of Heavy Internet Users and Light TV Viewers

US online penetration currently stands at around 78%, according to Pew and ITU statistics from 2010, so online samples are not representative of the whole US population. Research dating back to 2003 (DoubleClick/Nielsen//NetRatings/IMS Cross-Media Reach and Frequency Planning Studies, http://www.grabers.com/library/imc/archives/support/onlineReach.pdf) showed that heavy online users tended to be light TV users.

• Potential Best Practice: Probability Based Panel Recruitment That Includes Light Internet Users as Well as Heavy TV Viewers

If recruiting from a panel, ensure you have a stratified sample that includes light Internet users as well as heavy TV viewers. Knowledge Networks manages this challenge by actively recruiting non-Internet users into their panel and supplying them with netbooks and Internet access.

“[By augmenting with a non-Internet user sample] we are able to get non-Internet households including Hispanic, African American and C&D counties.”

• Knowledge Networks

This can also be done by augmenting an online sample with an RDD (random digit dial) phone/mobile sample.

iv. Dropouts

Challenge: Long/Unengaging and Badly-Phrased Surveys Lead To Non-Response or Poorly Considered Response

One of the benefits of OTS is getting detailed responses on the attitudes of consumers, but long surveys are onerous for respondents (survey fatigue is endemic and non-response is on the rise). There is growing cognizance among research vendors that there is a self-selected group of completers for these types of surveys, which likely does not reflect those actually exposed to the campaign (see point iii about “Freaks & Geeks”).

“Some of the questionnaires get long and tedious – this is an industry problem. Four years ago you could ask questions and people were happy to answer them, now it is not the case.”

• Media Executive


There is also discussion around how to ask questions about media exposure that will generate reliable responses and the difficulty of getting at actual exposure of consumers to the ads:

“Researchers must ask questions of consumers in a way that they can answer.”

• Research Vendor

“Sometimes when we describe TV viewing it can be too broad – e.g., do you normally watch TV on Wednesday night.”

• Agency Executive

“Over time it has become impossible [to get at actual exposure of ads] it would be a 10 hour survey. We've moved to recognition and ‘sources of awareness’ [such as] ‘where do you remember hearing about Brand X?’”

• Marketer

“We should be aiming for 10 minute maximum surveys – by the end, your eyes started to blur and you don’t know what planet you are on…we have proof that longer surveys have severe biases.”

• Media Executive

• Potential Best Practice: Take a Closer Look at Survey Design

Several vendors noted that survey design is one of the most overlooked aspects of the process. They attempt to make surveys fun and engaging, with game-like elements and highly developed graphics that encourage attentiveness – some surveys even begin by asking respondents to catch a virtual butterfly.

“The quality of questionnaire is very important…This does not get a lot of attention. It's so important that it is engaging and people are willing to fill it in…if I think about the way research is currently being done – there is so much room for improvement.”

• Research Vendor

The following are links to examples of MetrixLab’s approach to survey design:

o SiteTurner (browsing through websites): http://websurvey2.opinionbar.com/go.asp?s=p09_siteturner
o FocusTracker (online eye tracking): http://websurvey2.opinionbar.com/go.asp?s=p09_focustracker

• Potential Best Practice: Ask Consumers About Media Usage In A Way They Are More Likely to Be Able To Give Accurate Answers

3DAccountability notes that for OTS with television, consumers are often asked what network they were watching at a specific time. They have found that while consumers can tell you pretty reliably whether or not they were watching at a specific time, recognizing a specific network is much more challenging than recalling whether they watched a specific program. Consumers are loyal to programs, not networks, and with so many programs in syndication, asking them to recall where they watched the programming is not always a wise idea (although there are likely exceptions for networks such as ESPN and MTV).

Lab Testing:

To get at pure experimental design, some marketers are implementing lab based testing. These can either be in real lab settings owned by the media companies or agencies (e.g., CBS’s Television City in Las Vegas, ESPN’s lab in Austin, TX and IPG’s MediaLab in Los Angeles), rented focus group facilities or online lab-based studies.

Respondents are randomly recruited into different cells of exposure (e.g., no ad exposure, TV only, online only, TV and online). Generally they are shown the TV programming with the ad embedded and/or a webpage with the ad hardcoded (exposed cells). After a distraction task (e.g., feedback on the program itself) they are asked questions about brand recall, brand perceptions and purchase intent. Any lift compared to the control group is attributed to ad exposure [see chart 12].

Chart 12
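For illustration, here is a minimal sketch of the random-assignment step that makes lab testing a true experiment; the cell names follow the example above, while the participant IDs and seed are arbitrary.

```python
# Random assignment of lab participants to exposure cells. Participants are
# shuffled and dealt round-robin so cell sizes stay balanced. Cell names
# follow the example in the text; IDs and seed are arbitrary.
import random

CELLS = ["no exposure", "TV only", "online only", "TV and online"]

def assign_cells(participant_ids, seed=2011):
    """Shuffle participants, then deal them round-robin into the cells."""
    ids = list(participant_ids)
    random.Random(seed).shuffle(ids)
    return {pid: CELLS[i % len(CELLS)] for i, pid in enumerate(ids)}

assignments = assign_cells(range(200))   # 50 participants per cell
```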

Lab testing is favored over OTS by many agency and media executives interviewed because it is a true experimental rather than quasi-experimental design. Marketers and agencies appreciate it when media companies offer lab tests as part of larger advertising buys. Lab studies also have the benefit of no added inventory cost to the media vendor (although management of the lab, respondent recruitment and incentives can be costly). It is also possible to recruit hard-to-reach groups (young men, Hispanics, kids) through lab testing.

“True experimental design is not feasible in cross media unless it is in a lab environment.”

• Research Vendor

“We worked with Ipsos/OTX on forced exposure to a TV clutter reel and TV + online exposure for [program x]. The benefit was guaranteed sample, forced exposure and the ability to find kids.”

• Marketer

“We are looking to compare differences and you need to know the lift…you cannot control ‘what is lift over TV exposure’ using OTS.”

• Media Executive

The agencies/media companies managing their own labs include: CBS, ABC/ESPN, MTV and IPG. Of the vendors interviewed, only Ipsos/OTX offers lab testing.

Challenges: Cost, Media Complexity and the “Reality” Factor

While many of those interviewed favored the lab approach, they all acknowledged challenges, only some of which can be controlled for (such as cost). Several executives believed that lab testing displays a bias towards emerging and non-intrusive media. The most notable challenge is that the lab environment – no matter how much it is designed to look like the prototypical American living room – is an artificial one. Behaviors in the lab may not match the frenzied day-to-day experience of people consuming media while going about their lives.

“The only way to get at cross media is to do it in a lab setting…however, it is not cost effective.”

• Media Executive

“We struggle with lab studies because digital has so much complex targeting – you can't replicate the digital campaign in a lab environment.”

• Research Vendor

“The issue is face validity – some clients don’t believe a lab can replicate reality.”

• Media Executive

Those using labs encouraged the following for best results:

• Potential Best Practice: Ensure You Have Time to Impact the Campaign

Agency executives who use lab testing would prefer to use it to help drive creative, but the reality is that creative is often not ready in time, so lab testing ends up being used for last-minute tweaking rather than truly shaping the overarching direction of a campaign.

“The right way to use the lab is to test beforehand and put the best combinations out there.”

• Media Executive

“Labs can be more actionable if you run them before your campaign starts – but the reality is that timing is usually too tight for creative.”

• Agency Executive

“Very few people are testing before their campaign launches. Reasons are time pressure and no time for tweaking/editing.”

• Media Executive

• Other Potential Best Practices:

Of the vendors and end users employing lab testing, some of the most interesting and potentially valuable extensions to the concept include:

o In-lab use of eye tracking to see how people are processing ads (ESPN is currently doing this).
o Less expensive, quasi-lab experiments: Send DVDs with programming to participants and then conduct surveys online (Ipsos/OTX employs this method). This can be more cost effective than bringing people into a lab, and good for recruiting a broad geographic footprint or hard-to-find audiences.
o Forced online exposure (sometimes referred to as distracted exposure since respondents don’t know they will be asked about the ad), even to TV programming or mobile apps. This can be a more cost effective and faster approach compared to in-lab experimentation. In this case the respondent would be asked to go to a website where they would view television or online content/programming with the ads embedded and later answer questions about recall/brand impact.

Other Methodologies:

There are additional ways of getting at cross media research that are for the most part beyond the scope of this report but should be noted nonetheless. Each of these can be an important tool for effective brand and media management. In our recommendation section, we discuss best practices for leveraging ALL of these insights to paint a holistic picture for marketers.

Nielsen IAG: Testing Viewer Engagement with Ads

Through a panel that Nielsen manages (rewardtv.com), Nielsen IAG offers a syndicated research product that measures viewer engagement with television advertising and ad performance metrics such as recall, brand linkage, message linkage and likeability. Nielsen IAG has recently launched a cross platform measurement system: online ad exposure is passively measured through tagging panelists, and they can report on lifts between those exposed only to TV ads vs. those exposed to both TV and online ads.

Challenges: Panel Composition, Limited Metrics

Several of the media companies (and some agencies) subscribe to this service to help measure “breakthrough” of different ads and feel that it is valuable in helping to optimize television creative. However, there are several concerns with this methodology: the representativeness of the panel is questionable (respondents choose to answer trivia questions about TV programming for “cash and prizes”), the number of metrics is limited, and the offering is currently limited to TV ads in narrative programming plus passively measured online media.

Brand Tracking Studies

Many clients have continuous brand tracking studies that may include ad recall, brand perceptions and purchase intent.

Challenges: Granularity, Inconsistency with other Research Methodologies

Within these studies marketers may ask questions about media habits; however, brand trackers generally do not get granular enough to capture specific programs watched or websites visited (and when).

As several executives noted, the results of the brand tracker may not be cohesive with results from OTS. One advertiser mentioned that her OTS study showed a huge lift in key brand metrics, yet when the same firm ran its quarterly brand tracking study it did not see any significant lifts. Was it that the campaign did not have enough weight to break through? Or were the results of the OTS study incorrect?

“TV, print, online work really well – but may not show up on tracking results.”

• Research Vendor

“Our brand tracking has a disconnect with what we are seeing from the brand impact studies and that is difficult to reconcile.”

• Marketer

Media Mix Modeling

Many of the advertisers we spoke with (as well as agencies’ clients) are already conducting marketing mix modeling to help them determine relative ROI of different media investments and optimize their spending.

Challenges: Timing, No Understanding of Media Interaction

While MMM is an accepted tool at both the CMO and CFO level within many organizations, there are some perceived limitations: there is typically a lag time between media investments and analysis, and the timing of results may not align with media planning periods.

“Models are only done annually – it could be 15 months after program before reading results.”

• Agency Executive

Another limitation is that the results tend to focus on individual media in isolation; MMM is not ideal at picking up the synergy of multiple media (since the models use aggregate, not respondent-level, data).

“When you run MMM you get a main effect for each media – you don't get interaction. Over the years I have asked about interaction, but have only gotten half-baked answers.”

• Marketer
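
To make the interaction point concrete, here is a hypothetical sketch (the column names, coefficients and data are invented) of why respondent-level data can recover synergy where aggregate MMM inputs cannot: with individual exposure flags, an explicit TV-by-online interaction term can be estimated directly.

```python
# Hypothetical illustration: with respondent-level exposure flags we can fit
# main effects plus an explicit TV x online interaction ("synergy") term,
# which aggregate MMM inputs cannot support. Data are simulated.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
tv = rng.integers(0, 2, n)        # 0/1 TV opportunity-to-see flag
online = rng.integers(0, 2, n)    # 0/1 online exposure flag
# Simulate an outcome with a built-in synergy effect, for illustration only.
logit_p = -1.0 + 0.3 * tv + 0.25 * online + 0.4 * tv * online
y = (rng.random(n) < 1 / (1 + np.exp(-logit_p))).astype(int)

df = pd.DataFrame({"purchase_intent": y, "tv_exposed": tv, "online_exposed": online})
# "a * b" expands to a + b + a:b; the a:b coefficient is the interaction.
model = smf.logit("purchase_intent ~ tv_exposed * online_exposed", data=df).fit()
print(model.summary())
```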

Perhaps the greatest limitation of MMM in the digital age is that many marketing mix modeling firms analyze digital media investment at an aggregate level (e.g., all display impressions, search spend over time). Often smaller or emerging tactics (e.g., advertising on mobile media – both phones and tablets – and social media initiatives) are not readable in the models either because the spend level is too low to be significant and/or the modeling firms don’t have expertise in collecting and analyzing input data.

Heavy-Up Tests

A few marketers/agencies mentioned that they conduct in-market heavy-up tests to understand the relative contribution of different marketing tactics.

Challenges: Execution Complexity, Relevance in the Digital Age

We heard the following challenges from our interviewees: difficulty of finding exactly matched markets; difficulty of manipulating multiple media types with any confidence in results; and external factors that often play a part in sales impact but are not accounted for. While some media researchers relied much more heavily on these tests in the pre-Internet days, their interest in them – and their confidence – has waned due to the execution challenges and cost that come with an increase in the media types used for the tests.

“In pre-Internet days, clients would execute heavy-up test in specific markets. They were tightly controlled – they could understand impact in changes in weight levels. It is much more difficult [now]. It is difficult to change weight levels in different DMAs. I haven’t seen this rigorous design compared to days when it was simpler.”

• Media Executive

• Potential Best Practice: Connecting the Dots

There is universal recognition that there are strengths and limitations to different modes of Cross Platform Advertising Effectiveness Measurement. Some techniques are better for measuring low spend media tactics. Some only get to branding impacts while others are able to measure full ROI and marginal contribution. It is unrealistic to conceive of a single solution that will answer all of our cross media impact questions (even with the holy grail of single source measurement). Additionally, several marketers have found a disconnect between the results of their tracking studies (and/or MMM) and results of their Cross Platform Advertising Effectiveness Measurement studies and have had difficulty reconciling them.

Some advertisers, agencies and vendors are working on approaches that integrate the best elements of different methodologies. (And indeed, companies like comScore and Nielsen are continuing either to develop ancillary research modes or to purchase companies that enable them to do so.) Marketing mix modeling firms (like MMA and MSP) are offering OTS Cross Platform Advertising Effectiveness Measurement studies and integrating the findings into their forecasts (for references see: http://www.marketshare.com/connect/press-releases/139-marketshare-x1-partner-to-connect-cross-marketing-analytics-with-digital-targeting-and-optimization; http://www.mma.com/m360.html).

In a study conducted to support the launch of a campaign for a Dove Men’s product, Millward Brown, sister company Dynamic Logic and client Unilever worked together to create a research program (rather than a one-off project) that would determine how best to incorporate broad reach media and low reach, targeted media in the same research while remaining media neutral, especially when the media were delivered simultaneously. By working with all possible research techniques, including creative pre-testing, they were able to determine the relative contribution of various media [see chart 13] while also conducting deeper dives into digital media by individual element.

Chart 13

Their key takeaways from the program included: pre-test as many assets as possible to ensure optimization; drive continued association among the target by using very functional messaging; launch the campaign with a splashy initiative in multiple media; and work to understand how the synergistic effect of multiple media channels drives success.

• Potential Best Practice: Multi-Vendor Work is Possible with the Right Expectations, Coordination and Analysis for True Insights and Recommendations

If you are working with multiple vendors for tracking, MMM, Cross Platform Advertising Effectiveness Measurement and single media brand impact studies, get everybody in the same room. Talk about how you can calibrate and integrate results, and set minimum and more far-reaching expectations in advance. This does not have to be a single vendor solution as long as you orchestrate how you will look at and interpret results.

“A lot of our focus is on how to integrate the analytics. We design jointly upfront so that we can talk about any potential differences in advance.”

• Research Vendor

The new Cross Platform Advertising Effectiveness Measurement-focused company 3DAccountability sees one of its roles as integrating and translating research findings from various studies to ensure management has a single set of realistic recommendations. As founder Ethan Rapp says: “You’re not done after the research execution. You need to interpret and take action.” He cited an example of a company that commissioned a $300K plus study to help them understand how their products could appeal to women. “The answers were in the report, but they were so complex they needed a translator.”

“Vendors are so busy doing so many studies that they are focused on the implementation. It seems to be leaving strategic value-add on the table.”

• Research Vendor

“A study that is in a file cabinet is not worth the paper it is printed on.”

• Research Vendor

III. FUTURE OPPORTUNITIES FOR CROSS PLATFORM ADVERTISING EFFECTIVENESS MEASUREMENT:

In all of the interviews conducted, of both end users and vendors, there is little desire to anoint one solution or vendor as the standard for Cross Platform Advertising Effectiveness Measurement – the issues are just too complex. But end users feel a great sense of promise about the innovations and experimentation going on among the vendors interviewed here. Research vendors who are increasing panel size and richness while merging their own data with new data sets (many of which include purchase behavior) hold special appeal. Marketers, agencies and media executives are all interested in moving beyond current methodologies and getting to more single-source solutions. (Note: they do not believe that one all-media-usage single source is realistic, but that there may be panel-matching techniques to meet the needs of various categories of advertisers.) As those solutions are being developed, all parties want to test cost efficient tools that yield new insights rather than just validating what they’ve known for the past ten years about how media work together.

Each of the vendors interviewed did much less selling than they did explaining how they are finding work-arounds to challenges. They are acutely aware that the market is demanding a new level of Cross Platform Advertising Effectiveness Measurement and that there are great business opportunities (and challenges) posed by the rapid change in consumer behavior, especially in respect to video consumption and mobile devices. Each interview closed with a “wish list” of what the industry can do to foster better solutions for CIMM.

Below we have listed ten potential ideas for helping move the state of the industry regarding Cross Platform Advertising Effectiveness Measurement forward. We are bucketing these opportunities broadly into the following areas:

i. Support & Test Data Mash-Up Methodologies
ii. Improve Current Methodologies
iii. Validate/Establish Methodological Best Practices
iv. Tackle Missing Data Issues
v. Foster Shared Learning and Best Practices

i. Support & Test Data Mash-Up Methodologies

Opportunity 1: As a Proxy for Single Source, Support & Test Data Mash-up Methodologies

While nirvana for media researchers is a single source panel that includes passive measures for all broadcast and online media as well as purchase information, the reality is that there are always some gaps in bringing everything together. But as more and more data on media usage, ad exposure and purchase behavior is captured and becomes available, there are enormous opportunities to find connection points within the data sets. Marketers, agencies and media executives should look at the breakthroughs made in connecting various data sets to passively demonstrate ad effectiveness. To date, most of these studies have focused on one medium, but there is opportunity to expand them to cross-platform studies. For example, online advertising leaders formed partnerships with best-in-class data partners by vertical industry to glean new insights from “mash-ups” of their data sets. Yahoo! and Nielsen formed the ConsumerDirect product in 2003, which linked Yahoo! registered users and Nielsen Homescan panelists. This methodology now exists as a product within Nielsen that can be used to show ad effectiveness on buys that extend beyond Yahoo! The product was the first able to connect online ad exposure to retail CPG purchases. An innovation like this proved demand for such research, and other products were developed to meet the needs of various marketing categories. comScore’s Retail Impact product married credit card purchases with online ad exposure for specific retailers. Similar work has been done with Polk and JD Power data for the auto industry. Set top box vendors are also developing their own database matches: TRA has a 300K HH panel match for CPG, a 620K HH match for Rx and a 1MM HH match for auto.

While we are still moving toward the end goal of single source, we should be parallel-pathing analysis to find “best of breed” data providers who can help us merge digital ad exposure, TV and radio program viewership (or OTS ads), print readership as well as purchase (or other bottom-funnel metrics) for specific categories. With any type of “fusion” or overlap methodology, establishing sufficient sample size is a key concern. Identifying potential partners and establishing tests with CIMM members can jumpstart innovation in this nascent area.
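
As one concrete – and entirely hypothetical – illustration of the mash-up idea, the sketch below joins an ad-exposure panel to a purchase panel on a shared household key and reads purchase rates by exposure cell. The field names, key and data are invented, not any vendor's actual schema.

```python
# Hypothetical "mash-up" sketch: join an ad-exposure panel to a purchase
# panel on a shared (e.g., hashed) household key, then read purchase rates
# by exposure cell. All field names and data are invented.
import pandas as pd

exposure = pd.DataFrame({
    "hh_key":        ["a1", "b2", "c3", "d4", "e5", "f6"],
    "saw_tv_ad":     [1, 1, 0, 0, 1, 0],
    "saw_online_ad": [1, 0, 1, 0, 1, 0],
})
purchases = pd.DataFrame({
    "hh_key":         ["a1", "b2", "c3", "d4", "e5", "f6"],
    "bought_brand_x": [1, 0, 1, 0, 1, 0],
})

matched = exposure.merge(purchases, on="hh_key", how="inner")
# Purchase rate by exposure cell -- the basic read a merged panel enables.
print(matched.groupby(["saw_tv_ad", "saw_online_ad"])["bought_brand_x"].mean())
# Matched-cell size is the key sample-size constraint flagged above.
print(len(matched), "matched households")
```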

ii. Improve Current Methodologies

Several of the research vendors (as well as the top-echelon researchers among the media, agency and marketer interviewees) have been thinking about how to improve Cross Platform Advertising Effectiveness Measurement for the past decade. If the solutions were easy, they would already have been established. More work needs to go into determining the viability of moving forward with these tests, but there are several that we think are worth further exploration.

Opportunity 2: Simplify and Improve Survey Design

Validate which media questions are most closely predictive of actual OTS and whether recognition is a better metric than program/print exposure. Determine which branding metrics are most closely aligned with different KPIs (including sales). Finally, develop more engaging ways to ask respondents about their media habits than a lengthy survey. We need to conduct qualitative research and get feedback on the ease of taking surveys. We can also test different survey instruments (and survey lengths) using a single panel and determine drop-off rates, time to completion and comparison to passively observed ad exposure, as sketched below.
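
A hypothetical sketch of the instrument test just described: given a respondent-level fielding log (the field names and values are invented), compute drop-off rate and completion time by survey variant.

```python
# Hypothetical sketch of a survey-instrument test: compare drop-off rate
# and completion time across survey variants fielded to a single panel.
# Field names and values are invented for illustration.
import pandas as pd

log = pd.DataFrame({
    "variant":   ["long", "long", "long", "short", "short", "short"],
    "completed": [0, 1, 0, 1, 1, 1],           # 1 = finished the survey
    "minutes":   [4.2, 18.5, 6.1, 7.8, 9.0, 8.4],
})

summary = log.groupby("variant").agg(
    drop_off_rate=("completed", lambda s: 1 - s.mean()),
    median_minutes=("minutes", "median"),
    n=("completed", "size"),
)
print(summary)
```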

Opportunity 3: Develop Confidence in the Use of Modeling for Control Groups

Vendors are using Bayesian modeling for the development of sounder control groups for digital. How could these modeling techniques potentially work for cross media? CIMM could design experiments comparing traditional control group development for both on and offline media with modeled control groups to determine differences and potential biases.
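
The vendors' actual models were not disclosed, but to give a flavor of the Bayesian approach, here is one simple, hypothetical example: a Beta-Binomial posterior for the exposed-versus-control difference, which yields the probability that the campaign produced a lift rather than a bare p-value. The cell counts are invented.

```python
# One simple flavor of Bayesian lift estimation (illustrative only, not any
# vendor's method): Beta-Binomial posteriors for exposed and control cells,
# compared by Monte Carlo sampling. Cell counts are hypothetical.
import numpy as np

rng = np.random.default_rng(0)
exp_yes, exp_n = 78, 200   # exposed cell: successes / respondents
ctl_yes, ctl_n = 52, 200   # control cell

# Beta(1, 1) priors => posteriors are Beta(yes + 1, n - yes + 1).
p_exp = rng.beta(exp_yes + 1, exp_n - exp_yes + 1, 100_000)
p_ctl = rng.beta(ctl_yes + 1, ctl_n - ctl_yes + 1, 100_000)

lift = p_exp - p_ctl
print("posterior mean lift:", lift.mean())
print("P(lift > 0):", (lift > 0).mean())
```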

Opportunity 4: Model OTS Instead of Asking Overt Questions via Surveys

Nielsen (with the help of Wharton and Stanford professors) is exploring the use of various data sources – including metered media behavior from TV and online panels, demographics and psychographics, and online behavioral data – to develop probability curves for offline media exposure. If we can predict exposure relatively accurately through online (or other) data sources, Cross Platform Advertising Effectiveness Measurement surveys could focus on measuring changes in brand attitudes and intentions due to individual and combined media advertising exposure (based on probability models for offline media exposure).
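
As a hedged sketch of what such a probability model might look like – the features, coefficients and data are invented, and this is not Nielsen's model – a simple logistic regression can turn panel covariates into an exposure probability for each respondent.

```python
# Illustrative "modeled OTS" sketch (not Nielsen's actual model): fit a
# logistic regression that maps panel covariates to the probability that a
# respondent saw the TV campaign. Features and data are simulated.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
n = 2000
X = np.column_stack([
    rng.normal(3.0, 1.5, n),   # avg. nightly TV hours (metered)
    rng.integers(18, 70, n),   # age
    rng.integers(0, 2, n),     # heavy online-video user flag
])
# Simulated ground truth: saw the campaign at least once.
true_logit = -3.0 + 0.8 * X[:, 0] + 0.01 * X[:, 1] + 0.5 * X[:, 2]
seen = (rng.random(n) < 1 / (1 + np.exp(-true_logit))).astype(int)

model = LogisticRegression().fit(X, seen)
# Exposure probability for a new respondent profile.
print(model.predict_proba([[4.0, 35, 1]])[0, 1])
```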

Opportunity 5: Use Mobile Technology for OTS Data Capture

Mobile phones are to the ‘10s what TV was to the ‘50s: near-magical media devices, but with the unique quality of portability. They bring media into places it has not been accessed before, especially retail environments. Vendors testing surveying via mobile note the challenges of getting survey depth via feature phones (basic talk/texting devices) and the great possibility of surveying via smartphones. Knowledge Networks has been working with Techneos to develop a smartphone panel that captures usage via an app on the devices [see chart 14].

Chart 14

They see this as a way of getting beyond recall to “in the moment” activity. Mobile, due to its power among certain demographics, holds promise for the following:

o Young adult and Hispanic research: reach those not reachable through phone surveys or online panels. According to the CDC, as of May of last year 40% of people 18-34 had no landlines, only cell phones, and Hispanics are more likely to go online via mobile than through wired connectivity (according to Pew, September ’10).
o Longitudinal diaries of behaviors and attitudes
o Measuring advertising effectiveness across all digital platforms, including mobile and traditional media
o In-the-moment shopper insight and occasion-based research
o GPS location-triggered media and promotion capture research
o Mobile ethnographies and other tightly integrated quantitative-qualitative research
o App-based media usage behavior and attitudes

Perhaps the most encouraging aspect of the idea of using mobile for research data capture is the fact that consumers are open to it. Respondent counts echo the enthusiasm of the late ‘90s when online activity was novel enough that consumers were much easier to recruit for studies and panels.

“There is a need to capture digital and mobile media. We're very bullish on how you use technology in a way that is not self-serving. We got 65% of survey sample size requirements in the first five hours. You also get engagement from people you would not normally get (e.g., upscale males).”

• Knowledge Networks

iii. Validate/Establish Methodological Best Practices

Each of the vendors interviewed believes they have a slightly better mousetrap for Cross Platform Advertising Effectiveness Measurement. However, for the most part they did not provide validation of their methodologies. We think there are a few opportunities to improve our overall understanding.

Opportunity 6: Comparison of Lab Studies to OTS

With two major forms of Cross Platform Advertising Effectiveness Measurement now in market, studies should be developed to compare results from OTS to those of lab-based studies. This is not to determine whether either methodology is more or less accurate, but whether there are consistent biases by technique that can be accounted for. This could also result in guidelines to help agencies and marketers determine which mode is more appropriate for testing given the variables included in the cross media campaign.

The majority of marketers and agencies were using OTS methodology for measuring cross platform campaigns. Aside from the issue of having the creative in time for a lab-based study, some marketers question the face validity of a controlled media experiment. A test could be conducted running side-by-side studies for three to five advertisers comparing the same key brand metrics. We recommend including campaigns with TV, online and mobile elements (and testing all seven combinations of exposure – each medium alone, each pair and all three – in addition to a control group).

Opportunity 7: Comparison of Recruitment Methodologies

What are the biases in terms of response rates and demographics, as well as the tradeoffs (cost, advertising inventory required and the relative representativeness of the sample), of recruiting on site only, using a panel or using a hybrid approach? What are the differences between using a Flash-based cookie and a browser-based cookie for recruitment? These are all crucial experiments that need to be performed; one simple comparison is sketched below.
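
A hypothetical sketch of one such recruitment comparison: does the age mix of on-site intercept recruits differ from panel recruits? A chi-square test on the contingency table gives a first read. All counts are invented.

```python
# Hypothetical sketch: test whether the age distribution of on-site
# intercept recruits differs from panel recruits. Counts are invented.
from scipy.stats import chi2_contingency

#                18-34  35-54  55+
onsite_counts = [220,   310,   95]
panel_counts  = [180,   365,  160]

chi2, p, dof, expected = chi2_contingency([onsite_counts, panel_counts])
print(f"chi2={chi2:.1f}, dof={dof}, p={p:.4f}")  # small p => demographic skew by method
```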

iv. Tackle Missing Data Issues

Opportunity 8: Use Collective Influence to Get At Key Mobile Data

Research vendors and Cross Platform Advertising Effectiveness Measurement end users acknowledge there are holes in US media usage data that are becoming even more significant with the boom in wireless media usage. Apple has to date not allowed panel-based companies to meter usage of its various devices. Though Macs reached 10.5% of the PC market as of Q3 ’10 according to Gartner and IDC, these users are not a part of the comScore or Nielsen panels. Apple also now represents 25% of all smartphones in the US market according to comScore data from December of last year. The touchscreen tablet category, at this point in the evolution of the device, is driven almost entirely by Apple (the iPad 2 alone is expected to account for 20 million units sold, or 83% of US tablet sales, in 2011 according to Forrester Research). Apple owners are early adopters who display behaviors that will trickle down to the rest of the US population. Understanding Apple device usage is critical to understanding how the mobile media revolution will impact all of media usage.

An organization like CIMM can use its collective influence to encourage Apple to become a more integrated piece of the media ecosystem rather than its own walled garden.

Short of Apple changing its usage monitoring policies, Google is the other mobile research alliance to pursue. Google mobile devices are now outselling Apple’s in the US market, and both the Android operating system (for mobile phones) and Honeycomb (for tablets) are as close as possible to the user experience of Apple devices and thus a viable workaround for integrating mobile device usage. (It should be noted that according to comScore and Nielsen, Android users are younger and less affluent than Apple’s, likely due to lower pricing structures, but the basic functionality of the devices themselves is similar.) Apple and Google are currently engaged in a mobile development arms race of sorts, and alliances with one may help increase the receptivity of the other to inclusion in research. They are two of the most highly capitalized global companies that have morphed from technology to media companies and should be more closely integrated into the larger media ecosystem over time.

v. Foster Shared Learning and Best Practices

The vendors and end users interviewed here acknowledged the strengths and limitations to all approaches. Some of the most impressive presentations submitted were around how marketers could triangulate their media mix modeling, brand tracking, Cross Platform Advertising Effectiveness Measurement and single media brand lift studies if they are using a single vendor or multiple vendors. There is a great opportunity with CIMM members for shared learning in a few key areas:

o State of the Art Current Cross Platform Advertising Effectiveness Measurement Methodologies
o Cross Platform Advertising Effectiveness Measurement in Context: The relationship to other brand and sales impact tracking and research explored

Opportunity 9: Cross Platform Advertising Effectiveness Measurement State of the Art Roadshow/Webinar Series

While the notion of education about Cross Platform Advertising Effectiveness Measurement for the greater good may seem basic, the need is there. The majority of the materials submitted by vendors for this project (over 1,000 pages in total) would be complex for someone with a typical agency level of understanding of media research, and can be vague about the specifics of exactly what sort of research is conducted and what outcomes are produced.

Cross Platform Advertising Effectiveness Measurement vendors are obviously competitive and, in their quest for product differentiation, have created their own research language. Various companies interviewed had been re-branded in the past year, bought and put under new company umbrellas and re-named, and have distinct names for everything from their panels to individual analytical tools that may not have much market awareness. One company offering Cross Platform Advertising Effectiveness Measurement is called SymphonyAM (until recently FactorTG), while for one of its competitors (InsightExpress) Symphony is the name of its suite of tools. There is also a social media measurement company owned by Kantar Group called Cymfony. XMOS is associated with a past IAB/ARF research initiative and is also the current name of the Cross Platform Advertising Effectiveness Measurement tool from MetrixLab. Millward Brown has trademarked the term CrossMedia Research™. Measurement outcomes are also not consistently named, with some companies reporting ROI (return on investment) vs. ROMI (return on marketing investment) vs. ROMO (return on marketing objectives) vs. ROO (return on optimization), and some measuring CPBE (cost per branding effect).

For a glossary of the nomenclature, see Appendix B, where each vendor was asked to fill in a landscape grid with specific attributes such as product naming, panel size and derivation, sales data match partners and techniques. A simple tool like this, plus an overall presentation that gets updated as new techniques and vendors come into market and new alliances are made – along with the normative database – would go a long way towards encouraging greater understanding of the capabilities of Cross Platform Advertising Effectiveness Measurement. These materials could be delivered in person through an agency roadshow-type format or through a series of webinars organized by CIMM.

“There is such a gap in knowledge about Cross Platform Advertising Effectiveness Measurement…within many companies. Through CIMM, can we simplify and educate? We could have a good step forward.”

• Marketer

Opportunity 10: Develop Best Practices Integrating Cross Platform Advertising Effectiveness Measurement into Overall Research Programs

Most marketers and agencies are also engaged in marketing and media mix modeling, brand tracking studies and one-off brand impact studies for emerging media. It is sometimes difficult to reconcile results from Cross Platform Advertising Effectiveness Measurement with these other research efforts, diluting the impact of the studies. CIMM can develop a roadmap for building a holistic research plan that incorporates brand tracking, MMM studies and single media brand impact or ROI studies. This can include upfront planning to understand the role of each study, how to effectively work with multiple vendors and best practices for triangulating results to come up with actionable implications.

Although there are many challenges to cross platform advertising effectiveness research, there is already a great deal of innovation underway and many more opportunities (including the ones outlined above) to help us increase our understanding of the relative and synergistic effect of each medium. We are bullish on both improving current methodologies and pioneering new techniques and research “mash-ups”. Making marketers and agencies comfortable about shifting their budgets is essential as the media landscape continues to evolve in complexity.

As eloquently stated by one of the agency executives interviewed for this project:

“This has to be a decade of ‘test and learn.’”

APPENDICES:

A. Cross Platform Advertising Effectiveness Measurement Vendor List:

a. 3DAccountability

b. comScore

c. InsightExpress

d. Ipsos/OTX

e. Knowledge Networks

f. Marketing Evolution

g. MetrixLab

h. Millward Brown/Dynamic Logic

i. SymphonyAM (formerly FactorTG)

j. The Nielsen Company

B. Cross Platform Advertising Effectiveness Measurement Landscape:

Cross Platform Advertising Effectiveness Measurement Landscape – OTS Capabilities

[For each vendor, the landscape grid recorded: name of cross media product; recruitment methodology; time from online ad exposure to survey invitation; own panel/partner/size; sampling approach; best practices for measuring offline media OTS; media measured; other data sources integrated/used; links to online and offline activity; cost per study; minimum campaign size; and additional information. The entries are summarized below.]

comScore – Brand Survey Lift – Cross Media: Recruits only people who have been exposed to an ad, from site and survey panels. Time from exposure to invitation varies from immediately (site recruitment) to weeks (survey panel). Own panel: yes – 2 million members worldwide, a census-calibrated metered panel (not just a survey panel). Sampling is based on online ad exposure. Offline best practice: the more specific the questions about media consumption where an offline ad is known to have appeared, the better; the lift algorithm takes recency into account, so varied exposure times strengthen the calculation. Media measured: mobile, online display, online video, TV, print, radio, cinema, outdoor. Other data sources: free in-flight campaign audience measurement (GRPs, reach, frequency, demos); partners include Dunn Humby, IRI, IXI, Kantar and Experian. Links to online/offline activity are based on the panel and pixels, with some modeling. Cost per study: $33K-$200K; minimum campaign size: generally 40 million impressions and up. Additional: Smart Control eliminates issues of test not matching control by modeling test-only responses, and removes contamination from people who appear in the control but deleted their exposure cookie; survey non-response bias is accounted for by comparing those who saw the campaign with those who took the survey; reach/frequency and demos are delivered off a metered panel rather than an unweighted survey panel, which would radically skew campaign demos; campaigns include Share of Choice, a simulated purchase exercise further down the funnel than purchase intent and highly correlated with changes in market share and sales.

InsightExpress – Symphony: Recruitment via the Ignite Panel/Warehouse, which captures exposure and delivers a survey invite based on exposure(s), supplemented with sample from a national representative panel – a hybrid approach. Time from exposure can be immediate or any chosen time frame. Own panel: yes, approximately 10M members, plus the Ignite Network. Sampling is based on online exposure and the media plan for offline media; ad server segmentation is used on occasion for special targets such as Hispanics. Offline best practices: print – cover recognition; radio – media plan plus questions on daypart, programs and ad recall; TV – media plan on viewers, daypart, programs and stations, with market-based approaches for heavy-up markets; OOH – market-based questions about locations visited; OTS question batteries for other offline media are based on the media plan, with frequency and time-series analysis also employed. Media measured: all media, including social (WOM via a Scout Labs partnership, viral via Meteor) and mobile (based on ad exposure); online behavior via Data Sync, Time Series, Omniture or Compete. Other data sources: third-party match for sales (IRI) with online exposure. Link to online activity: IX unique identifier; a permissioned Flash cookie allows accurate assignment of exposure and longitudinal diagnostics. Link to offline activity: for panel members. Cost per study: $50K-$250K; minimum: 10M impressions for online, with reach driving sample requirements for other media. Additional: Symphony and the Ignite Panel allow for real exposed sample with no intercepts; pre (control) and heavy/medium/light exposed designs, placebo controls and matches to customer databases enable analysis of the impact of online (and offline) advertising on offline sales at a very granular level, providing more actionable insights than brand trackers and significantly enhancing mix models.

Cross Platform Advertising Effectiveness Measurement Landscape – OTS Capabilities continued

Ipsos/OTX – Live|Stream (developed by OTX): Uses Ipsos’s multi-sourced sample; respondents are recruited from the Ipsos I-Say Panel (c.600k households in the USA), other panels (e.g., Research Now), social networks and reward sites, screened for multiple studies and allocated to a qualifying study by algorithm, with the sample profile controlled to match the campaign’s target audience; approximately 2.5 million unique, opted-in (to market research) consumers are screened per month, allowing recruitment of even very niche audiences. Exposure is recorded in a cookie (first and last exposure via time stamping) read from the tagged online creative, passively identifying those exposed and not exposed. Offline best practice: a combination of claimed viewership of a specific TV show, readership of a specific magazine issue, etc., with recognition of the creative for that medium. Media measured: TV, outdoor, print, radio, banners & rich media, social media, pre-rolls & overlays, mobile, branded content, in-theater. Other data sources: client-provided information to generate a sample size of 150 respondents in the test cell. Cost per study: $15K-$100K; minimum: 2MM impressions per publisher. Additional: access to a very large, high-quality online sample allows zooming in on the target audience for relatively low-weight niche campaigns and evaluating cross-platform campaigns even where the reach of some media combinations is small; the approach is not based on pop-ups, which yields a better quality sample, a longer and more comprehensive survey, faster turnaround and a better user experience for consumers.

Knowledge Networks/3DAccountability – Marketing Performance Optimization (in alliance with 3DAccountability): Recruitment via panel, with optional in-person and database recruitment; survey timing is randomized via the panel. Own panel: yes; however, at this time the 50,000-member KnowledgePanel is not utilized for OTS cross media work. Sampling: ad server segmentation (if available), third parties such as Ignite by InsightExpress to identify exposed/not exposed, and media plan based sampling plans. Offline best practices: TV – program specific; magazine – cover image; online – cookie based; radio – daypart/station; OOH – OTS by market or region. Media measured: all, including social, mobile, event, in-person and sponsorship registration. Other data sources: KN National Shopper Lab (NSL) and national panels such as Ignite; link to offline activity via the KN-owned NSL. Cost per study: $125K-$500K. Additional: the belief that “best practice” is necessarily defined by using more than just one cross media methodology.

Cross Platform Advertising Effectiveness Measurement Landscape – OTS Capabilities continued

Marketing Evolution, Inc. – ROMO (Return on Marketing Objectives), patent-pending methodology: Examines the target audience and media plan for a given campaign and selects the most appropriate recruitment methodology for each client – most typically an online panel, but phone interviews or mall intercepts for more focused or specialized targets (e.g., unacculturated Hispanics) and in-person interviewing at measured events. ROMO includes respondents with a range of “time since exposure,” from immediately after exposure to several months, which shows how quickly decay occurs for each medium; different media combinations are also examined to assess synergistic effects and to make recommendations on optimal media flighting and weight. Own panel: no – the firm strongly believes in utilizing the panel that provides the best representation of each client’s specific target audience. Sample is tracked continuously before, during and after the campaign to ensure even collection of control and exposed groups, with oversampling where necessary; control and exposed groups are made similar not only in demographics and psychographics but also in media habits. For digital ads, the media plan is reviewed and audience segmentation through the ad server is recommended as best practice, with alternative approaches (forecasting to zero exposure, or examining matched “non-exposed” groups) where more appropriate to how the campaign was planned; proprietary techniques assess media effectiveness on an apples-to-apples basis. The ARF, in its review of the methodology, states that the “application of experimental design is consistent with the highest standards of measurement” and that it “uses a true experimental design with proper control thereby avoiding the common problem of comparing exposed versus non-exposed and other common problems associated with ‘pseudoexperimental’ design approaches.” Media measured: all, including viral marketing, social networks, mobile, email, web video, video games, pizza boxes, offline and online events, concerts, rodeos, sponsorships, promotions, websites, web ads, ads in trains and planes, stunts, street teams, roving event busses, TV, magazine, radio, OOH, newspapers, FSIs, internet banners and more. Other data sources: online and offline purchase data from many sources, including client databases, print titles, websites, TV sponsorships, Catalina, Nielsen and Experian. Cost per study: varies greatly depending on target audience, granularity and timeframe; anything can be measured – it is a cost/benefit trade-off. Additional: beyond data tables of control/exposed differences, the data is transformed into actionable recommendations for marketers and their agencies; patent-pending frequency-curve analysis shows, for any given metric and medium, the point of diminishing returns – letting clients know when the next dollar spent in a medium stops translating into lift – and informs media mix optimization; ROMO also delivers recommendations around sub-media (specific networks, genres, etc.), motivations, messaging and creative, and examines what a person must believe first in order to make a purchase.

MetrixLab – XMOS: Recruitment via panel (1M plus) covering print, mobile, TV, radio, DM, search, online and social media, plus third-party sites (i.e., banner placement for online and social media). Time from online ad exposure to survey invitation: 3-5 days. Sampling: media plan, online exposure and address zip codes (for DM). Offline best practice: OTS behavioral questions. Media measured: TV, radio, online, print, mobile, search, DM, social media, YouTube. Other data sources: client data. Link to online activity: proprietary tagging technology. Cost per study: not supplied; minimum: 2-5MM impressions.

Cross Platform Advertising Effectiveness Measurement Landscape – OTS Capabilities continued

Millward Brown/Dynamic Logic – CrossMedia Research™: Hybrid recruitment (panel and web intercept survey invitation); time from exposure to invitation is immediate (web intercept) or variable (panel). Own panel: no. Sampling is based on online ad exposure and the media plan; ad server segmentation can be used when necessary. Offline best practice: frequency estimation based on media usage (e.g., issue readership and program/channel/genre/daypart viewing, customized to the media plan). Media measured: TV, radio, magazine, newspaper, online (banners, rich media, video), out of home, cinema, in-store, product sampling, product integration, mobile, tablet and social; mobile/tablet can be measured directly from their footprint, and partnerships with Cymfony and Trendrr cover social media. Other data sources: online browsing/search/purchase behavior (Compete) and offline purchase (IRI, Kantar Retail ShopCom). Cost per study: $60K to $250K+; minimum: generally 50MM+ impressions online and/or a minimum 20-25% reach per medium within the measured target, depending on campaign size, target, product/category incidence and reach. Additional: reach-based and de-reached individual-level analysis of media impact; cell-based and modeling analytics provide effectiveness and ROO; a path-based simulator enables model-based scenario testing; can be combined with standalone copy testing for individual media (Link) or combined media (Link360), and creative evaluation questions can be included within the CrossMedia survey.

The Nielsen Company – IAG: Survey, 1-7 days from exposure to invitation. Own panel: yes – 5k per day, 130k uniques per month. Sampling is based on online ad exposure. Offline best practice: ad recall. Media measured: TV, online. Other data sources: modeled prior viewing (TV/PC and TV meter data) and Claritas Prizm; link via Nielsen Tag. Customers: top advertisers and media companies. Cost: annual pricing; minimum: 10M impressions. Syndicated.

The Nielsen Company – BrandEffect: Survey, 1-3 days from exposure (note: will change to 24 hours), in partnership with Facebook. Sampling is based on online ad exposure. Media measured: internet ads. Available as an add-on (with Response Effect, Sales Effect, etc.); uses the Nielsen Tag and the Facebook database. Customers: top agencies, advertisers, ad networks and publishers. Cost: $4.5K-$14.5K per week; minimum: 1MM unique impressions per week, minimum cell size 400. Will launch sustained measurement in Q3. Addresses the following flaws in typical online brand impact research: cookie deletion (writes exposure to the Facebook record instead of relying on cookies); recency (complete control of sampling time post-exposure); demo verification (matches responses to Facebook demos, 600M users); and a more representative sample due to a short-form survey with higher response and completion rates than comparable methods.

Cross Platform Advertising Effectiveness Measurement Landscape – OTS Capabilities continued

The Nielsen Company – Monitor Plus: No recruitment; measurement is passive, with no survey (reporting happens afterwards) and no panel. Coverage/tracking varies by market and medium, between 2 and 6 weeks. Offline best practice: ad recognition (passive listening and identification). Media measured: all media. Other data sources: NPower, AdRelevance, TV/Internet Fusion data and MediaAffects (which marries ad spend to projected sales). Cost varies by client, campaign type and the granularity of data needed; measures all campaigns, with no minimum.

The Nielsen Company – Watch Effect: Uses the Nielsen panels (cross-platform panel of 9,000 HH); no survey. Based on online and TV ad exposure and TV tuning, with an exposed/unexposed test; TV tune-in is based on the Nielsen panel and Monitor Plus. Media measured: TV, online. Partners: Monitor Plus partners, Nielsen Tag. Cost: $85K+, based on project scope; minimum: 40MM impressions necessary.

The Nielsen Company – Response Effect: Uses the Nielsen panels (cross-platform panel of 9,000 HH); no survey. Based on online and TV ad exposure and TV tuning, with an exposed/unexposed test. Media measured: TV, online. Other data sources: Monitor Plus, Claritas Prizm, Nielsen Tag partners. Cost: $65K+ for online and TV, $15K+ for online only, based on project scope; minimum: 40MM impressions.

The Nielsen Company – Media Inventory Optimization: Online/TV panel fusion; no survey (time to reporting varies). Own panel: yes, with standard Nielsen panel sampling methodology; online impressions are used to determine reach and frequency. Media measured: TV, online, magazines, mobile (will start late 2011). Other data sources: links to MRI, Homescan, Moviegoers, Prizm. Cost: $40K+ minimum; no minimum campaign size.

The Nielsen Company – Cross-platform Post Buying Reporting: Online/TV panel fusion; no survey (time to reporting varies). Own panel: yes, with standard Nielsen panel sampling methodology and the Nielsen Tag. Media measured: TV, online. Other data sources: Prizm. Cost: $12K+ minimum; no minimum campaign size.

Cross Platform Advertising Effectiveness Measurement Landscape – Lab Testing Capabilities

[The grid recorded each vendor’s distinctive attributes, types of experiments conducted, recruitment techniques, measured media, creative/copy testing capability and cost per study.]

Ipsos/OTX: Three elements mark out Ipsos/OTX: a) use of ‘distracted exposure’ in online surveys, allowing evaluation of any media (or combination of media) in an experimental design; b) work with Pointlogic that bridges the gap between campaign testing and media planning; and c) ground-breaking work with Dr. Robert Heath (author of The Hidden Power of Advertising) to ensure both the emotive and the cognitive impact of each element of a campaign are evaluated. Types of experiments conducted: 1) online distracted exposure (the respondent is not aware that they are taking part in a survey about advertising); 2) online forced exposure (typically mid-way through a test using the distracted exposure approach, used for feedback on the actual creative rather than a measure of its impact on brand attributes); 3) DVDs sent via mail (for TV ads tested within TV shows) with a follow-up survey controlled to take place 24 hours after exposure; and 4) in-person lab tests, with respondents invited to facilities in NYC and LA featuring a living-room set-up for testing TV, online, mobile, print and/or radio. Recruitment uses Ipsos’s multi-sourced sample (developed by OTX): the Ipsos I-Say Panel (c.600k households in the USA), other panels (e.g., Research Now), social networks and reward sites; respondents are screened for multiple studies, allocated to a qualifying study by algorithm, and the sample profile is controlled to match the campaign’s target audience, with approximately 2.5 million unique, opted-in consumers screened per month, allowing recruitment of even very niche audiences. Measured media: TV, outdoor, print, radio, banners & rich media, social media, pre-rolls & overlays, mobile, branded content, in-theater. Creative/copy testing: yes. Cost per study: $15K-$200K.

Cross Platform Advertising Effectiveness Measurement Landscape – Other Research Capabilities Ability to Link to Ability to capture Name of Cross Creative/ Copy Link to Online capture Research Type Distinctive Attributes Media Measured Offline Key Metrics Reported TV, print Key metrics online social and Cost Per Study Media Product Testing Activity alternative TV Activity streaming video viewing comScore Syndicated and comScore Multi- Single-source measurement of Mobile, Internet, TV, NA comScore- AT&T TV, Duplicated, Unduplicated and VOD, Streaming, Online video and Yes differentiate did not supply Custom Screen TV, Internet and mobile phone, Tablets provided Many CRM Exclusive Reach, Frequency, Average online video, online video between content Consumer Panel Tablets partners Audience, Minutes, GRPs, TRPs, some time- advertising reach, and ads for online Impressions shifted viewing minutes, sessions, video; capture Website and website social media advertising reach, minutes, page views, GRPs InsightExpress Media Symphony Ignite Panel and Warehouse; All media- Yes; Creative Buzz: (Scout Sales: (IRI) Primary brand metrics. Sales included VOD, time- All primary and Yes - Through $50K - $250K Measurement: Continuous tracking; Creative TV/Print/Radio/Onlin Pre-test and Labs) Viral for certain categories and products. shifted viewing secondary brand Ignite we can (sample not Continuous tracking, diagnostics; Media efficiency, e/Social/Mobile/OO Copy Testing can Effects: Frequency and time series analysis. and streaming metrics. Recall/Ad measure ad effect included). campaign specific Optimization. Norms available H/Hulu and many be measured. (Meteor Individual media impact and are all measured. Awareness. on Social others. Solutions) combination media effects; across the Networks and Data Sync tag: funnel. Cost per Branding Effect and measure the (Omniture) Optimization. content. Knowledge Networks Syndicated MultiMedia Single source cross media survey TV, Radio, No Time Spent and Average Daily Cume. Yes Time Spent and No $10K-$100K Mentor® Newspaper, TV reported in total, broadcast, cable Average Daily Magazine, Cume.Online Internet(including Activities (including social Media), social and on-line Mobile(both print publications smartphone and among others) tablet), Videogames, Cinema advertising MetrixLab Custimized Tracking Track 360 More visual and engaging TV, Radio, Online, Yes propriatery ? All KPI's needed by client - including Yes All KPI's needed by yes did not supply (including cross questionnaires, tagging Print, Mobile, Search, tagging reach, frequency, funnel metrics, client - including media deep dives) technology - online recruitment DM, Social Media, technology cross media reach and frequency and reach, frequency, not directly after exposure - YouTube Ad Evaluation funnel metrics, cross which makes all media media reach and comparable frequency and Ad Evaluation


C. Interviewee List:

Agencies

Carat: Mike Hess, EVP, Research Insights and Marketing Sciences

Media Storm: Judy Vogel, Director of Insights and Analysis

Mediaedge:cia: Theresa LaMontagne, Managing Director, Analytics and Insights

Omnicom Media Group Holdings: Ian Akehurst, Business Intelligence Director, OMD; Adam Gitlin, U.S. Director, Digital Insights & Analytics, OMD; Scott Hagedorn, Chief Executive Officer, Annalect Group; Joe Masucci, U.S. Director, Business Intelligence, OMD; Mark Reggimenti, U.S. Director, BrandScience

Starcom MediaVest Group: Cortney Henseler, Human Intelligence Research Director; Helen Katz, SVP, Research Director; Todd Kirby, VP, Director of Strategic Research, Spark Communications; Emma Pop, VP, Director of Analytics; Kate Sirkin, EVP, Research

Universal McCann: Anant Mathur, SVP, Business Insights; Jim Oliver, SVP, Advanced Analytics


Marketers

Consultant: Christyne Dzwierzynski, Principal, CDZ Consulting, LLC (formerly of Unilever)

ConAgra: Cindy Neumann, Category Manager; Director, Grocery Consumer Insights

Microsoft: Chad Davis, Marketing Research Director

PepsiCo North America: Jim Totten, Director, Marketing Analytics

Procter & Gamble Co.: Greg Ross, Director, Global Media Innovation

Media

CBS Corporation: Anne Claudio, VP, Research, CBS Interactive; Dave Poltrack, EVP, Research and Planning

ESPN: Artie Bulgrin, SVP, Research and Analytics

Google: Jim Dravillas, Head of Advertising Research

MTV Networks: Beth Coleman, VP, Audience Research; Colleen Fahey Rush, EVP, MTV Networks Research

Time Inc.: Rory O'Flynn, Executive Director, Digital Research

Turner Broadcasting System, Inc.: David Kudon, Chief Methodologist and Corporate VP; Susan Nathan, Corporate VP, Media Currency


Research Vendors

3DAccountability: David Gantman, Partner; Ethan Rapp, Partner

comScore: Harvir Bansal, SVP, Survey Research; Josh Chasin, Chief Research Officer; Joan Fitzgerald, VP, Television Sales and Business Development; Anne Hunter, VP, Advertising Effectiveness Products

Consultant: Paul Lavrakas, Ph.D., Research Methodologist and Research Psychologist, IAB Consultant

InsightExpress: Dan Campbell, VP; Mark Ryan, SVP, Chief Research Officer; Jerome Shimizu, VP, Data Science

Ipsos/OTX: Ian Wright, EVP, Corporate Development

Knowledge Networks: Pat Graham, Chief Strategy Officer

Marketing Evolution: Rex Briggs, Founder & CEO; John Matthews, Chief Technology Officer

MetrixLab: Lucas Hulsebos, Global Business Unit Director, Media & Advertising; Kurt Schramm, Business Development Manager

Millward Brown/Dynamic Logic: Bill Havlena, Vice President, Client Solutions, CrossMedia Solutions; Bill Pink, EVP, Strategic Services and Marketing Sciences

SymphonyAM: Manish Bhattia, Chief Executive Officer; Haren Ghosh, Chief Analytics Officer and GM, Cross Media Solutions

The Nielsen Company: Scott McKinley, EVP, Global Advertising Effectiveness Product Lead


D. The Authors:

• Michele Madansky, Ph.D.

Michele Madansky, Ph.D., is an industry expert in advertising effectiveness research. She began her career in BBDO's Marketing Sciences Group and was general manager of Grey Insight Partners. Her dissertation, "Traditional Ads in an Interactive Environment," was the first scholarly paper comparing TV and print ads to online advertising. Michele was VP, Global Marketing Research at Yahoo! from 2003 to 2007 and has been an independent consultant for over three years. Her clients have included ESPN, GSN, the New York Times and the Huffington Post. Michele is a lecturer at the University of California, Berkeley's Haas Graduate School of Business and a frequent industry speaker.

• Kathryn Koegel

Kathryn Koegel is a media and marketing consultant who has worked in online, print, TV and, most recently, mobile. She is also a regular contributor to Ad Age on topics involving interactive advertising and mobile marketing, including a quarterly report series being published throughout 2011. She was VP of Marketing for one of the first ad networks, Phase2Media, and Director of Research & Industry Development for DoubleClick, where she created that company's first industry trend reports. She has worked with ad effectiveness research since early in her career, at media companies such as Conde Nast, US News & World Report and Gemstar TV Guide Television. Through her consultancy, Primary Impact, she works with media and interactive marketing companies such as The Weather Channel, Placecast Mobile, FreeWheel TV, Audience Science, Collective Media and the National Newspaper Network. Her primary research work has been accepted and published by the ARF and ESOMAR.
