Electronically Filed Docket: 19-CRB-0005-WR (2021-2025) Public Version Filing Date: 07/31/2020 03:11:52 PM EDT

Before the UNITED STATES COPYRIGHT ROYALTY JUDGES Washington, D.C.

In the Matter of:

Determination of Rates and Terms for Digital Performance of Sound Recordings and Making of Ephemeral Copies to Facilitate those Performances (Web V)

Docket No. 19-CRB-0005-WR (2021-2025)

SOUNDEXCHANGE’S UNOPPOSED MOTION TO SUBMIT THE CORRECTED WRITTEN REBUTTAL TESTIMONY OF CATHERINE TUCKER

SoundExchange respectfully moves for leave to submit the Corrected Written Rebuttal Testimony of Catherine Tucker (the “Corrected Testimony”) for the purpose of correcting two missing redactions. The other participants in the proceeding have consented to the submission of the Corrected Testimony. The Corrected Testimony is appended to this motion as Exhibit A (clean version) and Exhibit B (redline showing changes).

The corrections reflect two instances of the phrase “[ ]” that should have been redacted in Professor Tucker’s Written Rebuttal Testimony but were inadvertently left unredacted. The two missing redactions are identified below:

• “However, as discussed above in Section III.A.2, many of these users would become aware of Pandora’s loss of content from [ ] in reality, and would respond differently with this awareness.” Corrected WRT ¶ 34.

• “Second, Pandora’s loss of access to [ ] catalog would likely make it harder for Pandora to retain customers.” Corrected WRT ¶ 96.

SoundExchange respectfully requests that the Judges exercise their discretion under 17 U.S.C. § 801(c) to accept the Corrected Testimony, and submits that such relief is legally and factually justified. See Docket No. 19-CRB-0005-WR (2021-2025) (Web V), Order Granting Sirius XM Radio Inc.’s and Pandora Media, LLC’s Unopposed Motion to Submit the Corrected Written Direct Testimonies of David Reiley and Carl Shapiro at 1 (Dec. 5, 2019) (finding “that good cause exists” to grant unopposed motion to correct testimony); Docket No. 14-CRB-0001-WR (2016-2020) (Web IV), Order Denying Licensee Services’ Motion to Strike SoundExchange’s “Corrected” Written Rebuttal Testimony of Daniel Rubinfeld and Section III.E of the Written Rebuttal Testimony of Daniel Rubinfeld, and Granting Other Relief (“Web IV Rubinfeld Order”) at 7 (Apr. 2, 2015) (“While acceptance after the due date of a correction to an otherwise timely filing would appear to fall squarely within the discretion afforded to the Judges under Section 801(c), the appropriate procedure for asking the Judges to exercise that discretion is to file a motion stating the relief requested and the legal and factual bases for the Judges to grant the relief.”).

First, the other participants have consented to submission of the Corrected Testimony. Second, the correction introduces only redactions of the two phrases identified above to Professor Tucker’s testimony. Neither of the proposed corrections affects Dr. Tucker’s analysis or conclusions. Third, as the Judges have ruled in past proceedings, it is in their and the participants’ interest to have a full, correct, and complete record. The Judges have accepted corrected filings in this proceeding and past proceedings in furtherance of that interest. See, e.g., Web IV Rubinfeld Order at 9 (“Consistent with their prior rulings, the Judges are disinclined in this peculiar instance to allow their procedural rules to prevent them from obtaining a full evidentiary record. The Judges conclude that the interests of justice are served by examination of a more complete, informed expert record. . . . [T]he Judges invoke section 801(c) of the Act, which provides them with discretion to ‘make any necessary procedural or evidentiary rulings in any proceeding under this chapter.’”).

If the Judges accept the corrected testimony, SoundExchange requests that the Judges remove from eCRB the public version of Volume II of SoundExchange’s Written Rebuttal Statement (Docket No. 20257), filed January 14, 2020, which contains Prof. Tucker’s written rebuttal testimony. SoundExchange will then upload a new public version of Volume II containing Prof. Tucker’s Corrected Written Rebuttal Testimony.

CONCLUSION

For the foregoing reasons, SoundExchange respectfully requests that the Judges grant its Unopposed Motion to Submit the Corrected Written Rebuttal Testimony of Catherine Tucker.

Respectfully submitted,

Dated: July 31, 2020

JENNER & BLOCK LLP

By: /s/ Previn Warren
David A. Handzo (D.C. Bar No. 384023)
[email protected]
Previn Warren (D.C. Bar No. 1022447)
[email protected]
1099 New York Ave., NW, Suite 900
Washington, DC 20001
Tel.: 202-639-6000
Fax: 202-639-6066

Counsel for SoundExchange, Inc., American Federation of Musicians of the United States and Canada, Screen Actors Guild-American Federation of Television and Radio Artists, American Association of Independent Music, Sony Music Entertainment, UMG Recordings, Inc., Warner Music Group Corp., and Jagjaguwar Inc.


Before the UNITED STATES COPYRIGHT ROYALTY JUDGES Washington, D.C.

In the Matter of:

Determination of Rates and Terms for Digital Performance of Sound Recordings and Making of Ephemeral Copies to Facilitate those Performances (Web V)

Docket No. 19-CRB-0005-WR (2021-2025)

DECLARATION OF CATHERINE TUCKER

I, Catherine Tucker, declare under penalty of perjury that the statements contained in my Corrected Written Rebuttal Testimony in the above-captioned proceeding are true and correct to the best of my knowledge, information, and belief.

Date: 07/20/2020

Catherine Tucker

EXHIBIT A

Before the UNITED STATES COPYRIGHT ROYALTY JUDGES Washington, D.C.

In re

Determination of Rates and Terms for Digital Performance of Sound Recordings and Making of Ephemeral Copies to Facilitate those Performances (Web V)

Docket No. 19-CRB-0005-WR (2021-2025)

CORRECTED WRITTEN REBUTTAL TESTIMONY OF
Catherine Tucker
Sloan Distinguished Professor of Management Science
MIT Sloan at the Massachusetts Institute of Technology

July 2020

TABLE OF CONTENTS

I. Introduction and Assignment ...... 1
II. Summary of Conclusions ...... 2
III. Dr. Reiley’s Label Suppression Experiments Provide an Inaccurate Guide to the Effect of the Removal of a Label, Rendering Use of These Estimates by Professor Shapiro Flawed ...... 4
    A. A field experiment that does not inform subjects about the treatment is unrealistic and cannot measure the true effect of a treatment that would be known to users ...... 6
        1. A field experiment needs to measure the “treatment” of interest ...... 6
        2. In reality, the removal of [ ] catalog would be known by many Pandora users ...... 9
        3. Many of the users in the Label Suppression Experiments were unlikely to be aware of the removal of recordings from the suppressed record company ...... 15
        4. Dr. Reiley’s statements about the merits of “blinded” experiments do not apply in this context ...... 21
    B. The Label Suppression Experiments do not help us understand competitive effects ...... 25
        1. Field experiments cannot be used to measure competitive effects ...... 26
        2. Streaming services are competitive because it is easy for users to switch ...... 26
        3. Competitors could respond to a reduction in Pandora’s catalog ...... 29
    C. Notwithstanding that the Label Suppression Experiments did not measure the treatment of interest, the experiments as conducted do not provide precise estimates of the effect of 100 percent label suppression ...... 30
        1. The Label Suppression Experiments were underpowered ...... 30
        2. The Label Suppression Experiments were deliberately limited in scope because the researchers anticipated that the consequences of a larger experiment could negatively affect Pandora’s business ...... 34
        3. The Label Suppression Experiments struggled to attain precision on estimates because of the inclusion of many users who received little or no suppression treatment ...... 35



        4. Unlike other work by Dr. Reiley, Dr. Reiley’s analysis of the Label Suppression Experiments does not use data on the intensity of treatment ...... 40
    D. The Label Suppression Experiments are not useful for estimating true long-run effects ...... 42

IV. Professor Shapiro Presents Insufficient Analysis to Conclude that No Label is a “Must-Have” ...... 46
    A. Professor Shapiro’s estimates rely heavily on the Label Suppression Experiments ...... 47
    B. Professor Shapiro’s ad hoc corrections to the estimates from the Label Suppression Experiments do not result in a conservative application of those estimates ...... 48
    C. Professor Shapiro ignores the additional effects of losing access to content from [ ] on Pandora’s underlying business model ...... 50
    D. Professor Shapiro improperly applies the results of the Label Suppression Experiments to estimate a reasonable royalty for subscription webcasters ...... 54
    E. Ultimately, Professor Shapiro’s application of the results of the Label Suppression Experiments suffers from compounding errors ...... 56

V. The Submission by the National Association of Broadcasters (“NAB”) Misses the Importance of Simulcasting to Their Broadcasters ...... 56
VI. The Religious Broadcasters’ Arguments for Why They Should Pay Less Are Not Based on Economics ...... 63
    A. Family Radio is not representative of noncommercial webcasters ...... 65
    B. Statutory royalty payments make up a very small proportion of noncommercial webcasters’ costs ...... 68
    C. Though small non-profits are given discounts in some cases, those discounts may not be extended proportionally to larger non-profits ...... 70
    D. Noncommercial webcasters already receive a discounted rate ...... 71
    E. The argument that noncommercial webcasters should not pay excess royalties because they are too unpredictable to finance with donations is not based on data or economics ...... 73
    F. The CBI and NPR settlements do not provide support for NRBNMLC’s rate proposal ...... 74



I. Introduction and Assignment

I previously submitted written direct testimony in this matter on September 23, 2019 (“Written Direct Testimony”).1

I have been asked to address certain issues raised in testimony submitted on behalf of the various services, including in:

• Dr. David Reiley’s testimony,2 on behalf of Pandora, regarding the design and analysis of a series of field experiments (the “Label Suppression Experiments”) conducted to assess the effect on users’ listening if Pandora lost access to the entire catalog of a particular record company;

• Professor Carl Shapiro’s testimony,3 on behalf of Pandora, regarding the use of the Label Suppression Experiments in determining reasonable royalty rates for non-interactive services for the 2021-2025 time period;

• Dr. Gregory Leonard’s testimony,4 on behalf of the National Association of Broadcasters (“NAB”), regarding the role of simulcasting for terrestrial broadcasters;

• Dr. Joseph Cordes’ testimony,5 on behalf of the National Religious Broadcasters Noncommercial Music License Committee (“NRBNMLC”), regarding reasonable royalty rates for noncommercial webcasters; and

• Dr. Richard Steinberg’s testimony,6 on behalf of NRBNMLC, regarding reasonable royalty rates for noncommercial webcasters.

1 Written Direct Testimony of Catherine Tucker, Sept. 23, 2019 (“Tucker WDT”). Since submitting my Written Direct Testimony, Sirius XM and iHeartMedia reported financial results for Q3 2019. These results are summarized in Rebuttal Appendices 5 to 8. 2 Written Direct Testimony of David Reiley, Sept. 23, 2019 (“Reiley WDT”); Corrected Written Direct Testimony of David Reiley, Nov. 26, 2019 (“Reiley Corrected WDT”). Dr. Reiley is a Principal Scientist at Pandora. 3 Written Direct Testimony of Carl Shapiro, Sept. 23, 2019 (“Shapiro WDT”); Corrected Written Direct Testimony of Carl Shapiro, Nov. 20, 2019 (“Shapiro Corrected WDT”). 4 Written Direct Testimony of Dr. Gregory K. Leonard, Sept. 20, 2019 (“Leonard WDT”). 5 Written Direct Testimony of Joseph J. Cordes, Sept. 21, 2019 (“Cordes WDT”); Corrected Written Direct Testimony of Joseph J. Cordes, Dec. 20, 2019 (“Cordes Corrected WDT”). 6 Written Direct Testimony of Richard Steinberg, Sept. 22, 2019 (“Steinberg WDT”); Amended Written Direct



I am being compensated for my services in this matter at my customary hourly rate of $1,200. Certain employees of Analysis Group have assisted me in working on this report. Analysis Group is being compensated for its time in this matter at hourly rates ranging between $225 and $775. In addition, I receive compensation based on a proportion of the total billing of Analysis Group. A copy of my CV is attached to my Written Direct Testimony.7 My qualifications remain unchanged from my previous submission. A list of the materials I have relied upon to date in developing my opinions contained in this report is attached as Rebuttal Appendix 9.8

II. Summary of Conclusions

I have concluded that:

First, the Label Suppression Experiments conducted by Dr. Reiley result in misleading and unreliable estimates of the effect of Pandora’s loss of content from a record company on its

ad-supported radio service. Most users in the Label Suppression Experiments would not have been aware of the suppression treatment, while in reality the absence of music from [ ] would be widely known to users due to publicity from third-party industry commentators, other users, and competitors. The Label Suppression Experiments do not measure competitive effects or provide insights into the influence of competitors’ responses on consumer behavior. The Label Suppression Experiments were also underpowered and achieved only partial suppression of recordings from the tested record companies, and

therefore cannot provide reliable estimates of the effect of Pandora’s full loss of a record company’s content. Last, the Label Suppression Experiments do not provide any useful guide to the long-run effects of Pandora’s loss of content from [ ]. These and

Testimony of Richard Steinberg, Dec. 11, 2019 (“Steinberg Amended WDT”). 7 Tucker WDT, at Appendix 19. 8 I may amend my testimony with any new information as additional evidence becomes available, and may modify my analysis and resulting conclusions.


other errors render Dr. Reiley’s analysis and conclusions not useful for purposes of measuring the effects of the removal of a record company from Pandora’s ad-supported noninteractive service. I discuss this in detail in Section III.

Second, because Professor Shapiro’s analysis of reasonable royalty rates relies heavily on the results of the Label Suppression Experiments, errors in these experiments render Professor Shapiro’s reasonable royalty estimates flawed and unreliable. These errors are compounded by

Professor Shapiro’s unfounded and ad hoc assumptions surrounding the applicability of the estimates from the experiments and how to correct the errors within the experiments. In addition, Professor Shapiro’s analysis ignores the additional effects of losing access to content from [ ] on Pandora’s underlying business model beyond just the direct effect on listener hours, further understating the effect on Pandora’s business. I discuss this in detail in Section IV.

Third, while NAB witnesses suggest that webcasting is a small and unprofitable aspect of their business, their arguments ignore the role that simulcasting plays in protecting the NAB’s members’ radio businesses. In particular, evidence from their own executives suggests that simulcasts enhance and complement radio stations’ core businesses, including by helping radio businesses to retain listeners in the face of emerging digital technologies. I discuss this in detail in Section V.

Fourth, in their written direct testimonies on behalf of the NRBNMLC, Dr. Steinberg and Dr. Cordes argue that noncommercial webcasters should pay lower statutory rates than commercial

webcasters. However, the argument that noncommercial webcasters should not pay excess royalties because they are too unpredictable to finance with donations is not based on data or economics. Dr. Steinberg and Dr. Cordes ignore that the average per-performance rate paid by noncommercial webcasters is already lower than the rates paid by commercial webcasters, and would remain lower under SoundExchange’s proposed rates. Last, though Dr. Steinberg amended his written direct testimony to add a discussion of SoundExchange’s settlement



agreements with CBI and NPR, neither of these settlement agreements provides support for NRBNMLC’s rate proposal. I discuss this in detail in Section VI.

III. Dr. Reiley’s Label Suppression Experiments Provide an Inaccurate Guide to the Effect of the Removal of a Label, Rendering Use of These Estimates by Professor Shapiro Flawed

The “Label Suppression Experiments” refer to a series of experiments that, as described by Dr. Reiley, were “conducted to assess whether selectively suppressing user access to music from particular record companies has an impact on consumer listening hours, the extent of any

impact, and whether the impact varies by record company.”9 Following instructions from Professor Shapiro, Pandora’s economic expert, Dr. Reiley ran five experimental treatments with the goal of suppressing music from one of each of the following record companies: [

]10

Dr. Reiley asserts that the Label Suppression Experiments address “[w]hat effect, if any, there would be on users’ listening if Pandora stopped playing the entire catalog of a particular record

company on Pandora’s ad-supported service.”11 Dr. Reiley’s research objective is based on Professor Shapiro’s instruction that the “goal of these experiments is to measure the responses of Pandora listeners if a Pandora advertising-supported statutory service were to lose access to

the music of a given record company.”12

9 Reiley Corrected WDT, at 1-2. 10 Reiley Corrected WDT, at 7. As I discuss in Section III.A, Dr. Reiley’s label suppression treatment is unrealistic and does not measure the true effect of Pandora’s loss of content from [ ] because, among other things, many of the users in the treatment groups were unlikely to have been aware of the suppression treatment. Further, as discussed in Section III.C, while Dr. Reiley indicates that he intended the label suppression treatment to completely suppress recordings from the tested record company, it appears that few, if any, users received this treatment as intended. 11 Reiley Corrected WDT, at 6-7. 12 Reiley Corrected WDT, at 26.



Dr. Reiley finds that the label suppression treatment resulted in [ ] of the five treatment groups relative to the control groups.13 For the [ ], Dr. Reiley estimates a reduction in listening hours of [ ] percent for the [ ] treatment group and an increase in listening hours of [ ] percent for the [ ] treatment group as compared to the control groups.14 Dr. Reiley obtained [ ]. Dr. Reiley states that “none of these estimates [ ].”15

In this section, I explain why the Label Suppression Experiments result in misleading and unreliable estimates of the effect of the removal of a record company from Pandora’s ad- supported radio service. Most users in the Label Suppression Experiments would not have known that [ ] was being suppressed, but in reality such lack of recordings from [ ] would be known by users because of the amount of publicity it

would receive from third-party industry commentators, other users, and competitors. The Label Suppression Experiments do not measure how competitors would react or measure how these competitive reactions would affect behavior. The Label Suppression Experiments were underpowered and only achieved partial suppression of recordings from the tested record companies, and therefore cannot provide precise estimates of the effect. Finally, the Label

Suppression Experiments—which ran for a period of only three months—do not provide any useful guide to the long-run effects of Pandora’s loss of content from [

13 Reiley Corrected WDT, at 11-12. 14 Reiley Corrected WDT, at 11-12. On its face, the fact that [ ] calls into question the validity and reliability of the Label Suppression Experiments. Dr. Reiley testified that [

] SoundExchange Ex. 231, Deposition of David H. Reiley, Jr., Ph.D., Dec. 16, 2019 (“SoundExchange Ex. 231 (Deposition of David Reiley)”), at 125. As discussed below in Section III.A.3, this result is consistent with evidence that many listeners in the treatment group experienced few suppressed recordings and were likely unaware of the suppression treatment. 15 Reiley Corrected WDT, at 11-12.


]. These and other errors render Dr. Reiley’s analysis and conclusions not useful for the purpose of measuring the effects of the removal of a record company from Pandora’s ad- supported noninteractive service.

A. A field experiment that does not inform subjects about the treatment is unrealistic and cannot measure the true effect of a treatment that would be known to users

1. A field experiment needs to measure the “treatment” of interest

A field experiment, often referred to as a randomized controlled trial (“RCT”), is a controlled study in which a firm or organization randomizes whether or not a user receives a specific “treatment.”16 The random assignment of participants to a treatment group that receives the treatment of interest and a control group that does not receive the treatment ensures that no unobservable characteristics of the participants are reflected in the assignment, and therefore that any difference between treatment and control groups reflects the effect of the treatment itself.17 Done correctly, this allows the researcher to address causal questions, such as the expected effect of taking a specific course of action.
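As a purely illustrative sketch, and not a reproduction of any calculation in Dr. Reiley’s report, the effect in such an experiment is conventionally estimated as the difference in mean outcomes between the randomized groups:

\hat{\tau} = \bar{Y}_T - \bar{Y}_C, \qquad \widehat{\mathrm{se}}(\hat{\tau}) = \sqrt{\frac{s_T^2}{n_T} + \frac{s_C^2}{n_C}},

where \bar{Y}_T and \bar{Y}_C denote the average outcome (for example, listening hours) in the treatment and control groups, s_T^2 and s_C^2 the corresponding sample variances, and n_T and n_C the group sizes; this notation is generic and is assumed here only for exposition. Because assignment is random, \hat{\tau} is an unbiased estimate of the effect of the treatment that was actually administered, which is precisely why the administered treatment must correspond to the real-world change one wishes to measure.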

Reflecting the value of field experiments in digital economics, I have conducted and analyzed many field experiments during the course of my research.18 I co-authored the chapter on how to conduct field experiments in marketing

16 See, e.g., Abhijit Banerjee & Esther Duflo, An Introduction to the ‘Handbook of Field Experiments,’ (Aug. 2016), available at https://www.povertyactionlab.org/sites/default/files/documents/handbook_intro.pdf; Glenn W. Harrison, Cautionary Notes on the Use of Field Experiments to Address Policy Issues, 30 Oxford Review of Economic Policy 753-763, 753 (2014); see also, Reiley Corrected WDT, at 5. 17 Abhijit Banerjee & Esther Duflo, An Introduction to the ‘Handbook of Field Experiments,’ at 1 (Aug. 2016), available at https://www.povertyactionlab.org/sites/default/files/documents/handbook_intro.pdf. 18 See, e.g., Catherine Tucker & Juanjuan Zhang, Growing Two-sided Networks by Advertising the User Base: A Field Experiment, 29 Marketing Science 805-814 (Sept.-Oct. 2010); Catherine Tucker & Juanjuan Zhang, How Does Popularity Information Affect Choices? A Field Experiment, 57 Management Science 828-842 (May 2011); Catherine Tucker & Anja Lambrecht, When Does Retargeting Work? Information Specificity in Online Advertising, 50 Journal of Marketing Research561-576 (Oct. 2013); Catherine Tucker, Social Networks, Personalized Advertising, and Privacy Controls, 51 Journal of Marketing Research 546-562 (Oct. 2014); Anja Lambrecht and Catherine Tucker, Paying with Money or with Effort: Pricing When Customers Anticipate Hassle, 49 Journal of Marketing Research 66-82 (2012); Catherine Tucker, The Reach and Persuasiveness of Viral Video Ads, 34 Marketing Science 281-296 (Mar. 2015); Christian Catalini and Catherine Tucker, When Early Adopters Don’t Adopt, 357 Science, 135-136 (July 2017).



for the Handbook of Marketing Analytics.19 I am also a regular keynote speaker at the Conference on Digital Experimentation.20

However, while there are many potential benefits of field experiments, there are also pitfalls that must be avoided. Any properly designed field experiment can provide an estimate of something, but those results may not be relevant for the policy question or hypothesis of interest. As stated by researchers at Princeton,

[W]hether, and in what ways, an RCT result is evidence depends on exactly what the hypothesis is for which the result is supposed to be evidence, and that what kinds of hypotheses these will be depends on the purposes to be served. This should in turn affect the design of the trial itself.21

A field experiment that provides an estimate of the wrong thing is uninformative for the researcher’s question or, worse, may be misleading if the researcher assumes they are measuring the effect of interest. Any ad hoc attempts to adjust the estimates of the poorly

designed field experiment to approximate the variable that should have been measured in the first place will likely result in unreliable and/or biased estimates of what the researcher or firm

was trying to measure. [ ], Dr. Reiley testified that he [

]22

19 Anja Lambrecht & Catherine Tucker, Field Experiments in Handbook of Marketing Analytics (Natalie Mizik & Dominique Hanssens eds., Edward Elgar Publishing 2018). 20 “2019 Conference on Digital Experimentation (CODE),” MIT Initiative on the Digital Economy (Nov. 1-2, 2019), http://ide.mit.edu/events/2019-conference-digital-experimentation-code; “Conference on Digital Experimentation (CODE),” MIT Initiative on the Digital Economy (Oct. 14-15, 2016), http://ide.mit.edu/events/conference-digital- experimentation-code-0. 21 See, e.g., Angus Deaton & Nancy Cartwright, Understanding and Misunderstanding Randomized Controlled Trials, 210 Social Science & Medicine 2-21, 10 (Aug. 2018) (emphasis in original). 22 SoundExchange Ex. 231 (Deposition of David Reiley), at 45:17-25 ([

]).



In this case, Dr. Reiley and Professor Shapiro explain that the Label Suppression Experiments are meant to capture the effect on Pandora listeners if Pandora’s ad-supported service no longer

offered music from a particular record company.23 The right experiment, therefore, is one that measures the effect of the loss of a record company’s catalog on Pandora’s ad-supported

service.24 A properly designed experiment would administer a treatment that accurately reflects

the characteristics of such a situation in the real world.25 As I explain in Section III.A.2, one

important characteristic of this situation is that Pandora’s loss of content from [ ] would be known to many users in the real world, in part because competitors and third parties would be motivated to publicize this change.

The Label Suppression Experiments, however, do not measure the effect of suppressing recordings from a record company when users are aware of that absence. At best, the Label Suppression Experiments measure only the short-run effect on listening hours of removing certain recordings without informing users that this is happening.

Many researchers have noted the importance of context on participants’ behavior in experiments. Harrison and List (2004) cautioned that “[i]t is not the case that abstract, context- free experiments provide more general findings if the context itself is relevant to the

performance of subjects.”26 Others have shown that information about the treatment can influence participants’ behavior. For example, one study found evidence of systematic

23 Dr. Reiley’s research objective is based on Professor Shapiro’s instruction that the “goal of these experiments is to measure the responses of Pandora listeners if a Pandora advertising-supported statutory service were to lose access to the music of a given record company.” Reiley Corrected WDT, at 26. 24 In economics, we refer to this as measuring a “treatment effect” of interest. This idea of a treatment effect is something that Dr. Reiley has emphasized in his own work. Randall Lewis & David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266 (Sept. 2014). 25 In addition, a properly designed experiment would need to carefully consider potential spillover effects on the Plus and Premium tiers. 26 Glenn Harrison & John List, Field Experiments, 42 Journal of Economic Literature 1009-1055, 1028 (Dec. 2004) (emphasis in original).



differences in behavior of participants in a medical trial depending on the information they received about the probability of receiving the treatment as opposed to the placebo.27 The concept of “ecological validity” captures the idea that the degree to which an experiment’s design resembles the real world influences the generalizability of its results.28

2. In reality, the removal of [ ] catalog would be known by many Pandora users

Dr. Reiley chose to make the Label Suppression Experiments “blind” by not informing participants in the treatment group that they would no longer have access to music from the

suppressed record company on Pandora’s ad-supported service.29 In reality, however, the removal of [ ] catalog from Pandora’s service would be made public by Pandora, its competitors, and/or third parties and would be known to many users. Therefore, measuring the effects of removing recordings from [ ] on users’ listening behavior when those users are under the impression they are receiving the same

Pandora ad-supported service and music catalogs they have always received, as Dr. Reiley did, is not useful for measuring the real-world impact of losing access to [ ] catalog. Dr. Reiley agreed that, [

]30

27 Sylvain Chassang, et al., Accounting for Behavior in Treatment Effects: New Applications for Blind Trials, 10 PLoS ONE 1, 4-5 (June 2015). 28 See, e.g., Kathryn Zeiler, Cautions on the Use of Economics Experiments in Law, 166 Journal of Institutional and Theoretical Economics 178-193, 181-82 (Mar. 2010). 29 Reiley Corrected WDT, at 4. 30 SoundExchange Ex. 231 (Deposition of David Reiley), at 162:6-163:4 ([

]).



Consumers are aware of, or can learn, a substantial amount of information about their

streaming service of choice and the breadth of the recordings and music catalogs it offers.31 Many third-party reviews and articles discuss the breadth of music offered by competing streaming music services. For example, Consumer Reports highlights and compares the size of major streaming services’ music libraries in its article advising consumers how to choose

between available streaming options.32 Similarly, a 2019 article from Digital Trends compares

Spotify and Apple Music on several dimensions, including their music libraries. The article attributes Spotify’s success to the size of its catalog, explaining that “Spotify first took its dominant position on the strength of its impressive 30 million-plus song catalog,” but notes that Apple Music now “touts around 45 million songs, which is superior to Spotify’s current 35 million-plus figure, and also outdoes newer contenders like Amazon Prime Music and Jay-Z’s Tidal.”33 Although Amazon Prime Music is not representative of other streaming services

as it is bundled with Amazon Prime,34 third-party discussions of Amazon Prime Music have

focused on its relatively small catalog of approximately two million recordings, often characterizing its limited selection as a drawback of the service. For example, a June 2019 article by Business Insider describes the “pretty meager” recording selection of Amazon Prime Music as “[t]he most notable difference” between it and Amazon Music Unlimited, warning

that “you may feel let down by Prime Music’s small selection.”35 The article notes that Amazon

31 For example, TIME published an article in 2015 detailing “11 Wildly Popular Albums You Can’t Get on Spotify.” Victor Luckerson, 11 Wildly Popular Albums You Can’t Get on Spotify, TIME (Mar. 29, 2016), https://time.com/4274430/spotify-albums/. 32 Thomas Germain, Best Music Streaming Services, Consumer Reports, Sept. 18, 2019, https://www.consumerreports.org/streaming-media/best-music-streaming-service-for-you/. See also, Ty Pendlebury & Xiomara Blanco, Best music streaming: Spotify, Apple Music and more, compared, CNET (Nov. 24, 2019), https://www.cnet.com/how-to/best-music-streaming-service-of-2019-spotify-pandora-apple-music/. 33 Josh Levenson & Quentyn Kennemer, Apple Music vs. Spotify: Which service is the streaming king?, Digital Trends (Nov. 11, 2019), https://www.digitaltrends.com/music/apple-music-vs-spotify/. 34 Amazon Prime offers members numerous retail and other benefits. 35 Remi Rosmarin, Here are the main differences between Amazon’s two music streaming services, Prime Music and Amazon Music Unlimited, Business Insider (June 26, 2019), https://www.businessinsider.com/prime-music-vs-amazon-music-unlimited.



Music Unlimited, in contrast, offers “50 million songs” and “fills all the holes” in Prime Music’s library.36

In addition, if Pandora lost access to [ ] catalog of sound recordings, third-party industry commentators and competitors (as described in Section III.B.3) would have incentives to publicly promote this event as it would represent a meaningful difference between Pandora’s service and competing services. Further, artists whose recordings were

removed from Pandora’s catalog would similarly have incentives to publicize this change and encourage consumers to listen to other music streaming services instead.

When a single artist, Taylor Swift, removed her recording catalog from Spotify in November

2014,37 it was covered in at least 260 news stories that week, including on the websites associated with Time, The Guardian, Rolling Stone, The New York Times, The Washington

Post, and others.38 The return of Taylor Swift’s catalog to Spotify on June 9, 2017 was covered

in approximately 215 news stories that week.39

36Similarly, Professor Shapiro notes that Amazon Prime Music differs from standalone interactive subscription services because it has a much smaller music catalog (less than 10 percent of recordings) and “customers do not expect to find all their favorite artists and recordings on Amazon Prime as they do with a standalone interactive subscription service.” Shapiro Corrected WDT, at 37-38. 37 Hannah Ellis-Petersen, Taylor Swift takes a stand over Spotify music royalties, The Guardian (Nov. 5, 2014), https://www.theguardian.com/music/2014/nov/04/taylor-swift-spotify-streaming-album-sales-snub; Pamela Engel, Taylor Swift Pulls All Of Her Albums From Spotify, Business Insider (Nov. 3, 2014), https://www.businessinsider.com/taylor-swift-pulled-all-of-her-albums-from-spotify-2014-11. 38 I estimated the number of news articles discussing Taylor Swift’s decision to remove her music catalog from Spotify by performing a Factiva search on November 14, 2019 for articles published between November 3, 2014 and November 10, 2014 containing “Taylor Swift + Spotify + (Remove* or Drop*).” See also Charlotte Atler, Taylor Swift Just Removed Her Music From Spotify, TIME (Nov. 3, 2014), https://time.com/3554438/taylor-swift-spotify/; Steve Knopper, Taylor Swift Abruptly Pulls Entire Catalog From Spotify, Rolling Stone (Nov. 3, 2014), https://www.rollingstone.com/music/music-news/taylor-swift-abruptly-pulls-entire-catalog-from-spotify-55523/; Ben Sisaro, Taylor Swift Announces World Tour and Pulls Her Music From Spotify, N.Y. Times (Nov. 3, 2014), https://artsbeat.blogs.nytimes.com/2014/11/03/taylor-swift-announces-world-tour-and-pulls-her-music-from- spotify/; Cecilia Kang, Taylor Swift has taken all her music off Spotify, Wash. Post (Nov. 3, 2014), https://www.washingtonpost.com/news/business/wp/2014/11/03/taylor-swift-has-taken-all-her-music-off-spotify/. 39 Factiva search on November 14, 2019 for articles published between June 9, 2017 and June 16, 2017 containing “Taylor Swift + Spotify.”



Similarly, Adele’s choice not to release her “25” album on streaming services such as Spotify and Apple Music in November 2015 was widely reported, appearing in over 480 news articles across websites associated with major news publications such as The Guardian, The New York

Times, CNN, and The Wall Street Journal in the week following the announcement.40 The availability of Adele’s “25” album on Pandora’s noninteractive streaming service was also covered by several news outlets, with some noting that Pandora’s ability to offer the new album

while it was absent from competing services increased the company’s stock price by up to five percent.41

If Pandora stopped playing music from [ ] or artist, similar publicity would result. This publicity would increase consumer awareness of Pandora’s loss of access to [ ] music catalog, which would influence consumer behavior in ways that cannot be measured by the Label Suppression Experiments.

Further, publicity of Pandora’s diminished catalog is likely to be amplified by listeners’ discussions on social media, further contributing to users’ awareness of the removal of a record company’s catalog from Pandora’s ad-supported service and influencing their listening behavior. Several research studies have demonstrated that social media and peer influence can have a substantial impact on user behavior towards music. Studies have shown that online

40 Factiva search on November 14, 2019 for articles published between November 19, 2015 and November 26, 2015 containing “Adele + (Spotify or Apple).” See also Nigel Smith, Adele’s new album 25 will not be streamed on Spotify, The Guardian (Nov. 19, 2015), https://www.theguardian.com/music/2015/nov/19/adele-new-album-25-not-stream- spotify-apple-music; Ben Sisaro, Adele Is Said to Reject Streaming for ‘25’, N.Y. Times (Nov. 19, 2015), https://www.nytimes.com/2015/11/20/business/media/adele-music-album-25.html; Frank Pallotta & Brian Stelter, Adele won’t allow ‘25’ album to be streamed,” CNN Business (Nov. 19, 2015), https://money.cnn.com/2015/11/19/media/adele-streaming/; Mike Ayers, Adele’s ‘25’ Won’t Be Available on Spotify or Apple Music, Wall Street Journal (Nov. 19, 2015), https://blogs.wsj.com/speakeasy/2015/11/19/adeles-25-wont- be-available-on-spotify-or-apple-music/. 41 Kia Kokalitcheva, Thanks to Adele, Pandora Says ‘Hello’ to a Stock Price Bump, Fortune (Nov. 25, 2015), https://fortune.com/2015/11/25/adele-pandora/; Kristen Scholer, Adele Says Hello to Pandora, Wall Street Journal (Nov. 25, 2015), https://blogs.wsj.com/moneybeat/2015/11/25/pandora-up-as-adele-says-hello-to-the-streaming- service-bye-to-others/. I understand that because Pandora was operating under the statutory license at the time, Adele had no choice but to make “25” available on Pandora.



music listening is socially driven and is influenced by the opinions of the community as a

whole, as well as the opinion of the user’s immediate social network friends.42 A study of users of a music-listening social network, Last.fm, found that the opinions and behavior of peers substantially influence listening behavior and upgrade decisions, with “peer influence caus[ing] more than a 60% increase in odds of buying the service due to the influence coming

from an adopting friend.”43 Together, this research suggests that social media commentary can

influence listeners’ behavior in response to Pandora’s loss of content by, for example, making them more likely to reduce their listening time on Pandora or switch away from Pandora altogether.

[

42 Sanjeev Dewan, Yi-Jen Ho, & Jui Ramaprasad, Popularity or Proximity: Characterizing the Nature of Social Influence in an Online Music Community, 28 Information Systems Research 117-136 (Mar. 2017); see also Gal Oestreicher-Singer & Lior Zalmanson, Content or Community? A Digital Business Strategy for Content Providers in the Social Age, 37 Management Information Systems Quarterly 591-616 (June 2013). 43 Ravi Bapna & Akhmed Umyarov, Do Your Online Friends Make You Pay? A Randomized Field Experiment on Peer Influence in Online Social Networks, 61 Management Science 1902-1920, 1902, 1904 (Aug. 2015). 44 SoundExchange Ex. 210, [ ], PANWEBV_00005093, at 00005105; see also SoundExchange Ex. 206, [ ], PANWEBV_00005332, at 00005338 [ ]), 800005339 ([ ]; SoundExchange Ex. 205, [ ], PANWEBV_00004571, at 00004605 [

]; SoundExchange Ex. 207, [ ], SXMWEBV_00004833, at 00004863 (stating that on Pandora, [ ]); SoundExchange Ex. 208, [ ], PANWEBV_00006711, at 00006727 [ ]; SoundExchange Ex. 209, [ ], PANWEBV_00006865, at 00006889 [ ].



] This further suggests that listeners’ awareness of the missing content in the real world would likely affect their behavior.

Indeed, when a treatment or experiment is actually publicized, there are often insights that would not have been available while users were blind to the experiment. For example,

Amazon’s random consumer price experiment conducted in September 2000, during which it charged consumers different prices for the same DVD, caused outrage when it was made

public.46 Similarly, consumers responded negatively to Netflix’s pricing tests in Australia in May 2017 under the mistaken belief that the company was experimenting with “surge pricing”

during popular usage times, such as weekends.47 In both cases, the experiments themselves would not have captured the outrage that customers felt in response to the pricing adjustments if the field tests had been conducted in a manner such that the users did not find out.

In sum, because of their “blind” design, Dr. Reiley’s Label Suppression Experiments cannot capture any of these effects specifically associated with listeners’ awareness of the removal of recordings from [ ] from Pandora’s ad-supported service. As such, Dr. Reiley’s Label Suppression Experiments provide an inaccurate and downward-biased estimate of even the short-run effect of losing access to [ ] content.

45 SoundExchange Ex. 206, [ ], PANWEBV_00005332, at 00005340. 46 Wendy Melillo, Amazon Price Test Nets Privacy Outcry, AdWeek (Oct. 2, 2000), https://www.adweek.com/brand- marketing/amazon-price-test-nets-privacy-outcry-30060/; Bezos calls Amazon experiment ‘a mistake’, BizJournals (Sept. 28, 2000), https://www.bizjournals.com/seattle/stories/2000/09/25/daily21.html; Linda Rosencrance, Testing reveals varied DVD prices on Amazon, CNN (Sept. 7, 2000), https://www.cnn.com/2000/TECH/computing/09/07/ amazon.dvd.discrepancy.idg/hs~index.html. 47 Claire Reilly, Netflix quietly tests price hikes in Australia, CNET (May 14, 2017), https://www.cnet.com/news/ netflix-quietly-tests-weekend-price-increases-australia/; Joanna Robinson, Now That You’re Hooked, Netflix Is Looking to Raise Its Prices Again, VanityFair (May 16, 2017), https://www.vanityfair.com/hollywood/2017/05/ netflix-raising-prices-weekend-surge-pricing.



3. Many of the users in the Label Suppression Experiments were unlikely to be aware of the removal of recordings from the suppressed record company

[

].48 [

].49 Indeed, many aspects of the experiments obscured the label suppression treatment, making it less likely that users in the treatment group would become

aware of the removal of recordings from the suppressed record company.50 First, public information about Pandora’s music library revealed no change in its music catalog during the experiment. Second, many users’ listening patterns suggest that many users in the treatment group were unlikely to have many recordings suppressed during the experimental period. Third, many users in the treatment groups were exposed to recordings from the suppressed

record company either through errors in the implementation of the experiment, Premium Access, subscription tiers, or “miscellaneous provider tracks” in Pandora’s system. Dr. Reiley does not provide any evidence that listeners might have independently ascertained that certain recordings were suppressed. Even if they identified or suspected a pattern, they may have misattributed this to Pandora’s song selection algorithm or other factors rather than label

suppression. As such, the Label Suppression Experiments do not provide a useful predictor of

48 SoundExchange Ex. 231 (Deposition of David Reiley), at 109:24-110:5 ([

]). 49 SoundExchange Ex. 231 (Deposition of David Reiley), at 160:18-25 ([

), 161:3-6 ([ ]). 50 Professor Shapiro also recognizes that users would not have been aware of losing access to certain material and that this lack of knowledge could affect the results, noting: “in the Label Suppression Experiments, listeners were presumably not aware of the blackout, and they might react more strongly if they were aware.” Shapiro Corrected WDT, at 19.


what would happen in reality if recordings from [ ] were removed from Pandora’s ad-supported service.

a. Public information suggested that Pandora was supplying tracks from all record companies

As Pandora’s ad-supported service is noninteractive in nature, there is always some degree of uncertainty around which track will be played next. Pandora’s ad-supported service relies on

algorithms to select recordings for each user based on her listening habits and expressed preferences. This element of unpredictability inherently obscures the label suppression treatment and makes it more difficult for users to detect the absence of the missing record company’s catalog over a short period of time.

However, there were many reasons for users to believe that Pandora was still offering sound recordings from [ ]. Even if some listeners who received the label suppression treatment noticed a difference in the recordings played on their Pandora stations, surmised that the difference might reflect a modification in Pandora’s service or music catalog, and attempted to investigate this difference, by Dr. Reiley’s design they would have found no mention of any change in Pandora’s music catalog on its website or any other public sources during the experiment. Articles published during this period suggested

that Pandora’s music library continued to offer “over 40 million songs.”51 For example, a July 2019 Digital Trends article noted that Spotify’s and Pandora’s “libraries are very comparable,

and there aren’t any notable artists who appear on one service and not the other.”52

Similarly, as only a small subset of Pandora listeners were in the treatment groups, it is unlikely that friends and family members of users in the treatment groups also received the label

51 Nelson Aguilar, Spotify vs. Apple vs. Pandora vs. Tidal vs. Deezer vs. Amazon, Gadget Hacks (July 19, 2019), https://web.archive.org/web/20190721232028/https://smartphones.gadgethacks.com/news/best-music-streaming-services-spotify-vs-apple-vs-pandora-vs-tidal-vs-deezer-vs-amazon-0199737/. 52 Ryan Waniata & Parker Hall, Spotify vs. Pandora: Which music streaming service is better for you?, Digital Trends (July 25, 2019), https://www.digitaltrends.com/music/spotify-vs-pandora/.


suppression treatment. As a result, treatment users who noticed a reduction in variety or the absence of certain artists on Pandora’s service and attempted to confirm these suspicions by consulting other users likely would have found that their friends and family members experienced no change during the experiment.

Therefore, even if listeners detected some change during the experimental period and attempted to search for information about Pandora’s music library online or from friends or

family members, they would have been led to believe that Pandora still offered a full catalog of music, including recordings from the suppressed record company. This may have prevented users from becoming aware of the label suppression or modifying their behavior in response to the label suppression during the experiment. In contrast, in the real world, discussion in third-party reporting and friends comparing experiences would reinforce users’ perceptions of the missing content.

b. Many of the users in the experiment spent very little time listening to Pandora’s ad-supported service and were unlikely to have many recordings suppressed in the short timeframe of the experiment

Given the listening patterns of users in the treatment group, the average user likely would not have been aware that a record company’s catalog was being suppressed. [

53 Dr. Reiley testified that the Label Suppression Experiments were started partway through the day on June 3, but emphasized June 4 as the first full day of treatment when reporting the results. I refer to the experimental period as the 89 days between June 4 (the first full day of treatment) and August 31, 2019. Reiley Corrected WDT, at 9. 54 Tucker WRT, Appendix 1.



] It is not surprising that users were not able to identify (without being told) that the service they were listening to had suppressed less than one track per day on average during the experimental period.

However, as discussed above in Section III.A.2, many of these users would become aware of Pandora’s loss of content from [ ] in reality, and would respond differently with this awareness.

c. Many of the users in the treatment groups in the experiment were exposed to recordings from the suppressed record company, further obscuring the treatment

In addition, Dr. Reiley identifies a number of ways in which users in the experiments did not receive the treatment as intended and still had access to recordings from the suppressed record company’s catalog. There were at least four main ways in which users in the treatment groups could have heard recordings from the suppressed record company during the experimental

period: (1) through recordings attributed to “miscellaneous providers” rather than the

55 Tucker WRT, Appendix 2. 56 Shapiro Corrected WDT, at 26, 30. 57 [ ]. These numbers reflect account-level information. To the extent that multiple individual users share the same account, this could mean that individuals listen to [ ] and would be even less likely to notice the label suppression treatment over the three-month experimental period. 58 For example, [ ] accounts for approximately [ ] percent of plays on Pandora, implying that, on average, [ ]. Shapiro Corrected WDT, at 26, 30.


suppressed record company; (2) through Premium Access sessions; (3) through paid subscription tiers; and (4) through errors in the implementation of the experiment.

First, Dr. Reiley explains that there are a number of legacy “miscellaneous provider” tracks that are “not yet tied to our current direct license agreements [but] continue to be spun on the

Pandora service because of the long history of user data associated with those tracks.”59 Dr. Reiley acknowledges that some of these “miscellaneous provider” tracks “might actually be

tracks from the record company we were attempting to suppress—resulting in something less

than a full 100% suppression.”60 Dr. Reiley observes that the “‘miscellaneous provider’ legacy spin share in the [ ] treatment group approximately doubled, from about [ ] in the

control group to approximately [ ] in the treatment group”61 and notes that these spins included some recordings from the suppressed company:

In the [ ] treatment group, our analysis identified six out of the 60 “miscellaneous provider” tracks examined (10%) that appear to be covered by our [ ] license (and thus should have been suppressed). In the [ ] suppression group, they identified nine out of 60 tracks (15%) that appear to be covered by our [ ] license.62

Dr. Reiley considers this magnitude to be unimportant because only a small proportion of spins ([ ]) should have been suppressed but were not because of this issue.63

However, Dr. Reiley fails to recognize the fact that miscellaneous provider tracks could have played a disproportionate role in preventing those in the treatment group from realizing that anything had changed. If Pandora were still delivering miscellaneous provider tracks from

59 Reiley Corrected WDT, at 17. 60 Reiley Corrected WDT, at 18. 61 Reiley Corrected WDT, at 18. 62 Reiley Corrected WDT, at 19. 63 Reiley Corrected WDT, at 19.



Adele on a user’s “Adele” channel,64 and furthermore, if they were delivering the type of Adele

tracks that had elicited the most user response,65 such as receiving “thumbs up,” then it would have been less apparent to an Adele fan that Pandora was no longer delivering other recordings by Adele. Even playing one track from a suppressed record company, particularly if it was of the type that had elicited the most user response, could prevent a user from concluding that they no longer had access to the suppressed record company. According to the data produced

by Dr. Reiley, [ ] in the [ ] treatment groups received spins from miscellaneous providers during the experimental period.66

Second, users may have been exposed to recordings from the suppressed record company through Premium Access listening sessions, during which they could gain access to on-demand functionality for a limited period in exchange for viewing a video advertisement. Dr. Reiley describes that “[b]ecause interactive plays in the ‘Premium Access’ sessions fall outside the statutory license, we did not suppress music played in that feature or in the tracks that ‘auto-

play’ in a Premium Access session after an interactive spin.”67 While Dr. Reiley asserts that continued access to the suppressed record company via Premium Access listening sessions “should not have a significant impact on the results of the Label Suppression Experiments,” he again fails to recognize that these spins could have played a substantial role in preventing users

from detecting a change in the catalog of music available on their Pandora service.68 According to Dr. Reiley’s data, approximately 14 percent of users in the [ ] treatment group and 20 percent of users in the [ ] treatment group received at least one spin from the

64 For the purposes of this example, I am assuming that all Adele tracks are owned by the same record company. 65 Dr. Reiley describes that the likely explanation for the increase in spins from miscellaneous providers was that its “playlist algorithms, confronted with the inability to play [ ] tracks, had to turn to other tracks that our data suggested the user would enjoy—and the body of ‘miscellaneous’ legacy tracks with the deepest history of usage data were a natural place for our algorithms to look for substitutes.” Reiley Corrected WDT, at 18. 66 Rebuttal Appendix 1. This excludes users who, according to Dr. Reiley’s data, listened to zero tracks on the ad- supported radio service during the experimental period. 67 Reiley Corrected WDT, at 8. 68 Reiley Corrected WDT, at 21.



suppressed record company during Premium Access sessions during the 89-day experimental period.69

Third, users who upgraded to Plus or Premium subscription tiers during the experiment no longer received the suppression treatment. Despite being analyzed as part of the treatment group, these users received recordings from the suppressed record company as normal. According to Dr. Reiley’s produced data, [

]70

Fourth, many users were exposed to recordings from the suppressed record company due to technical issues with the implementation of the experiment. Dr. Reiley describes that users may have received recordings from the suppressed record company due to various software upgrades on June 13-16, 2019 and June 26, 2019 or due to “routine changes and updates in

ownership information for recordings” on other days during the experimental period.71 As shown in Rebuttal Appendix 1, [ ] were erroneously exposed to at least one spin from the suppressed record company on Pandora’s ad-supported radio service while the experiment was running.72

4. Dr. Reiley’s statements about the merits of “blinded” experiments do not apply in this context

Dr. Reiley argues that conducting experiments in a “blind” manner represents the “best method

for determining the causal impact of the manipulated experience.”73 He states that:

“Blind” means experimental subjects are unaware of their assignment to treatment versus control conditions, so that their

69 Tucker WRT, Appendix 1. 70 Tucker WRT, Appendix 1. 71 Reiley Corrected WDT, at 19-20. 72 Tucker WRT, Appendix 1. 73 Reiley Corrected WDT, at 3.

behavior is influenced only by the different policies adopted by Pandora for each group in the experiment, and not by any communication about the experiment itself (such as listeners changing their behavior in an attempt to influence the results of the experiment, or because of the perceived unfairness of their receiving a different treatment from other customers). We prefer, for scientific reasons, not to bring attention to the fact that consumers are receiving different experiences from each other, so in general they are not aware that they are part of an experiment.74

However, these reasons for making the experiment “blind” underscore why the experiment did not measure the correct effect here. In randomized field experiments, researchers sometimes worry that subjects may change their behavior simply because they are aware that they are participating in an experiment, particularly if they are aware of the purpose of the experiment

and what the researchers are looking for, rather than because of the treatment itself.75 Researchers refer to these effects as “Hawthorne effects,” named after a study of different ways of improving productivity at a factory in which the researcher noticed that productivity increased in both the control and treatment conditions simply because the workers knew they

were being studied.76 As a result, researchers sometimes prefer to keep participants “blind” to the fact that they are part of an experiment and the purpose of that experiment.

In this case, however, Dr. Reiley’s concerns about Hawthorne effects are misplaced and do not justify his choice not to inform users about the label suppression treatment. As Dr. Reiley explained in deposition, [

74 Reiley Corrected WDT, at 4. 75 Glenn Harrison & John List, Field Experiments, 42 Journal of Economic Literature 1034 (Dec. 2004). 76 Glenn Harrison & John List, Field Experiments, 42 Journal of Economic Literature 1034 (Dec. 2004).


].77 However, recent re-analysis of this data has questioned whether Hawthorne

effects even existed in the original Hawthorne experiments.78

Indeed, researchers such as myself emphasize that there is likely to be an empirical distinction between users who are aware of the precise treatment and those who are not. For example, my co-author and I ran a field experiment in which we used popularity information to reorder results on a webpage. In some treatment conditions, we made it clear that we had used

popularity information to reorder the results by giving data on the popularity of the links. In another condition, we simply reordered the links by popularity without alerting the users to the fact that we had done so. We found that users had a greater response to the “treatment” of re-

ranking by popularity when they were aware that was what we had done.79 In fact, the effect of knowledge of a treatment is so profound that many field experiments are designed to

measure the effect of knowledge of a treatment rather than the treatment itself.80 For example, several field experiments on tax non-compliance have examined the effect of informing

participants about the probability of audit on compliance behavior without changing the actual

probability of audit.81

The purpose of the Label Suppression Experiments is to estimate the real-world effects if Pandora stopped offering recordings by [ ]—an event which, as established in Section III.A.2 above, would be publicized by various sources and would be

77 SoundExchange Ex. 231 (Deposition of David Reiley), at 113:9-16 ([

]). 78 Steven Levitt & John List, Was There Really a Hawthorne Effect at the Hawthorne Plant? An Analysis of the Original Illumination Experiments, 3 American Economic Journal: Applied Economics 224-238 (Jan. 2011). 79 Catherine Tucker & Juanjuan Zhang, How Does Popularity Information Affect Choices? A Field Experiment, 57 Management Science 828-842 (May 2011). 80 Gordon Burtch, Anindya Ghose, & Sunil Wattal, The Hidden Cost of Accommodating Crowdfunder Privacy Preferences: A Randomized Field Experiment, 61 Management Science 949-962 (May 2015). 81 See, e.g., Luis Castro & Carlos Scartascini, Tax Compliance and Enforcement in the Pampas: Evidence from a Field Experiment, 116 Journal of Economic Behavior & Organization 65-82, 66, 69 (2015); see also Joel Slemrod, Marsha Blumenthal, & Charles Christian, Taxpayer Response to an Increased Probability of Audit: Evidence from a Controlled Experiment in Minnesota, 79 Journal of Public Economics 155-483 (2001).

known by many users. Therefore, while there may be reasons to keep users blind to the existence or purpose of the experiment, a properly designed experiment in this instance must make users aware of the treatment—i.e., the fact that music from a given record company is no longer available—to understand how they would react in reality. Otherwise, Dr. Reiley is simply measuring the effect of an experimental treatment that is meaningfully different from the conditions that users would experience in the real world. To the extent that Dr. Reiley

worried that users would alter their behavior if they knew they were participating in an experiment, he could have made users aware of the absence of recordings from [ ] while keeping them “blind” to the experiment itself and its purpose.

Dr. Reiley acknowledges the possibility that if users had been aware of the experiment, they may have intentionally “chang[ed] their behavior in an attempt to influence the results of the

experiment.”82 This is telling as it suggests that Dr. Reiley expected that participants would have been concerned about the results of the experiment if they understood the nature and

purpose of that experiment. If Dr. Reiley believes that suppression of [ ]

has a “relatively small impact on listening hours” as he claims,83 then his concerns that users would alter their behavior to drive the experiment’s results are unfounded.

Similarly, Dr. Reiley indicates that another reason for keeping the field experiment secret was to avoid the “perceived unfairness” of one group of customers receiving a different treatment

from other customers.84 This suggests that Dr. Reiley believed that users care about having recordings from [ ] and would prefer to have access to those recordings,

such that it could be perceived as “unfair” if one group of users has access and another does not. Dr. Reiley anticipated that if users had known about the treatment, they may have responded

82 Reiley Corrected WDT, at 4. 83 Reiley Corrected WDT, at 16. 84 Reiley Corrected WDT, at 4.


differently than they did in the experiment when they were unaware of the treatment. [

].85

B. The Label Suppression Experiments do not help us understand competitive effects

In general, blinded field experiments are not known by competitors and therefore cannot be used to measure competitive responses to the tested treatment. If Pandora were to stop offering music from [ ], its competitors could respond in a number of ways. For example, competitors would have incentives to publicize the resulting hole in Pandora’s catalog and emphasize that their own service compares favorably on the breadth and depth of its catalog. Competitors may also target existing Pandora users by running advertising campaigns or offering promotional prices emphasizing this gap to entice users to leave Pandora and switch to their own service. In addition, competitors could respond to Pandora’s

diminished catalog by changing their own offerings, introducing new offerings, and/or changing their prices. In deposition, Dr. Reiley [

].86 The Label Suppression Experiments, however, cannot measure the impact of these potential competitive responses as they were deliberately conducted in secret and not visible to competitors.

85 See also SoundExchange Ex. 231 (Deposition of David Reiley), at 162:23-163:4 ([

]). 86 SoundExchange Ex. 231 (Deposition of David Reiley), at 163:5-164:4 ([

]).


1. Field experiments cannot be used to measure competitive effects

A field experiment cannot be used to measure competitive effects, and therefore cannot be used to make market share predictions. Indeed, I highlight this as a major caveat of the use of field experiments in my teaching at MIT. As field experiments are small in scale and difficult to detect, competitors are typically unaware of what is occurring and consequently do not respond. If Pandora were to implement the “experiment” at a commercial scale in the real

world, competitors presumably would find out about it, and might react.87 Competitive

reactions could include pricing changes, new promotional campaigns, or even new products.

2. Streaming services are competitive because it is easy for users to switch

Competitive reactions are important because, as I discuss in my own research and teaching,

streaming services have become increasingly competitive.88 Although there may be reasons

why users might refrain from switching between paid streaming services in the short term,89

these reasons are less prominent for ad-supported services. There are also tools that reduce some of the nonfinancial costs associated with switching, for example by enabling users to transfer

playlists between certain services.90 As a result, users can move readily between services.

Furthermore, it is easy for a user to have multiple accounts on different services. For example, a user (such as myself) might easily have a Spotify account, a Google Music account,

87 Professor Shapiro recognizes but does not adjust for this deficiency of the Reiley experiments, noting: “the experiments do not account for the strategic responses of Pandora, the record company, and perhaps other industry participants, to the blackout. This factor is ambiguous; I make no additional adjustment based on this factor.” Shapiro Corrected WDT, at 19-20. 88 Catherine Tucker, Network Effects Matter Less Than They Used To, Harvard Business Review (June 22, 2018), https://hbr.org/2018/06/why-network-effects-matter-less-than-they-used-to. 89 Written Direct Testimony of Aaron Harrison, Sept. 22, 2019, at 14-16. 90 There are even apps that allow users to transfer playlists between Pandora and Spotify. “Soundiiz General Features,” Soundiiz, https://soundiiz.com/features; see also Amber Neely, How to transfer playlists from Spotify to Apple Music, Apple Insider (Aug. 18, 2019), https://appleinsider.com/articles/19/08/18/how-to-transfer-playlists-from-spotify-to- apple-music.


an Amazon music account, and a Pandora account. [

].91 In the language of platform economics, we call this behavior “multihoming”—where users have multiple homes on different platforms—and economic research has shown that

multihoming on both sides of a platform can lead to more intense competition.92 To understand this concept, it is useful to consider the example of Lyft and Uber. It is easy for passengers to

multihome by downloading both apps to their smartphones. This means that Uber and Lyft have to compete on price to keep passengers from simply using the other service exclusively. When users multihome across streaming services, those services have to compete for user engagement to

decrease their churn.93

91 See, e.g., SoundExchange Ex. 62, [ ], PANWEBV_00004249, at 00004314 ([ ]); SoundExchange Ex. 395, [ ], PANWEBV_00004469, at 00004499 ([ ]); see also SoundExchange Ex. 58, [ ], PANWEBV_00003357, at 00003407 ([ ]); SoundExchange Ex. 402, [ ], PANWEBV_00004024, at 00004054 ([ ]); SoundExchange Ex. 400, [ ], PANWEBV_00009100, at 00009139 ([

]); SoundExchange Ex. 288, [ ], SXMWEBV_00005444, at 00005447 ([ ]). 92 Jean-Charles Rochet & Jean Tirole, Platform Competition in Two-Sided Markets, 1 Journal of the European Economic Association 990-1029 (June 2003); see also, Marco Iansiti & Karim R. Lakhani, Managing Our Hub Economy, Harvard Business Review (Sept.–Oct. 2017), https://hbr.org/2017/09/managing-our-hub-economy (“If multihoming is common, the market is less likely to tip to a single player, preserving competition and diffusing value capture.”). 93 SoundExchange Ex. 206, [ ], PANWEBV_00005332, at 00005335 ([ ]); SoundExchange Ex. 205, [ ] (Jan. 2019), PANWEBV_00004571, at 00004604 ([ ]); see also SoundExchange Ex. 288, [ ], SXMWEBV_00005444, at 00005447([

]); SoundExchange Ex. 401, [ ], PANWEBV_00009182, at 00009185, 00009213 ([


Competitive pressures can also be seen in the increasing similarity of service offerings across different providers. For example, as I discussed in my Written Direct Testimony, Pandora has recently started to emulate Spotify’s personalized offerings. In September 2018, Pandora introduced “The Drop” on its premium subscription service. This feature, much like Spotify’s

“Release Radar” feature, curates new releases based on the individual user’s listening history.94 Pandora also introduced dozens of personalized playlists tailored to moods, activities, and

favorite genres, similar to Spotify’s popular curated playlists.95

Competitive pressures are also reflected in the prevalence of discounts offered by streaming services to attract users. For example, Spotify has run promotions offering

new users three months on its Premium subscription service for $0.99 per month.96 In 2018, Pandora partnered with T-Mobile on a promotion that gave T-Mobile customers a

year-long subscription to Pandora Plus for free.97 Amazon has also offered discounts on its

service.98 Several services, including Spotify, Apple Music, SiriusXM, and Pandora, offer

discounted student and family plans.99

]). 94 Tucker WDT, at 15-17. 95 Tucker WDT, at 16-17. 96 Hilda Scott, “Act Fast: Spotify Premium Just $1 for 3 Months,” Tom’s Guide, May 16, 2019, https://www.tomsguide.com/us/best-spotify-premium-deals,news-30097.html. 97 Pandora teams up with T-Mobile as an Un-carrier partner for unlimited ad-free music, Pandora Blog (Aug. 15, 2018), http://blog.pandora.com/us/pandora-teams-up-with-t-mobile-as-an-un-carrier-partner-for-unlimited-ad-free- music/. 98 See, e.g., Sally Kaplan, Amazon is offering 4 months of access to its music-streaming service for $1 as a Cyber Monday deal — here’s how to take advantage, Business Insider (Dec. 2, 2019), https://www.businessinsider.com/ amazon-music-unlimited-deal. 99 Pandora Premium Student, Pandora (2019), https://www.pandora.com/upgrade/premium/student; Pandora Premium for Families, Pandora (2019), https://www.pandora.com/upgrade/premium/family-plan?TID= PM:PSE:Google&gclid=Cj0KCQiAw4jvBRCJARIsAHYewPP_2y9rQDmMsvF09oDTAUp-fZkm7fU_ ixzim2U6mjrK0ku4n2nx0eYaApywEALw_wcB; Spotify Premium, Spotify, https://www.spotify.com/us/premium/ (last visited January 10, 2020); Apple Music, Apple, https://www.apple.com/apple-music/ (last visited January 10, 2020); I’m already a subscriber. Do I get a discount on additional subscriptions?, SiriusXM (2019), https://listenercare.siriusxm.com/app/answers/detail/a_id/3680/~/im-already-a-subscriber.-do-i-get-a-discount-on- additional-subscriptions%3F.


3. Competitors could respond to a reduction in Pandora’s catalog

As discussed in Section III.A.2 above, if Pandora were to lose access to [ ] music catalog, its competitors would have incentives to publicly promote this event to consumers. This type of competitive reaction is common in the broader content streaming industry. For example, Hulu has advertised that its service offered content that was

unavailable on Netflix, including popular shows such as Seinfeld and 30 Rock.100 Similarly, Spectrum, a cable company, advertises that it is the exclusive broadcaster of Los Angeles

Dodgers games.101

More generally, it is common for competitors to publicize each other’s weaknesses, as examples from other industries demonstrate. For example, Apple and Samsung have highlighted features missing from each other’s smartphones for several years. In September 2019, Samsung launched an advertising campaign coinciding with the release of the iPhone 11 “trying to tempt iPhone users to pick up the Galaxy Note 10 by highlighting

camera features the new iPhones don’t have.”102 Similarly, Apple and Microsoft have a long history of advertising each other’s weaknesses in the desktop and laptop space. A recent advertisement by Microsoft compares its Surface laptop to an Apple MacBook, highlighting

that the Microsoft Surface lasts longer, is faster, and has a better touch screen.103

100 Hulu (@hulu), Twitter (Sept. 16, 2019), https://twitter.com/hulu/status/1173724121726738433; Hulu (@hulu), Twitter (Oct. 4, 2017), https://twitter.com/hulu/status/915648154854404097; Hulu (@hulu), Twitter (Oct. 3, 2017), https://twitter.com/hulu/status/915294183363170306; Hulu (@hulu), Twitter (Oct. 3, 2017), https://twitter.com/hulu/ status/915283841098682368. 101 Bill Shaikan, For the sixth year in a row, most Dodgers fans can’t watch their team on television, L.A. Times (Mar. 8, 2019), https://www.latimes.com/sports/dodgers/la-sp-dodgers-20190308-story.html. 102 Michael Potluck, Samsung Galaxy ad uses missing iPhone 11 camera feature as bait to switch, 9to5Mac (Sept. 13, 2019), https://9to5mac.com/2019/09/13/samsung-iphone-11-missing-camera-feature/. 103 Lisa Eadicicco, Microsoft hired a man named Mac Book to star in its latest ad slamming Apple’s laptops, Business Insider (Aug. 1, 2019), https://www.businessinsider.com/microsoft-slams-apple-macbook-laptops-ad-2019-8; see also Seth Stevenson, Mac Attack: Apple’s mean-spirited new ad campaign, Slate (June 19, 2006) https://slate.com/business/2006/06/apple-s-mean-spirited-ad-campaign.html.


In addition, the affected record company and its artists may be motivated to publicize Pandora’s diminished catalog in an effort to encourage consumers to use alternative streaming services that offer their music instead. For example, Taylor Swift appeared in several advertisements promoting Apple Music in 2016 when Apple Music was the only service

streaming her music catalog.104

Negative advertisements from competitors, like those observed in other industries, and publicity from record companies and artists could substantially affect Pandora’s brand and reputation among consumers. Indeed, research has shown that comparative advertising can be

effective in damaging a targeted brand.105 Dr. Reiley’s Label Suppression Experiments cannot capture such effects.

C. Notwithstanding that the Label Suppression Experiments did not measure the treatment of interest, the experiments as conducted do not provide precise estimates of the effect of 100 percent label suppression

1. The Label Suppression Experiments were underpowered

In the case of the Label Suppression Experiments, Dr. Reiley uses a sample size of

approximately 15,000 listeners for each of the treatment groups for [ ].106 According to Dr. Reiley, this sample size was selected “based on the tradeoff between statistical power (more data gives more precise estimates) and the potential business impact of

exposing large groups of listeners to a full suppression [ ].”107 Dr. Reiley calculated, ex ante, that this sample size would give him an “80% probability of detecting a

104 Don Reisinger, Here’s the Latest Taylor Swift Apple Music Ad to Go Viral, Fortune (Apr. 18, 2016), https://fortune.com/2016/04/18/taylor-swift-apple-music/. 105 Simon P. Anderson, et al., Push‐Me Pull‐You: Comparative Advertising in the OTC Analgesics Industry, 47 RAND Journal of Economics 1029-1056 (Nov. 2016). 106 Reiley Corrected WDT, at 9. [ ]. 107 Reiley Corrected WDT, at 9.


statistically significant change in listening hours, relative to the control, if the true change were

4% in the [ ] suppression treatments.”108

As established above in Section III.A, the Label Suppression Experiments are not useful for measuring the real-world impact of losing access to [ ] catalog because, among other things, the participants were “blind” to the suppression and many users were unlikely to experience many suppressed recordings during the short experimental period.

Given these flaws, the Label Suppression Experiments could have measured, at best, a small effect on listener hours in the short timeframe in which the experiment was conducted. In reality, Pandora’s loss of access to [ ] catalog would have a much larger effect on listening hours.

However, the Label Suppression Experiments were underpowered to detect small effects over the three-month experimental period. Dr. Reiley’s experimental design does not adequately account for the highly skewed distribution of listening patterns in the population, meaning that

a large number of users in the treatment groups were likely to have very few recordings actually suppressed. Further, [

].109
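To illustrate the sample-size tradeoff that Dr. Reiley describes, the short Python sketch below computes the smallest difference in mean listening hours that a two-group design of this size could reliably detect at 80 percent power. The mean and standard deviation of listening hours used here are hypothetical placeholders (the actual figures are not in the public record), so the sketch shows only how the minimum detectable effect scales with the dispersion of listening hours and the number of users per group, not what Dr. Reiley’s own power calculation produced.

    # Illustrative minimum-detectable-effect calculation for a two-group field
    # experiment. The listening-hours mean and standard deviation below are
    # hypothetical placeholders, not values from the record.
    from scipy.stats import norm

    n_per_group = 15_000       # approximate size of each treatment group
    alpha, power = 0.05, 0.80  # two-sided test, 80 percent power

    mean_hours = 20.0          # assumed mean listening hours per user
    sd_hours = 40.0            # assumed standard deviation (a skewed distribution
                               # implies a large standard deviation relative to the mean)

    z_alpha = norm.ppf(1 - alpha / 2)
    z_power = norm.ppf(power)

    # Normal-approximation minimum detectable difference in means between one
    # treatment group and an equally sized control group.
    mde_hours = (z_alpha + z_power) * sd_hours * (2 / n_per_group) ** 0.5
    print(f"Minimum detectable effect: {mde_hours:.2f} hours, "
          f"or {100 * mde_hours / mean_hours:.1f}% of the assumed mean")

Under these placeholder values the smallest detectable change is roughly 6 percent of mean listening hours, larger than the 4 percent change Dr. Reiley targeted; the more dispersed the listening-hours distribution, the larger an effect must be before a design of this size can register statistical significance.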

Dr. Reiley’s previous work has highlighted the difficulty of establishing ex ante an appropriate sample size for a field experiment. In his paper about experiments in online brand advertising, Dr. Reiley and his co-author commented, “[w]e began this research project expecting that an experiment with more than a million customers would give precise statistical results—we now

108 Reiley Corrected WDT, at 9. 109 SoundExchange Ex. 231 (Deposition of David Reiley), at 138:23-24: ([ ]), 141:12-21 ([

]).


think otherwise.”110 Dr. Reiley and his co-author observed that “an economically significant (i.e., profitable) effect of advertising could easily fail to be statistically significant even in a

clean experiment with hundreds of thousands of observations per treatment.”111

A different study by two of Dr. Reiley’s co-authors found that because clicks on online advertisements are a rare event and individual-level sales are highly volatile, “informative advertising experiments can easily require more than ten million person-weeks” to detect the

effect of advertising”.112 The authors found that identifying a five percent return on investment for a campaign would require a sample so large that “the total U.S. population and the

advertiser’s annual advertising budget are binding constraints in most cases.”113

Consistent with this, Dr. Reiley’s published studies on advertising effects required samples many times larger than the sample sizes he selected for the Label Suppression Experiments. For example, to measure consumer sensitivity to audio advertising on Pandora’s service, Dr. Reiley conducted a randomized experiment over 21 months and included approximately 1.8

million Pandora users in each treatment group.114 Similarly, Dr. Reiley conducted an experiment to measure causal effects of online advertising for a major retailer with a treatment

group of approximately 1.3 million individuals.115 These sample sizes are approximately 86 to 2,300 times larger than the sample of approximately 15,000 users for each of the [

110 Randall Lewis & David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266, 238 (Sept. 2014). 111 Randall Lewis & David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266, 238 (Sept. 2014). 112 Randall Lewis & Justin Rao, The Unfavorable Economics of Measuring the Returns to Advertising, 130 Quarterly Journal of Economics 1941-1973, at Abstract, at 1941 (Nov. 2015). 113 Randall Lewis & Justin Rao, The Unfavorable Economics of Measuring the Returns to Advertising, 130 Quarterly Journal of Economics 1941-1973, 1964 (Nov. 2015). 114Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio (Apr. 21, 2018), https://ssrn.com/abstract=3166676, at Abstract, 3-4. 115 Randall Lewis and David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266, 239, 247 (Sept. 2014).


] treatment groups in the Label Suppression Experiments.116 Indeed, Dr. Reiley testified in deposition that [

]117

Further, as noted by Dr. Reiley, the samples used in the Label Suppression Experiments were considerably smaller than those used by Pandora’s witness in the Web IV proceedings, Dr. McBride, in a series of similar “Steering Experiments” that [

].118 Dr. McBride’s treatment groups for [ ] included approximately 5 to 8 percent of Pandora listeners, approximately 250 to 400 times

larger than those used in the Label Suppression Experiments.119 In deposition, Dr. Reiley noted that Dr. McBride’s [

]120 Dr. Reiley also

testified that [

]121

116 Calculated as 1.3 million / 15,000 = 86.67 and 35 million / 15,000 = 2,333.33. 117 SoundExchange Ex. 231 (Deposition of David Reiley), at 36. 118 Written Direct Testimony of Stephan McBride, Oct. 7, 2014, In re Determination of Royalty Rates and Terms for Ephemeral Recording and Digital Performance of Sound Recordings (Web IV) (“McBride Web IV WDT”). 119 Calculated as 0.05 / 0.0002 = 250 and 0.08 / 0.0002 = 400. Reiley Corrected WDT, at 9; McBride Web IV WDT, at 7. 120 SoundExchange Ex. 231, (Deposition of David Reiley), at 124:7-15 ([

]). 121 SoundExchange Ex. 231 (Deposition of David Reiley), at 124:17-125:9 ([


2. The Label Suppression Experiments were deliberately limited in scope because the researchers anticipated that the consequences of a larger experiment could negatively affect Pandora’s business

The Label Suppression Experiments were designed to minimize potential publicity of the removal of recordings from Pandora’s catalog by restricting the sample to approximately

15,000 users for each of the [ ].122 This was done because of “the potential business impact of exposing large groups of listeners to a full

suppression of [ ].”123 As Dr. Reiley described in his deposition, [

]124 In other words, [ ]. This is at odds with Dr. Reiley’s conclusions that “a near-total suppression of spins of any single record company does not lead to large decreases in the

number of listeners or the number of hours they spend listening to Pandora.”125

To the extent that Dr. Reiley argues that he was wary of a potential business impact simply because he was not aware ex ante that the effect of the label suppression treatment would be

]). 122 Reiley Corrected WDT, at 9. 123 Reiley Corrected WDT, at 9. 124 SoundExchange Ex. 231 (Deposition of David Reiley), at 33:20-34:7 ([

]). Dr. Reiley further indicated that [ ] SoundExchange Ex. 231 (Deposition of David Reiley), at 36:19-24 ([

]). 125 Reiley Corrected WDT, at 15.


small, he could have rerun the experiment on a larger sample after observing the effect size if

he believed that these results would hold on a larger scale.126,127

3. The Label Suppression Experiments struggled to attain precise estimates because they included many users who received little or no suppression treatment

Due to the design of Dr. Reiley’s experiment, many of the users in the treatment group were rarely, if ever, exposed to the treatment. As shown in Rebuttal Appendix 1, [

126 Further, Dr. Reiley could have stopped the experiment if the effect was larger than anticipated. For example, a medical experiment studying potential treatments for twin-to-twin transfusion syndrome was stopped earlier than planned because the treatment proved so effective. Marie-Victoire Senat, et al., Endoscopic Laser Surgery versus Serial Amnioreduction for Severe Twin-to-Twin Transfusion Syndrome, 351 New England Journal of Medicine 136- 144 (July 2004). 127 Indeed, Dr. Reiley testified that [

]. SoundExchange Ex. 231, (Deposition of David Reiley), at 130:10-18 ([

]). 128 Tucker WRT, Appendix 1. 129 Tucker WRT, Appendix 1.


130 Tucker WRT, Appendix 1. [

]. Shapiro Corrected WDT, at 26, 30. 131 Tucker WRT, Appendix 1. [

]. Shapiro Corrected WDT, at 26, 30. 132 Tucker WRT, Appendix 1. 133 Tucker WRT, Appendix 1. 134 Tucker WRT, Appendix 1.


135 Tucker WRT, Appendix 1. 136 Tucker WRT, Appendix 1. 137 Tucker WRT, Appendix 1. [

]. 138 SoundExchange Ex. 231 (Deposition of David Reiley), at 67:25-68:10 ([

]). Dr. Reiley also testified that [ ]. SoundExchange Ex. 231 (Deposition of David Reiley), at 64:13-65:9 ([

]).


139 SoundExchange Ex. 231 (Deposition of David Reiley), at 68:11-23 ([

]). As Dr. Reiley testified in deposition, [

]. Like Dr. Reiley, I am unable to observe such errors. To the extent that such errors exist, this would call into question the validity of the results. SoundExchange Ex. 231 (Deposition of David Reiley), at 49:25-50:8 ([

]), 58:20-24 ([ ]). 140 Tucker WRT, Appendix 1. 141 As noted in Section III.A, the “treatment” administered by Dr. Reiley is unrealistic and does not reflect important characteristics of Pandora’s loss of content from [ ] in the real world. 142 Tucker WRT, Appendix 1.


]

Overall, Dr. Reiley estimates that approximately [ ] of total spins in the [ ] treatment group were from the suppressed record company as compared to approximately [ ] of spins in the control group, meaning that the Label Suppression Experiments

achieved approximately [ ] suppression rather than the intended 100 percent.144 Dr. Reiley argues that [

]145

However, this claim is unsubstantiated and ignores literature suggesting that people can behave differently at the extremes of a distribution. For example, researchers have found substantial nonlinearities around zero price, such that significantly more participants choose the cheaper

143 Tucker WRT, Appendix 1. [

]. 144 Reiley Corrected WDT, at 22. 145 SoundExchange Ex. 231 (Deposition of David Reiley), at 140:5-16 ([

]).


option when it is offered for free than when it is set at a low positive price.146 Further, research on decision-making under uncertainty has demonstrated that people tend to respond differently to outcomes that are merely probable as compared to those that are certain, and tend to

overweight low probability events.147 Correspondingly, users may respond differently when they receive some recordings from the suppressed record company than when they receive no recordings from the suppressed record company.

4. Unlike his other work, Dr. Reiley’s analysis of the Label Suppression Experiments does not use data on the intensity of treatment

Unlike Dr. Reiley’s other published work, his analysis of the Label Suppression Experiments does not use data on the actual treatment administered—i.e., the number of recordings suppressed for each user. Dr. Reiley testified in deposition that [

].148 Given that the Label Suppression Experiments attempt to measure the effect of suppressing tracks on user behavior, [

]. While Dr. Reiley said that the Label Suppression Experiments were conducted in accordance with how Pandora regularly conducts experiments, it is clear that this departs from procedures used in Dr. Reiley’s published work, including a study conducted on Pandora users.

146 Kristina Shampanier, Nina Mazar, & Dan Ariely, Zero as a Special Price: The True Value of Free Products, 26 Marketing Science 742-757 (Nov.-Dec. 2007). 147 Daniel Kahneman & Amos Tversky, Prospect Theory: An Analysis of Decision Under Risk, 47 Econometrica 263- 292 (Mar. 1979). 148 SoundExchange Ex. 231 (Deposition of David Reiley), at 155:19-25 ([

]), 198:21-199:8 ([

]).


In 2018, Dr. Reiley and his co-authors published a study of the long-run effect of increasing

ad-load on consumer listening behavior on Pandora’s service (the “Ad-Load Experiments”).149 In these experiments, Dr. Reiley and his co-authors collected data on the number of ads shown to each user (i.e., the “intensity of treatment”). The major focus of the empirical work, and the method by which the authors increased the precision of their estimates, involved estimating a

specification that measured the effect of the intensity of the treatment on listening behavior.150

The authors explained the importance of using this analytical approach, noting that this “guarantees that we exploit only the experimental differences between groups to identify the causal effects, removing all within-group variation that might yield spurious correlation—for example, urban listeners might both receive more ads and listen fewer hours, on average, than

rural listeners, even though this correlation is not at all causal.”151
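The general approach described in this passage can be sketched in a few lines of Python: the randomized group assignment serves as an instrument for the realized intensity of treatment, so that only experimentally induced variation in intensity is used to estimate the causal effect. The data below are simulated and the variable names are my own; the published specification differs in its details.

    # A minimal sketch of an intensity-of-treatment analysis: randomized
    # assignment instruments for the realized number of treated units each user
    # received (here, suppressed spins; in the Ad-Load paper, ads served).
    # The data are simulated for illustration only.
    import numpy as np
    import statsmodels.api as sm

    rng = np.random.default_rng(0)
    n = 30_000
    assigned = rng.integers(0, 2, size=n)                  # 1 = treatment group, 0 = control
    baseline = rng.lognormal(mean=2.0, sigma=1.0, size=n)  # heterogeneous baseline listening

    # Realized intensity: only assigned users are treated, and heavier listeners
    # mechanically encounter more of the treatment, mirroring the within-group
    # correlation described in the quoted passage.
    intensity = assigned * rng.poisson(lam=0.5 * baseline)

    # Outcome: listening hours fall with each unit of treatment received.
    hours = baseline - 0.2 * intensity + rng.normal(0.0, 1.0, size=n)

    # Stage 1: predict intensity from the randomized assignment alone.
    stage1 = sm.OLS(intensity, sm.add_constant(assigned)).fit()

    # Stage 2: regress the outcome on predicted intensity. (In practice the
    # second-stage standard errors should come from a proper 2SLS routine.)
    stage2 = sm.OLS(hours, sm.add_constant(stage1.fittedvalues)).fit()
    print(stage2.params)  # slope recovers roughly the -0.2 causal effect per unit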

In another experiment published in 2014, Dr. Reiley and a co-author examined the effects of a

nationwide retailer’s online advertising campaign on Yahoo! on offline sales.152 The authors

collected data on the number of ads from the campaign shown to each user in the treatment group, allowing them to identify the amount of treatment received by each individual. The authors used this information in their empirical analysis to estimate the treatment effect on the treated (i.e., the individuals in the treatment group who were exposed to at least one ad from the relevant campaign) and to examine how the treatment effect varied with the number of ads

seen.153

149 Jason Huang, David Reiley, & Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 1 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 150 Jason Huang, David Reiley, & Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 9-10 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. These estimates are also cited in the abstract and conclusion of the paper. 151 Jason Huang, David Reiley, & Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 8 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 152 Randall Lewis and David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266 (Sept. 2014). 153 Randall Lewis and David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266, 246-248, 260 (Sept. 2014).

In contrast, in the Label Suppression Experiments, the treatment is the number of recordings suppressed. While Dr. Reiley collected data on the number of spins from the suppressed record company that were erroneously delivered to each user, [ ]. As such, he is unable to use the intensity of treatment in his analysis, as he has done in previous studies.

D. The Label Suppression Experiments are not useful for estimating true long-run effects

Dr. Reiley conducted the Label Suppression Experiments over an 89-day experimental period, from June 4 to August 31, 2019. That is insufficient to assess the anticipated real-world effect of losing access to content from [                    ] that Dr. Reiley was attempting to measure. The main effect of such a change to the Pandora service likely would be to deter potential future Pandora ad-supported users and to increase the likelihood that existing Pandora ad-supported users would leave the service at some point in the future to switch to a

rival service such as Spotify or Apple Music. This is a long-run process.

That is, in part, because it would likely take time for many current and potential Pandora listeners to learn about the compromised quality of Pandora’s webcasting service. Consumers who noticed that the service offered less than before may have thought the change was temporary and may have been waiting for things to return to normal, given the short length of the experiment. Consumer learning can lead to substantial differences in the measured effect

of a treatment over time.154

154 See, e.g., Ronny Kohavi, Trustworthy Online Controlled Experiments and the Risks of Uncontrolled Observational Studies, Microsoft, at 15 (2019), https://exp-platform.com/Documents/2019-08%20CausalDiscoveryExPRonnyK.pdf (noting the importance of designing experiments to address “customer lifetime value, not immediate revenue.”). Indeed, the large and significant increase in thumbs-down activity observed in the [ ] treatment group may be a leading indicator of future reduced listening behavior, providing potential evidence that long-run effects are larger than short-run effects. Given Dr. Reiley’s small sample sizes and the short timeframe of the experiment, this type of effect may not have been detectable for the other tested record companies, [ ]. Reiley Corrected WDT, at 14-15. Research suggests that short-term measures of


As such, the effects of Pandora’s loss of content on its business would likely develop over time. These long-run effects would be considered by both parties in a negotiation between a willing buyer and willing seller. Further, because the Web V proceeding will determine the statutory rate for a five-year time period, it is important to consider the potential cost of not having a license over a comparable period.

Because they were conducted for only 89 days, Dr. Reiley’s Label Suppression Experiments cannot measure the long-run effects of Pandora’s loss of content from [ ]. As Dr. Reiley testified in deposition, [

]155 Dr. Reiley explained:

[

].156

Academic research has highlighted the importance of measuring long-run effects and has cautioned against extrapolating observed short-run effects without understanding long-run processes. For example, a 2004 study examined the effect of price promotions on purchasing behavior over a two-year period and found that deeper price discounts increased future purchases by first-time customers (a positive long-run effect) but reduced future purchases by established customers (a negative long-run effect). The authors warned that “[i]f firms focus solely on short-run elasticities, or they fail to distinguish between first-time and established

consumer satisfaction can provide useful proxies for long-term effects of a treatment. Henning Hohnhold, Deirdre O’Brien, & Diane Tang, Focusing on the Long-term: It’s Good for Users and Business (2015), http://static.googleusercontent.com/media/research.google.com/en//pubs/archive/43887.pdf. 155 SoundExchange Ex. 231 (Deposition of David Reiley), at 39:7-11 ([

]). 156 SoundExchange Ex. 231 (Deposition of David Reiley), at 121:5-11.


customers, then prices may be set incorrectly.”157 Another study, conducted by three Google data scientists in 2015 and identified as recommended reading by Dr. Reiley for his “Field Experiments” class at UC Berkeley, similarly warns that “the short-term effect is not always

predictive of the long-term effect.”158

Dr. Reiley’s prior work has also highlighted the importance of directly measuring long-run effects. For example, in his co-written paper on the long-run effect of advertising on listener-

hours (the “Ad-Load Experiments”), Dr. Reiley emphasized the importance of distinguishing

between long-run and short-run effects and why the measurement of short-run effects cannot be

extrapolated to the long run.159 In that paper, Dr. Reiley noted “how important it is that we ran the experiment for over a year… the treatment effect grows over the course of an entire

157 Eric Anderson & Duncan Simester, Long-Run Effects of Promotion Depth on New Versus Established Customers: Three Field Studies, 23 Marketing Science 4-20, at 1, 5 (Winter 2004). 158 Henning Hohnhold, Deirdre O’Brien, & Diane Tang, Focusing on the Long-term: It’s Good for Users and Business, at 1 (2015), https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/43887.pdf; SoundExchange Ex. 404, David Reiley, Field Experiments (2015), https://docs.google.com/document/d/ 1BDUxgzEk1vWXMiV2nzpZzoa7Hyp8I9LBiYmqA9XtZpo/edit; SoundExchange Ex. 376, David Reiley, Suggestions for Further Reading, W241: Experiments and Causality (2015), https://docs.google.com/document/d/ 1IMsGTHmklhvetfJJfEm9dhoFM7bvb-YOkN_6mAM8kFM/edit#. Similarly, another recommended reading for Dr. Reiley’s course warns that partial equilibrium effects may differ from full equilibrium effects (i.e., when the treatment is rolled out to everyone and individuals have time to adjust their behavior in response). Dr. Reiley described “Sometimes experiments only manage to estimate ‘partial equilibrium’ effects instead of ‘general equilibrium’ effects…That is, sometimes the treatment has one effect when only a few people are being treated (as in an experiment), but when everyone is being treated (as a policy is rolled out to everyone), the total effects are quite different because, for example, market prices change.” David Yanagizawa-Drott & Jakob Svensson, Estimating Impact in Partial vs. General Equilibrium: A Cautionary Tale from a Natural Experiment in Uganda (Aug. 2012), https://epod.cid.harvard.edu/sites/default/files/2018-02/estimating_impact_in_partial_vs._general_equilibrium-_a_ cautionary_tale_from_a_natural_experiment_in_uganda.pdf. More generally, see, e.g., Deepa Chandrasekaran & Gerard J. Tellis, “Chapter 14: A summary and review of new product diffusion models and key findings” in Handbook of Research on New Product Development (Peter Golder and Debanjan Mitra eds., Edward Elgar Publishing, 2018); Andrew Baker & Naveen Naveen, “Chapter 15: Word-of-mouth processes in marketing new products: recent research and future opportunities” in Handbook of Research on New Product Development (Peter Golder and Debanjan Mitra eds., Edward Elgar Publishing, 2018). 159 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 10 (Apr. 21, 2018), https://ssrn.com/abstract=3166676 (“How much does it matter that we conducted a long-run rather than a short-run experiment? To see how our estimates change with longer exposure to the treatment, we run a 2SLS regression for each month of the experiment as if that month were the final one… The estimated effects of a 1% increase in ad load, on hours and days active, respectively, start out at around - 0.02% and -0.025%, slowly increasing to effects of -0.070% and -0.076%. Had we run an experiment for just a month or two, we could have underestimated the true long-run effects by a factor of 3.”).


year, stabilizing for the most part only after 12-15 months of treatment.”160 Dr. Reiley and his co-authors concluded that “[h]ad we run an experiment for just a month or two, we could have

underestimated the true long-run effects by a factor of 3.”161

In fact, Dr. Reiley’s Ad-Load Experiment indicates that a three-month treatment period, with measurement for only 28 days, may not have been able to measure a statistically significant effect. While the results of the Ad-Load Experiment suggest that the long-run effects on

listening hours are large and robust, it is not clear that the experiment would have returned a result that was statistically different from zero if the treatment effect had only been calculated

for the first 28 days of the experiment.162
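To make the point in footnote 162 concrete, the following sketch computes the confidence interval implied by a first-month point estimate of the size described there. The 0.35 percent point estimate is an assumed value consistent with “less than 0.5 percent”; the standard error is taken from the approximately 0.22 to 0.24 percent range cited in that footnote, and the conclusion depends on where within those ranges the true figures fall.

    # Rough significance check using values of the size described in footnote 162.
    # The point estimate is assumed (somewhere below 0.5 percent); the standard
    # error reflects the approximate 0.22-0.24 percent range cited there.
    from scipy.stats import norm

    point_estimate = 0.35   # percent change in listening hours (assumed)
    standard_error = 0.23   # percent (approximate)

    z = point_estimate / standard_error
    ci_low = point_estimate - 1.96 * standard_error
    ci_high = point_estimate + 1.96 * standard_error
    p_value = 2 * (1 - norm.cdf(abs(z)))

    print(f"z = {z:.2f}, 95% CI = [{ci_low:.2f}%, {ci_high:.2f}%], p = {p_value:.2f}")
    # With these values the interval spans zero, so a first-month effect of this
    # size would not be statistically significant at the 5 percent level.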

Importantly, whereas in his prior study Dr. Reiley observed that “the true long-run effects” were larger than short-run effects “by a factor of 3,” he did not claim, either in his prior published work or in his testimony in this matter, that this relationship is a general

rule or that it can be generalized to completely different situations.163 In fact, Dr. Reiley testified

at deposition that [

]164 He explained that [

160 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 7 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 161 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 10 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 162 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 7-8 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. Specifically, based on Figure 4, the effect on listening hours in the first month of the treatment appears to be less than 0.5 percent. Table 4 indicates that the standard errors measured during the final month of the treatment were approximately 0.22 to 0.24 percent. Assuming these standard errors reasonably approximate (or understate) the standard errors in the first month of treatment, it is not clear that the effect on listener hours in the first month of treatment is statistically significantly different from zero. 163 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 10 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 164 SoundExchange Ex. 231 (Deposition of David Reiley), at 122:15-23 ([

]).


]165

IV. Professor Shapiro Presents Insufficient Analysis to Conclude that No Label is a “Must-Have”

In his Corrected Written Direct Testimony, Professor Shapiro uses results from the Label Suppression Experiments as an input to a model estimating the royalty rates that noninteractive services would be willing to pay to digitally perform record companies’ sound recordings.

Professor Shapiro uses the [ ] experiment (which shows the largest estimated impact of

the label suppression treatment on listening hours) for [ ].166 Professor Shapiro uses the upper bound of the 95 percent confidence interval from the [ ] experiment (which he claims is to account for the fact that participants “were presumably not aware of the blackout” and for “certain imperfections in the experimental

implementation”) and multiplies by a factor of three (which he claims is to address the short- term nature of the Label Suppression Experiments), to estimate the record company’s opportunity cost of licensing its music to a statutory webcaster, as well as each party’s gains

from trade from a licensing deal.167 These estimates are inputs into Professor Shapiro’s bargaining

model, which he uses to estimate a reasonable royalty rate.168
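For illustration, the sketch below traces the two adjustments just described, applied to purely hypothetical numbers: the upper bound of a 95 percent confidence interval is taken from an experimental estimate, and that bound is then tripled to stand in for long-run effects. The point estimate and standard error are placeholders; the actual figures are redacted, and the full calculation appears in Professor Shapiro’s Appendix F.

    # Hypothetical illustration of the adjustment chain described above; none of
    # these numbers come from the record.
    point_estimate = 0.010   # estimated share of listening hours lost (hypothetical)
    std_error = 0.004        # hypothetical standard error

    ci_upper = point_estimate + 1.96 * std_error   # upper end of the 95% confidence interval
    long_run_share_lost = 3 * ci_upper             # short-run result scaled by a factor of three

    print(f"Point estimate:        {point_estimate:.1%}")
    print(f"Upper 95% CI bound:    {ci_upper:.1%}")
    print(f"Assumed long-run loss: {long_run_share_lost:.1%}")

The resulting share of lost listening is the quantity that then drives the opportunity-cost and willingness-to-pay calculations discussed in Section IV.A below.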

165 SoundExchange Ex. 231 (Deposition of David Reiley), at 121:20-122:14 ([

]). Dr. Reiley indicated that [                                   ] This does not appear to be the case, as demonstrated by the examples of other long-run experiments discussed in this section. 166 Shapiro Corrected WDT, at 19, 22. 167 Shapiro Corrected WDT, at 16-20, 27, Appendix F. 168 Shapiro Corrected WDT, at 23-27, 29-30, Appendix F.

Because Professor Shapiro’s analysis of reasonable royalty rates relies heavily on the results of the Label Suppression Experiments, errors in these experiments render Professor Shapiro’s reasonable royalty estimates flawed and unreliable. These errors are compounded by Professor Shapiro’s misuse of the data, unfounded and ad hoc assumptions, and inappropriate extrapolations.

A. Professor Shapiro’s estimates rely heavily on the Label Suppression Experiments

Professor Shapiro uses results from the Label Suppression Experiments to estimate reasonable royalty rates paid by noninteractive services for digital performances of sound recordings.

Professor Shapiro uses a willing buyer, willing seller framework,169 in which, under one of his two approaches, the bargaining approach, he determines the reasonable royalty as the mid-point of (1) Pandora’s willingness to pay to obtain access to a record company’s music and (2)

a record company’s opportunity cost of licensing its music to a statutory webcaster.170 Both elements of that calculation—Pandora’s willingness to pay and the record company’s

opportunity cost—crucially depend on the estimated fraction of plays that Pandora would lose if it cannot play recordings from a particular record company, an input that comes directly out

of the Label Suppression Experiments.171 As Pandora’s estimated lost performances increase, Pandora’s willingness to pay and the record company’s opportunity cost also increase,

ultimately driving up Professor Shapiro’s calculated reasonable royalty.172

Professor Shapiro explained the importance of Pandora’s estimated loss in performances if it lost access to a record company’s catalog (a number that is derived from the Label Suppression Experiments) to determining the opportunity cost:

[A]ll else equal, the opportunity cost to a record company of licensing its music to a statutory webcaster is proportional to the

169 Shapiro Corrected WDT, at 3. 170 Shapiro Corrected WDT, at 23. 171 Shapiro Corrected WDT, at 18-20, 26 and Appendix F, p. 1-3. 172 Shapiro Corrected WDT, at Appendix F, p. 1-3.

share of listening hours that the statutory webcaster would lose if they were not able to play that record company’s music. This economic fact is fundamental for the setting of reasonable per-performance royalty rates using the opportunity cost methodology. To see why, suppose that we calculate the opportunity cost to a given record company of licensing its repertoire to a statutory webcaster as $0.0025 per performance, under the assumption that this record company is “must-have” for the statutory webcaster. Next, suppose that we then learn that in fact the statutory webcaster would lose only 20% of its listener hours (not 100%, as with a “must-have” label) if it were unable to play this record company’s music. Then the true opportunity cost for this record company would be only 20% as large as we had previously estimated, namely $0.0005 per performance, not $0.0025 per performance.173
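The proportionality Professor Shapiro describes in this passage reduces to a one-line calculation; the figures below are the ones used in his own example ($0.0025 per performance for a “must-have” label and a 20 percent loss of listener hours).

    # The scaling described in the passage quoted above: opportunity cost is
    # proportional to the share of listening hours the webcaster would lose.
    must_have_opportunity_cost = 0.0025   # dollars per performance if the label were "must-have"
    share_of_hours_lost = 0.20            # 20 percent of listener hours, per the quoted example

    opportunity_cost = must_have_opportunity_cost * share_of_hours_lost
    print(f"${opportunity_cost:.4f} per performance")   # $0.0005, matching the quoted example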

Ultimately, based on his application of the results of the Label Suppression Experiments, Professor Shapiro assumes that Pandora would lose at most [                    ] of its listener hours

if it lost access to [ ].174 Based on this “new evidence,” he concludes that “no individual record company is even close to being “must-have” for Pandora’s

advertising-supported webcasting service.”175

However, because this “new evidence” comes from flawed experiments, Professor Shapiro’s conclusions, which flow directly from the results of these experiments, are also flawed. This implies that Professor Shapiro’s reasonable royalty estimates are unreliable and appear highly deflated.

B. Professor Shapiro’s ad hoc corrections to the estimates from the Label Suppression Experiments do not result in a conservative application of those estimates

Professor Shapiro appears to be aware of some of the deficiencies in the Label Suppression Experiments and performs some ad hoc corrections to attempt to deal with them. However, the

173 Shapiro Corrected WDT, at 15 (emphasis added). 174 Shapiro Corrected WDT, at 26. 175 Shapiro Corrected WDT, at 12.

ad hoc corrections are arbitrary and do not address the fundamental limitations of the Label Suppression Experiments.

For example, Professor Shapiro notes that one limitation of the Label Suppression Experiments is that listeners may not have been aware of the label suppression treatment:

[L]isteners were presumably not aware of the blackout, and they might react more strongly if they were aware. I account for this factor, and for certain imperfections in the experimental implementation that are discussed in Appendix E, by applying the upper end of the 95% confidence interval from the Label Suppression Experiments to obtain a range of negotiated rates.176

While Professor Shapiro presents the use of the upper bound of a 95 percent confidence interval as a conservative estimate of the real-world impact of Pandora’s failure to reach an agreement to license the catalog of a particular record company, he has not pointed to any evidence to support this claim. Dr. Reiley’s analysis is flawed and generates a biased estimate that likely vastly understates the effect of interest. Professor Shapiro’s ad hoc “correction” is untethered

to any valid procedure to produce reliable field experiment estimates.177 Far from producing a “conservative estimate,” Professor Shapiro produces an arbitrary estimate, which likely continues to understate the true effect of suppression.

Professor Shapiro also appears to be aware that the Label Suppression Experiments do not provide estimates of the long-run effects of label suppression that he requires, and he extrapolates the short-term results of the Label Suppression Experiments by assuming that the

176 Shapiro Corrected WDT, at 19. 177 The Label Suppression Experiments do not provide a useful guide to the effect on listening hours when users are aware of the missing recordings. Dr. Reiley and Professor Shapiro cannot ascertain how many users in the treatment groups, if any, independently realized that their Pandora ad-supported service was no longer playing recordings from the suppressed record company. Professor Shapiro’s use of the upper bound of the 95 percent confidence interval does not correct for the fact that the experiment did not measure the treatment of interest, and therefore his adjusted estimates do not provide insight into the true effect of the correct treatment.


long-term effects are approximately three times larger than the observed short-term effects.178 Professor Shapiro, however, provides no legitimate support for why this relationship, which was obtained from a different experiment involving a different treatment and a different experimental design, is applicable here. Professor Shapiro’s assumption is based on a previous paper co-authored by Dr. Reiley showing that the short-term effects of a treatment on listener

behavior can differ substantially from long-term effects.179 As discussed in Section III.D, Dr.

Reiley did not demonstrate that long-term effects are typically three times greater than short-term effects, or that the speed with which listeners react to label suppression experiments has

any relationship to the speed with which listeners react to advertising experiments.180 Indeed, Dr. Reiley presents no empirical basis to conclude that the relationship between time and treatment effect is linear rather than exponential. As such, Professor Shapiro has no way to know whether his ad hoc adjustment adequately captures the potential increase in effect size for the label suppression treatment over time.
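To illustrate how much the choice of time path matters, the sketch below extrapolates the same hypothetical three-month effect over the five-year statutory period under two assumed shapes: one that grows and then stabilizes after roughly 12 to 15 months (the pattern reported in the Ad-Load Experiments) and one that compounds as churn accumulates. Both paths and the 1 percent starting effect are hypothetical; the point is only that the implied long-run multiplier depends entirely on an assumption the Label Suppression Experiments cannot test.

    # Hypothetical extrapolations of a short-run effect under two assumed time
    # paths; the three-month effect size and both growth paths are placeholders.
    short_run_effect = 0.01   # assumed effect after ~3 months (1% of listening hours)
    horizon_months = 60       # five-year statutory rate period

    # Path 1: effect grows linearly from the 3-month level and stabilizes at
    # three times that level after roughly 15 months.
    saturating = short_run_effect * min(3.0, 1 + 2 * (horizon_months - 3) / 12)

    # Path 2: effect compounds as churn accumulates, here at 5% per month.
    compounding = short_run_effect * 1.05 ** (horizon_months - 3)

    print(f"Saturating path:  {saturating:.1%} of listening hours "
          f"({saturating / short_run_effect:.1f}x the three-month effect)")
    print(f"Compounding path: {compounding:.1%} of listening hours "
          f"({compounding / short_run_effect:.1f}x the three-month effect)")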

C. Professor Shapiro ignores the additional effects of losing access to content from [ ] on Pandora’s underlying business model

As I explain in my Written Direct Testimony, the value of services such as Pandora is driven

by the underlying unit economics of each customer.181 In cases when unit economics drive profits, it is important for companies to focus on all of the metrics that affect customer lifetime value, not just short-run changes, such as shifts in users or subscribers, that have an immediate

178 Reiley Corrected WDT, at 24-25; Shapiro Corrected WDT, at 19. 179 Jason Huang, David Reiley, & Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 8, 11 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 180 [ ]. SoundExchange Ex. 231 (Deposition of David Reiley), at 122:15-23 ([

]). 181 Tucker WDT, at 12-13.


effect on revenues.182 These factors include the cost of acquiring a customer, the likelihood of retaining a customer, and the revenue generated from each customer. However, Professor Shapiro’s use of the Label Suppression Experiments only considers a loss of listening hours. He does not try to calculate the likely effects of Pandora’s failure to reach an agreement with a record company on other key profit drivers.
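To illustrate, the following is a stylized customer lifetime value sketch (hypothetical numbers, not a model used by any party in this proceeding) showing how acquisition cost, retention, and per-customer revenue jointly determine per-customer profit:

```python
def lifetime_value(monthly_revenue, contribution_margin, monthly_churn, acquisition_cost):
    """Stylized LTV: expected lifetime of 1/churn months, times monthly contribution, net of acquisition cost."""
    expected_lifetime_months = 1.0 / monthly_churn
    return monthly_revenue * contribution_margin * expected_lifetime_months - acquisition_cost

# Hypothetical baseline ad-supported listener.
baseline = lifetime_value(monthly_revenue=1.50, contribution_margin=0.40,
                          monthly_churn=0.03, acquisition_cost=5.00)

# Hypothetical post-blackout listener: lower ad revenue, higher churn, costlier acquisition.
degraded = lifetime_value(monthly_revenue=1.30, contribution_margin=0.40,
                          monthly_churn=0.04, acquisition_cost=6.00)

print(f"baseline lifetime value: ${baseline:.2f}")  # $15.00
print(f"degraded lifetime value: ${degraded:.2f}")  # $7.00
```

Under these assumed inputs, modest changes in churn and acquisition cost cut per-customer profit by more than half, even before any reduction in listening hours is taken into account.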

In reality, Pandora’s loss of access to [ ] music catalog would affect

other key metrics in addition to listening hours.183 First, the loss of recordings from [ ] likely would make it more expensive for Pandora to acquire customers, as Pandora would be competing with a degraded service against interactive services that offer a full repertoire of music from [ ]. As discussed above in Section III.A.2, Pandora’s competitors would have incentives to publicize the deficiencies in Pandora’s music catalog to consumers. This publicity may have the greatest effect on potential new customers, who are particularly likely to seek information on the relative strengths and

weaknesses of competing services.184 This implies that Pandora would likely find it harder to attract new users and may need to increase its spending on promotions and incentives to attract new customers.

Second, Pandora’s loss of access to [ ] catalog would likely make it harder for Pandora to retain customers. That is, Pandora would run the risk that, rather than

just reducing the amount of time spent listening to the service, some users may churn and never use the service again. This is especially true if Pandora’s diminished music library incentivizes

182 See, e.g., Ronny Kohavi, Trustworthy Online Controlled Experiments and the Risks of Uncontrolled Observational Studies, Microsoft, at 15 (2019), https://exp-platform.com/Documents/2019-08%20CausalDiscoveryExPRonnyK.pdf (noting the importance of designing experiments to address “customer lifetime value, not immediate revenue.”). 183 Dr. Reiley testified that [ ]. SoundExchange Ex. 231 (Deposition of David Reiley), at 166:8-16 ([

]). 184 Further, new users who have not built up a history of listening data or curated personalized stations over time may be quickly disappointed by the service’s failure to play recordings from the artist used to seed a new station if that artist’s music is associated with the suppressed record company.

customers to sign up for a competing service instead of continuing to use Pandora. As a result, Pandora may react to these circumstances by increasing its spending on promotional campaigns and/or by providing customers with other incentives to continue using its service. Furthermore, this is likely to be a long-run effect that cannot be captured by a 90-day experiment.

Third, Pandora’s loss of access to [ ] catalog may reduce advertisers’ incentives to advertise, and willingness to pay for advertising spots, on Pandora’s now degraded service. Advertisers might prioritize other competing services and some may stop

advertising on Pandora altogether.185 As a result, in addition to reducing listening hours,

Pandora’s diminished music library could also result in lower revenue per listening hour.186 Consistent with this, recently shuttered music service 8tracks explained in a blog post that the

value of advertising spots on its service fell as its listener base declined.187

Fourth, the loss of access to [ ] content may affect Pandora’s ability to effectively upsell customers of its ad-supported service to its Plus and Premium offerings, as well as to Sirius XM’s offerings. Pandora’s executives have commented on the importance

of “funneling” its ad-supported users to more profitable paid subscriptions,188 emphasizing that ad-supported customers are a main source for subscribers for its Plus and Premium services

and are “virtually free of acquisition cost.”189 Dr. Reiley similarly testified [

]190 In addition, as discussed in my Written Direct

185 The loss of advertisers was one of the problems faced by Myspace as it lost users to rival Facebook. See, e.g., Dominic Rushe, Myspace sold for $35m in spectacular fall from $12bn heyday, The Guardian (June 30, 2011), https://www.theguardian.com/technology/2011/jun/30/myspace-sold-35-million-news. 186 This is measured by Pandora as RPM, or revenue per thousand listening hours. 187 David Porter, To Everything There is a Season, 8tracks Blog (Dec. 26, 2019), https://blog.8tracks.com/. 188 For example, Pandora executives described its plans to “leverage [its] existing audience to attract subscribers” as a central component of its “go-to-market strategy” for its Pandora Premium on-demand subscription tier. Pandora Media Inc. Q4 2016 Earnings Call Transcript, Feb. 9, 2017, at 3. 189 See, e.g., Pandora Media Inc. Q4 2016 Earnings Call Transcript, Feb. 9, 2017, at 3. 190 SoundExchange Ex. 231 (Deposition of David Reiley), at 40:6-15 ([


Testimony, Sirius XM’s acquisition of Pandora was, in large part, motivated by cross-selling and upselling opportunities based, in part, on Pandora’s large number of active ad-supported

listeners.191 A loss in listening hours may translate into a reduction in the spillover gains from

upsell opportunities, and is likely to have cascading effects on Pandora’s business.192

Losing access to [                                ] catalog would influence not only Pandora’s ad-supported listener hours, but also Pandora’s unit economics and overall profitability in ways

not captured simply by direct losses of listener hours. The result of these additional effects on Pandora’s business, none of which is addressed by the Label Suppression Experiments or otherwise by Professor Shapiro, would be to decrease Pandora’s ability to attract, retain, and monetize customers and/or increase the costs of servicing them. As a result, not only would the real-life suppression of [ ] decrease listening hours and active listeners on Pandora’s ad-supported service, it also likely would have a substantial further effect on lifetime profit per user, and would threaten the viability of Pandora’s business more

severely than is suggested by just focusing on the effect of reduced listener hours.

Indeed, shifts in these variables (outside of listening hours) tend to produce profound shifts in the value of a business, as demonstrated by share price movements of other web-based companies. For example, Spotify’s share price fell five percent in July 2019 after Spotify reported that monthly active users and Premium subscribers missed expectations by 4.1 percent and 0.5

]). See also, SoundExchange Ex. 399, [ ], PANWEBV_00008435 ([

]. 191 Tucker WDT, at 45-46. 192 The founders of 8tracks identified this as one reason why they decided to shut down, explaining that “the steady decline in our free, ad-supported audience resulted in a smaller base of active listeners that might eventually be converted to 8tracks Plus, our ad-free subscription offering.” David Porter, To Everything There is a Season, 8tracks Blog (Dec. 26, 2019), https://blog.8tracks.com/.


percent, despite exceeding revenue expectations.193 In October 2019, Spotify’s share price

increased 12 percent when Spotify reported higher-than-expected subscribers and profits.194 Similarly, social network Twitter has experienced several steep declines in its share price driven by its failure to meet expectations of active user growth. For example, in October 2015, Twitter’s share price fell by as much as 13 percent on the day it announced earnings, a reaction that some third-party commentators largely attributed to falling short of forecasted monthly

active user numbers by approximately 1.2 percent.195 Professor Shapiro’s methodology, in contrast, assumes that Pandora’s reduction in revenues if it fails to reach an agreement with [

] is proportional to its loss of performances.196
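For reference, a minimal sketch reproducing the shortfall arithmetic in footnotes 193 and 195 (the reported and expected figures are those given in the cited articles):

```python
# Spotify: reported vs. expected monthly active users and Premium subscribers (millions), per footnote 193.
spotify_mau_miss = 232 / 242 - 1        # about -4.1 percent
spotify_premium_miss = 108 / 108.5 - 1  # about -0.5 percent

# Twitter: reported vs. forecast monthly active users (millions), per footnote 195.
twitter_mau_miss = 320 / 324 - 1        # about -1.2 percent

print(f"Spotify MAU miss:      {spotify_mau_miss:.1%}")
print(f"Spotify Premium miss:  {spotify_premium_miss:.1%}")
print(f"Twitter MAU shortfall: {twitter_mau_miss:.1%}")
```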

D. Professor Shapiro improperly applies the results of the Label Suppression Experiments to estimate a reasonable royalty for subscription webcasters

Further compounding these errors, Professor Shapiro’s opportunity cost analysis for subscription webcasters uses “as a proxy the same percentage loss of listening hours for a

subscription webcasting service that was found in the Label Suppression Experiments

conducted on Pandora’s advertising-supported service.”197 In other words, Professor Shapiro assumes that the effect of the label suppression treatment would be the same for subscribers as for users of the free ad-supported Pandora service. This ignores differences between users of ad-supported services and subscription services that could influence the resulting effect of the

193 Carmen Reinicke, Spotify slips after not adding as many paid subscribers as hoped (SPOT), Markets Insider (July 31, 2019), https://markets.businessinsider.com/news/stocks/spotify-earnings-2q-stock-price-reaction-disappointing-subscriber-growth-2019-7-1028403687 (calculated as 232 / 242 – 1 = -4.1 percent and 108 / 108.5 – 1 = -0.5 percent). 194 Neha Malara & Supantha Mukherjee, Spotify shares surge after surprise profit, rise in paid users, Reuters (Oct. 28, 2019), https://www.reuters.com/article/us-spotify-tech-results/spotify-shares-surge-after-surprise-profit-rise-in-paid-users-idUSKBN1X70X9. 195 Alexei Oreskovic, Twitter shares crash as user growth stalls, Business Insider (Oct. 27, 2015), https://www.businessinsider.com/twitter-earnings-q3-2015-2015-10 (calculated as 320 / 324 – 1 = -1.23 percent). 196 Shapiro Corrected WDT, at Appendix F, at 2-3. 197 Shapiro Corrected WDT, at 27. Professor Shapiro assumes that the percentage loss of subscribers is equal to the percentage loss of listening hours measured by the Label Suppression Experiments. Shapiro Corrected WDT, at 30.


label suppression treatment on listening hours. For example, Dr. Reiley testified that [

]198 Dr. Reiley also noted that [

].199

Indeed, Professor Shapiro acknowledges that “a blackout by a given record company could in principle have a different impact on listener hours for a subscription webcasting service than

for an advertising-supported webcasting service.”200 [

].201 Similarly, as they are paying for the service rather than receiving it for free, subscribers may be more sensitive to changes in Pandora’s music catalog and may be more likely to switch away from

Pandora altogether rather than reducing their listening hours if they become dissatisfied with

the service.202 This is especially the case because there are many interactive services offering a full catalog of music and a full range of functionality, and Pandora is especially vulnerable to losing customers to those services.

198 SoundExchange Ex. 231 (Deposition of David Reiley), at 41:9-19 ([

]). 199 SoundExchange Ex. 231 (Deposition of David Reiley), at 41:20-42:7 ([

]). 200 Shapiro Corrected WDT, at 27. 201 SoundExchange Ex. 210, Pandora September Engagement Update (Oct. 2019), PANWEBV_00005093, at 00005096. 202 Research has shown nonlinearities in behavior with respect to zero prices. Kristina Shampanier, Nina Mazar, Dan Ariely, Zero as a Special Price: The True Value of Free Products, 26 Marketing Science 742-757 (Nov.-Dec. 2007).

E. Ultimately, Professor Shapiro’s application of the results of the Label Suppression Experiments suffers from compounding errors

Ultimately, the reliability of Professor Shapiro’s estimates is compromised by compounding errors. As I discuss, there are large flaws with Dr. Reiley’s Label Suppression Experiments and analysis, which together mean that Dr. Reiley’s estimates do not provide insight into the true effect of the loss of content from [ ] on Pandora’s business. Professor Shapiro acknowledges that listeners “were presumably not aware of the blackout,

and they might react more strongly if they were aware” and notes that “the experiments measure the impact of the blackout for only three months, but the impact over a longer period

of time could well be larger.”203 These flaws all consistently suggest that Dr. Reiley is underestimating the true effect. Professor Shapiro’s arbitrary adjustments, however, do not correct these flaws. Therefore, his analysis does not reflect the true effect of a blackout, Pandora’s willingness to pay, or the record company’s opportunity cost.

Further, as discussed above, not only would the real-life suppression of [ ] decrease listening hours and active listeners on Pandora’s ad-supported service, it also likely would have a substantial effect on the lifetime value of each customer, and would threaten the viability of Pandora as a service more severely than is suggested by just focusing on the effect of reduced listener hours. Professor Shapiro’s analysis does not capture these effects.

V. The Submission by the National Association of Broadcasters (“NAB”) Misses the Importance of Simulcasting to Its Member Broadcasters

NAB witnesses suggest that simulcasting is a small and unprofitable aspect of their business. For example, Dr. Leonard states that “simulcast is an ‘add-on’ service that would not exist without the terrestrial radio broadcast. Indeed, the terrestrial broadcast (and the revenues

203 Shapiro Corrected WDT, at 19.

derived therefrom)—not the simulcast—are the primary driver of the radio station’s

business.”204

These arguments, however, ignore the role that simulcasting plays in the context of the broadcaster’s overall business. If these arguments were right—i.e., that simulcasting should be viewed as an independent and unprofitable line of business—broadcasters would not simulcast. Because broadcasters continue to simulcast, we know that simulcasting plays a more

complicated role in their overall businesses. As described in my initial report, industry participants are increasingly using digital music services as part of a wider economic

ecosystem.205 As such, digital music offerings can aid and complement other aspects of a broadcaster’s business, providing benefits beyond the direct profits generated from the service itself. For example, iHeartMedia has emphasized how its multi-platform strategy positions the

company to capture additional advertising revenue from digital advertising sectors.206 Similarly, industry analysts have noted that iHeartMedia’s digital platforms improve the

company’s ability to sell broadcast inventory.207

As discussed in my Written Direct Testimony, iHeartMedia has noted the importance of digital to its business in recent earnings releases. iHeartMedia continued to highlight the importance of its digital segment in its November 2019 earnings call discussing its Q3 2019 results. iHeartMedia’s financial results for the quarter suggest continued improvement, and

iHeartMedia executives emphasized that “[d]igital had another strong quarter”208 with digital

revenue up 33.4 percent from Q3 2018.209 [

204 Leonard WDT, at 20. 205 Tucker WDT, at 34-37. 206 Tucker WDT, at 80. 207 Tucker WDT, at 80. 208 iHeartMedia Inc. Q3 2019 Earnings Call Transcript, Nov. 7, 2019, at 3. 209 iHeartMedia, Inc. Reports Results for 2019 Third Quarter, BusinessWire (Nov. 7, 2019), https://www.businesswire.com/news/home/20191107005341/en/iHeartMedia-Reports-Results-2019-Quarter. See also Tucker WRT, Appendix 8.


].211

In addition to helping to retain listeners in the face of emerging digital technologies, simulcasts also affect the advertising side of the terrestrial radio business. [

].213

Consistent with this evidence that simulcasts enhance and complement a radio station’s core business, Mr. Leonard Wheeler, President and Owner of NAB member company Mel Wheeler,

210 SoundExchange Ex. 380, [ ], NAB00002609 ([ ]); SoundExchange Ex. 382, [ ], NAB00002659 ([ ]). 211 SoundExchange Ex. 398, [ ], PANWEBV_00007062, at 00007071. 212 SoundExchange Ex. 321, [ ], NAB00004542, at tab “Digital Rates.” See also, SoundExchange Ex. 389, [ ], NAB00004537. In fact, [ ]. See SoundExchange Ex. 391, NAB00006441, at tab “Summary;” SoundExchange Ex. 375, Declaration of Collin R. Jones, Jan. 7, 2020, at 2-3. 213 SoundExchange Ex. 386, [ ], NAB00004036, at 00004041.


Inc. (“Wheeler”), testified to the strategic importance of offering its radio content digitally.214 As described by Mr. Wheeler, Wheeler “feared that, without at least establishing some streaming presence, [it] would eventually lose listeners as they increasingly sought to listen to

content digitally.”215 Mr. Wheeler indicated that “Wheeler has no choice but to make our stations available digitally, to guard against the possibility that our traditional radio audience begins to tune-in, not from traditional AM/FM radios, but rather from their desktop computers,

cell phones and smart speakers.”216 At his deposition, Mr. Wheeler testified that [

]217 He testified that [

]218 Mr. Wheeler further testified that he simulcasts [

]219 This testimony is consistent with evidence that, even while it may not be a broadcaster’s core business, there is a business

214 Written Direct Testimony of Leonard Wheeler, Sept. 19, 2019 (“Wheeler WDT”). 215 Wheeler WDT, at 9. 216 Wheeler WDT, at 9. 217 Deposition of Leonard Wheeler, Dec. 4, 2019 (“SoundExchange Ex. 230 (Wheeler Deposition)”), at 28:1-7 ([

]), 98:9-12 ([ ]). 218 SoundExchange Ex. 230, (Wheeler Deposition), at 54:22-25. 219 SoundExchange Ex. 230, (Wheeler Deposition), at 56:5-8; see also at 170:5-9 (acknowledging that [

]). Despite this business need, Mr. Wheeler claimed that he was hesitant to embrace simulcasting due to “the exorbitant SoundExchange royalties that we must pay to simulcast.” Wheeler WDT, at 8. To make this point, Mr. Wheeler stated at his deposition that [ ]. SoundExchange Ex. 230 (Wheeler Deposition), at 158:4-24; see also SoundExchange Ex. 230 (Wheeler Deposition), at 26-27, 168, 170, 192, 202, 210. [ ]. SoundExchange Ex. 390, NAB00005547; SoundExchange Ex. 431, [ ]. [

].


need for simulcasting and webcasting services, and these services provide benefits for the broadcaster beyond current direct profit generation.

Mr. Wheeler’s testimony regarding the importance of simulcasting reflects the general trend towards digital that I explained in my initial written direct testimony. This trend is driven by the rise of mobile devices, smart speakers, and connected cars, which in turn reduces the share of listening that occurs in the traditional venues where people have listened to terrestrial radio. Mr. Wheeler’s testimony

confirms that this concern motivates broadcasters, as he believes simulcasting listenership [ ] due to [

]220 The data backs up this belief. Results of a 2018 Jacobs Media Tech Survey show that in 2013, [ ] of radio listening was digital while [ ] was on terrestrial broadcasts, compared to [ ] and [

] in 2018.221 The same 2018 survey also shows that [ ] of smart speaker owners

frequently use their smart speaker to listen to music on AM/FM radio.222 [

]223

Documents produced by NAB in discovery confirm that many listeners are shifting their radio listening to smart speakers and connected cars and thereby are switching from over-the-air broadcasts to simulcasts. [

220 SoundExchange Ex. 230, (Wheeler Deposition), at 171:15-18 ([

]). 221 SoundExchange Ex. 254, Tech Survey 2018 Jacobs Media: Radio Navigates the Digital Revolution (2018), NAB00002238, at 00002250. 222 SoundExchange Ex. 254, Tech Survey 2018 Jacobs Media: Radio Navigates the Digital Revolution (2018), NAB00002238, at 00002266. 223 SoundExchange Ex. 378, The Infinite Dial: 2018, NAB00002166, at 00002178 ([ ]; see also SoundExchange Ex. 61, MusicWatch Annual Music Study 2018: Report to Pandora Media (Apr. 2019), PANWEBV_00004139, at 00004214.


]224 To the radio industry, a smart speaker replacing a bedside or kitchen radio would threaten broadcast stations lacking a simulcast that could be accessed on that speaker. Those broadcasters would face increased competition from streaming services and other forms of music that could be accessed on the same device. According to a 2017 study by Edison Research, [

].225 Similarly, Infinite Dial research shows that [

]226

Internal documents produced in this proceeding from iHeartMedia and Pandora clearly show

that radio businesses view streaming services as competitors, and vice versa.227 [

224 SoundExchange Ex. 377, Deloitte Insights: Technology, Media, and Telecommunications Predictions (2019), NAB00001993, at 30-31; at 31 ([ ]); at 68 ([

]). 225 SoundExchange Ex. 387, BIA Advisory Services: Market Assessment and Opportunities for Local Radio: 2018-2022 (Apr. 2018), NAB00004126, at 00004144. 226 SoundExchange Ex. 387, BIA Advisory Services: Market Assessment and Opportunities for Local Radio: 2018-2022 (Apr. 2018), NAB00004126, at 00004141; at 00004140 ([

]). See also SoundExchange Ex. 403, [ ], SXMWEBV_00005224, at 00005224 ([ ]). 227 See, e.g., SoundExchange Ex. 385, [ ], NAB00002858, at 00002861 ([ ]); SoundExchange Ex. 397, Tech Survey 2019 Jacobs Media: Radio’s Survival Kit (2019), PANWEBV_00006670, at 00006680 ([ ]).


].231

Pandora expert Dr. Waldfogel has generally recognized the importance of digitization in the

music industry, and has written extensively about the digital renaissance in music.232 His own writings suggest that this digital renaissance affects the role and popularity of terrestrial radio as online platforms grow. For example, Dr. Waldfogel has observed that “the past decade has

seen the emergence and growth in alternative institutions, including Internet radio… New

information channels are changing the pathways to commercial success.”233 In addition, Dr.

228 SoundExchange Ex. 388, [          ] NAB00004413, at 00004473. 229 SoundExchange Ex. 381, [          ], NAB00002613, at 00002629, 00002631. See also SoundExchange Ex. 385, [          ], NAB00002858, at 00002882-83, 00002887-93, 00002912, 00002914-15, 00002917-18 ([          ]). 230 SoundExchange Ex. 396, [          ], PANWEBV_00006244, at 00006253-55. 231 SoundExchange Ex. 396, [          ], PANWEBV_00006244, at 00006280. [          ]. SoundExchange Ex. 396, [          ], PANWEBV_00006244, at 00006264. 232 Joel Waldfogel, Digital Renaissance: What Data and Economics Tell Us about the Future of Popular Culture (Princeton University Press, 2018). In his written testimony, Dr. Waldfogel highlights the “dramatic gains” in recorded music revenues in the U.S. between 2014 and 2018. Written Direct Testimony of Joel Waldfogel, Sept. 23, 2019, at 5-7. However, Dr. Waldfogel ignores the 15-year decline in the music industry prior to this period and the fact that, despite the recent growth, U.S. recorded music revenues are still substantially lower than the industry’s peak revenues in 1999. Tucker WDT, at Appendix 1. 233 Joel Waldfogel, “Digitization and Quality of New Media Products: The Case of Music” in Economic Analysis of the Digital Economy, at 410 (Avi Goldfarb, Shane Greenstein, and Catherine Tucker eds., University of Chicago Press, Apr. 2015).


Waldfogel’s research has found that “a declining share of successful artists have traditional [radio] airplay, while a growing share are covered by online radio and critics,” highlighting the

waning influence of traditional radio.234

Further, Dr. Waldfogel’s published work emphasizes that this digital renaissance in music is

due to the emergence of interactive webcasting services.235 In his report, he also discusses the rise of digital streaming services, but fails to distinguish the role of interactive services from that of noninteractive services in leading this renaissance.236

VI. The Religious Broadcasters’ Arguments for Why They Should Pay Less Are Not Based on Economics

NRBNMLC’s written direct statement focuses on religious broadcasters in general, and on

Family Stations, Inc. (“Family Radio”) specifically.237 However, neither of those appears representative of the larger population of noncommercial webcasters for which the Judges must

set rates in this proceeding. As I described in my written direct testimony, the vast majority of noncommercial webcasters (96 percent at the parent company level) pay only the annual minimum fee, and for 2018, only 20 noncommercial webcasters paid some amount of statutory

royalties on usage in excess of 159,140 aggregate tuning hours (“ATH”) per month.238 The

latter group includes religious broadcasters with significant financial resources.239

234 Joel Waldfogel, “Digitization and Quality of New Media Products: The Case of Music” in Economic Analysis of the Digital Economy, at 411 (Avi Goldfarb, Shane Greenstein, and Catherine Tucker eds., University of Chicago Press, Apr. 2015). 235 Joel Waldfogel, Digital Renaissance: What Data and Economics Tell Us about the Future of Popular Culture 203-04 (Princeton University Press, 2018). 236 Waldfogel WDT, at 5-7, 11-12. 237 See, e.g., Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019. 238 Tucker WDT, at 83-84, Appendix 18. 239 Tucker WDT, at Appendix 16.


In their written direct testimonies on behalf of the NRBNMLC, Dr. Steinberg and Dr. Cordes argue that noncommercial webcasters should pay lower statutory rates than commercial webcasters. Dr. Steinberg argues that statutory rates for noncommercial webcasters should be substantially lower because, among other reasons, noncommercial webcasters rely on donations and therefore are often “struggl[ing] to survive” and cannot afford large and

“unpredictable” royalty payment obligations.240 In addition, Dr. Steinberg and Dr. Cordes also

point to evidence that “[f]or-profit firms are often willing to sell their products and services to nonprofit organizations at a substantial discount” as an indication that statutory royalty

obligations should be considerably lower for noncommercial webcasters.241

These arguments, however, ignore evidence that the average per-performance rate paid by noncommercial webcasters is already lower than the rates paid by commercial webcasters, [ ]. In addition, Dr. Cordes and Dr. Steinberg fail to recognize that, far from being large and unpredictable, statutory royalty

payments comprise a small portion of operating costs for many noncommercial webcasters. Dr. Steinberg and Dr. Cordes also ignore evidence that, while small non-profit organizations are given discounts when purchasing other goods and services, those discounts may not be extended proportionally to larger organizations, much like the webcasters that pay excess royalties at the commercial rate. Dr. Steinberg’s argument that per-performance excess royalty payments are too unpredictable to be financed by donations is not grounded in data or economics, and is inconsistent with evidence that excess royalties are both predictable and

controllable. Finally, Dr. Steinberg amended his written direct testimony to add a discussion of SoundExchange’s settlement agreements with College Broadcasters, Inc. (“CBI”) and National Public Radio (“NPR”). Neither provides support for NRBNMLC’s rate proposal.

240 Steinberg Amended WDT, at 8-9, 28; see also Cordes Corrected WDT, at 10. 241 Steinberg Amended WDT, at 20-21; see also Cordes Corrected WDT, at 10-11.

A. Family Radio is not representative of noncommercial webcasters

In the introductory memo to its written direct statement, NRBNMLC pointed to Family Radio as an example of a large, nonprofit webcaster for which “the increase in streaming rates and current onerous license reporting requirements have substantially harmed its ability to reach as

many listeners as it has in the past.”242 Ms. Burkhiser, Family Radio’s Director of Broadcast Regulatory Compliance and Issue Programming, described Family Radio’s recent financial struggles, noting that “Family Radio has experienced severe financial hardship in recent

years—made worse by increased streaming rates—that has forced it to make difficult decisions

to enable it to continue to offer its radio ministry to its listeners.”243 However, Family Radio is not representative of the broader population of noncommercial webcasters, or even other large religious broadcasters. NRBNMLC’s emphasis on Family Radio does not provide a valid basis to draw economic conclusions about noncommercial webcasters in general. In addition, Family Radio’s recent financial struggles appear largely related to unique, strategic programming

decisions, and unrelated to streaming rates.

First, as a large noncommercial religious broadcaster, Family Radio is not representative of noncommercial webcasters in general. In 2018, there were over 900 noncommercial

webcasters (at the statement of account level)244 and close to 500 at the parent company level.245 This is a diverse group of services, including many that do not offer religious programming or

are not FCC-licensed broadcasters.246 These include music-only services and services with an

express purpose of supporting artists.247 Any inferences about an appropriate statutory royalty

242 Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019, at 5-6. 243 Written Direct Testimony of Jennifer D. Burkhiser, Sept. 23, 2019 (“Burkhiser WDT”), at 13. 244 Written Direct Testimony of Jonathan Bender, Sept. 20, 2019 (“Bender WDT”), at 14. 245 Based on SoundExchange-provided data in SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx, at tab “Parent Level Summary.” 246 Written Rebuttal Testimony of Travis Ploeger, Jan. 10, 2020 (“Ploeger WRT”), ¶ 44. 247 Ploeger WRT, at ¶¶ 44-45.


determined by examining the business and finances of Family Radio, a large religious broadcaster, would not necessarily generalize to noncommercial webcasters more broadly.

Second, Family Radio is not representative of other large noncommercial religious broadcasters. Among other things, I understand that Educational Media Foundation (“EMF”) is a participant in this proceeding in its own right and [

].248 [

].251 As I showed in my Written Direct Testimony, the largest noncommercial broadcasters in terms of excess royalties owed in 2018 are well-resourced organizations with millions of

dollars in revenues.252 Since submitting my Written Direct Testimony, I have seen Family Radio’s IRS Form 990 for 2018 and have used this information to update my previous

calculations, as summarized in Rebuttal Appendices 3 and 4. As shown in Rebuttal Appendix

3, Family Radio operated at a loss in 2018.253 In contrast, other large noncommercial religious broadcasters that webcast generated millions of dollars more in revenue than they incurred in

expenses in 2018.254 For example, in 2018, EMF generated a surplus of [ ].255

248 SoundExchange Ex. 394, [ ], NRBNMLC_WEBV_00000823. 249 Calculated as [ ]. SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx, at tab “Pivot.” 250 Calculated as [ ]. SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx, at tab “Pivot;” Bender WDT, at 14. 251 Calculated as [ ]. SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx, at tab “Pivot;” Bender WDT, at 14. 252 Tucker WDT, at Appendix 16. 253 Tucker WRT, Appendix 3. See also, Family Stations, Inc. Form 990 for the year ended December 31, 2018, at 1, 12. 254 Tucker WRT, Appendix 3. 255 Tucker WRT, Appendix 3.


Third, Family Radio’s current situation is a result of unique circumstances that have been well documented in the popular media and include failed doomsday predictions accompanied by expensive advertising campaigns, as well as programming antagonistic to the organized church. In connection with predictions that the world would end on May 21, 2011 or October 21, 2011 (after a previous prediction that the world would end on September 6, 1994), Family Radio engaged in a national advertising campaign possibly costing as much as $100 million,

including advertisements on thousands of billboards, while donations fell sharply.256 These unique circumstances also make Family Radio a poor example for drawing any conclusions about noncommercial webcasters in general.

Fourth, even with Family Radio’s unique circumstances, as I explained in my Written Direct

Testimony, statutory royalties do not appear material to the finances of Family Radio.257 Family Radio’s IRS Form 990 for 2018 shows that Family Radio generated overall revenues of $5,422,789 in 2018 and provides a webcasting service with an average of 150,000 to 200,000

unique monthly streamers.258 In 2018, Family Radio paid only [ ] in statutory

royalties.259 This compares to $5,422,789 in revenues and $7,267,331 in program expenses, meaning that statutory royalties constituted only [ ] of its 2018 revenues and [

256 Rick Paulas, What Happened to Doomsday Prophet Harold Camping After the World Didn’t End?, Vice (Nov. 7, 2014), https://www.vice.com/en_us/article/yvqkwb/life-after-doomsday-456; Bob Smietana, Christian radio group faces financial hard times, U.S.A. Today (May 14, 2013), https://www.usatoday.com/story/news/nation/2013/05/14/family-radio-finances-world-did-not-end/2159621/; End of the line for Christian radio network that predicted 2011 rapture, Denver Post (May 12, 2013), https://www.denverpost.com/2013/05/12/end-of-the-line-for-christian-radio-network-that-predicted-2011-rapture/; Nicola Menzie, Family Radio Founder Harold Camping Repents, Apologizes for False Teachings, Christian Post (Oct. 30, 2011), https://www.christianpost.com/news/family-radio-founder-harold-camping-repents-apologizes-for-false-teachings.html; An insider’s look at Family Radio and its leader Harold Camping, Mercury News (May 20, 2011), https://www.mercurynews.com/2011/05/20/an-insiders-look-at-family-radio-and-its-leader-harold-camping/. 257 Tucker WDT, at Appendix 16. 258 Family Stations, Inc. Form 990 for the year ended December 31, 2018, at 1-2, 12. See also, Tucker WRT, Appendix 3. 259 Tucker WRT, Appendix 3.


] of its 2018 program expenses.260 [

].261

As a result, the situation of Family Radio should not be generalized to the overall noncommercial webcaster population. In fact, neither Dr. Steinberg nor Dr. Cordes appears to have relied on Ms. Burkhiser’s testimony in formulating their opinions on behalf of

NRBNMLC.262

B. Statutory royalty payments make up a very small proportion of noncommercial webcasters’ costs

Dr. Steinberg suggests that noncommercial webcasters cannot afford to pay more than they currently do because of their non-profit status. He attributes this to a free-rider problem, noting, “[a]nyone can consume the results of total donations (religious broadcasting and webcasting) whether they have personally contributed or not, so that there is a natural tendency to let others

donate while taking a free ride on the output.”263 Dr. Steinberg asserts that, as a result of this

problem, “with rare exceptions, donative nonprofits are bare-bones operations that often

struggle to survive.”264 He also notes that “[o]rganizational expenditures on mission consist of donations minus fees for the rights to webcast recordings (and other expenses, of course), so that donors would have to give more to accomplish the same outcome when royalty fees go

up.”265

However, as shown in my Written Direct Testimony, the vast majority of noncommercial

webcasters pay only the minimum fee per station per year.266 Of the noncommercial webcasters

260 Tucker WRT, Appendix 3. 261 Family Stations, Inc. Form 990 for the year ended December 31, 2018, at 10. 262 Steinberg Amended WDT, at 34-35 (listing works consulted, but omitting Ms. Burkhiser’s testimony); Cordes Corrected WDT, at Appendix B (same). 263 Steinberg Amended WDT, at 9. 264 Steinberg Amended WDT, at 9. 265 Steinberg Amended WDT, at 12. 266 Tucker WDT, at 83-84 and Appendix 18.


that pay royalties on usage in excess of the 159,140 ATH per month threshold, many generate substantial revenues and pay a relatively small amount in statutory royalties as compared to

their revenues.267 In my Written Direct Testimony, I examined the five largest noncommercial webcasters, which account for the vast majority of excess royalties paid and include Family Radio. These are not “bare-bones operations.” As shown in my written direct testimony, statutory royalties accounted for [ ] percent to [ ] percent of their total expenses, [ ]

percent to [ ] percent of their program expenses, and from [ ] percent to [ ] percent of

their revenue.268 The more recent financial information for Family Radio that is now available to me does not change this conclusion. Its 2018 Form 990 shows 2018 total expenses of

$9,776,918, program expenses of $7,267,331, and revenue of $5,422,789.269 As shown in Rebuttal Appendix 3, its 2018 statutory royalty payment of [ ] constituted [ ] percent of its total expenses, [ ] percent of its program expenses, and [ ] percent of its revenue,

falling within the ranges established by other top noncommercial webcasters.270
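A minimal sketch of the Rebuttal Appendix 3 ratio arithmetic (the 2018 Form 990 figures below are as stated above; the royalty input is a hypothetical placeholder, because Family Radio's actual 2018 payment is redacted in this public version):

```python
# Family Radio 2018 Form 990 figures, as stated in the text above (dollars).
total_expenses = 9_776_918
program_expenses = 7_267_331
revenue = 5_422_789

# Hypothetical placeholder; the actual 2018 statutory royalty payment is redacted.
statutory_royalties = 50_000

print(f"share of total expenses:   {statutory_royalties / total_expenses:.2%}")
print(f"share of program expenses: {statutory_royalties / program_expenses:.2%}")
print(f"share of revenue:          {statutory_royalties / revenue:.2%}")
```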

Furthermore, these firms, which would be most affected by a change in royalties, appear well positioned to pay increased statutory royalties. Based on the data, SoundExchange’s proposal to increase minimum fees to $1,000 per station or channel and excess fees to $0.0028 per performance would raise the royalty burden among the five largest noncommercial royalty

payers to at most [ ] percent of total expenses, [ ] percent of programming expenses, and

[ ] percent of revenues.271 This remains true with the more recent financial information for

267 Tucker WDT, at 84-85 and Appendix 16. 268 Tucker WDT, at 84-85 and Appendix 16. Excluding one-time revenues from, among other things, the sale of station licenses, property and equipment, Family Radio generated $5,523,080 in revenues in 2017. Its 2018 statutory royalty payment of [ ] amounts to approximately [ ] percent of these revenues, falling within the ranges established by other top noncommercial webcasters. Family Stations, Inc. (A California Not-For Profit Corporation) and its Affiliates, Consolidated Financial Statements, December 31, 2017 and December 31, 2016, at 5; Tucker WDT, at Appendix 16. 269 Tucker WRT, Appendix 3; see also, Family Stations, Inc. Form 990 for the year ended December 31, 2018, at 1, 10. 270 Tucker WRT, Appendix 3. 271 Tucker WRT, Appendix 4.


Family Radio that is now available to me. As shown in Rebuttal Appendix 4, Family Radio’s royalty payment for 2018 usage at SoundExchange’s proposed 2021 rates would have been [ ], which would have accounted for [ ] percent of its total expenses, [ ] percent of

its program expenses and [ ] percent of its revenue in 2018.272 This comparison undermines Dr. Steinberg’s conclusion that statutory royalties force noncommercial webcasters to divert a meaningful amount of donations away from other mission-related expenses.

C. Though small non-profits are given discounts in some cases, those discounts may not be extended proportionally to larger non-profits

Dr. Steinberg and Dr. Cordes both provide examples of “for-profit firms [that] offer lower prices to nonprofits in the form of discounts” as evidence that noncommercial webcasters

should receive lower rates than commercial webcasters.273 However, they do not mention that these discounts may be targeted to smaller non-profit organizations or may not be proportionally extended to large non-profits.

For example, Microsoft offers discounts for nonprofit organizations that vary with the size of the organization, with larger discounts for smaller organizations. Microsoft offers its Office 365 Business Premium product for $3.00 per user per month for “small & mid-sized nonprofits” and its Office 365 Nonprofit E3 product for $4.50 per user per month for “large

nonprofits.”274 Microsoft also offers $3,500 credits per year for its Azure cloud services regardless of nonprofit size, representing a larger proportional discount for smaller nonprofits

than larger nonprofits.275 Similarly, Google offers “$10,000 USD of in-kind advertising every

272 Tucker WRT, Appendix 4. 273 Cordes Corrected WDT, at 10-11; Steinberg Amended WDT, at 20. 274 Compare Office 365 Nonprofit plans: Large Nonprofits, Microsoft, https://www.microsoft.com/en-us/microsoft-365/nonprofit/office-365-nonprofit-plans-and-pricing?&activetab=tab:primaryr2 (last visited Jan. 10, 2020); Compare Office 365 Nonprofit plans: Small and Mid-sized Nonprofits, Microsoft, https://www.microsoft.com/en-us/microsoft-365/nonprofit/office-365-nonprofit-plans-and-pricing?&activetab=tab:primaryr1 (last visited Jan. 10, 2020). 275 Microsoft Nonprofits, Microsoft, https://www.microsoft.com/en-us/nonprofits/azure (last visited Jan. 10, 2020).


month for text ads” through its Google AdGrants program, representing a larger proportional

discount for nonprofit organizations with smaller expenditures.276 As another example, Slack offers nonprofit organizations with 250 or fewer members a free upgrade to its Standard Plan,

while organizations larger than that receive a smaller discount.277

Further, as I discuss in Section VI.D, all noncommercial webcasters already receive a discounted rate under the existing statutory royalty rate system and, consistent with some of

the examples I have discussed, smaller noncommercial webcasters receive greater average effective discounts than larger noncommercial webcasters.

D. Noncommercial webcasters already receive a discounted rate

The argument that noncommercial webcasters should get a discount based on their non-profit status ignores the fact that they effectively are receiving a discount relative to commercial webcasters due to the structure of statutory royalty payments for noncommercial webcasters.

Currently, noncommercial webcasters receive substantial discounts under the existing royalty structure. Dr. Steinberg and Dr. Cordes do not provide any reason to think those discounts are too low.

As described in my Written Direct Testimony, noncommercial webcasters are governed by a two-part royalty schedule made up of: (1) a $500 minimum fee per station per year, and (2) a per-performance fee of $0.0017 in 2016, subject to CPI adjustments for 2017 to 2020, for

transmissions in excess of 159,140 ATH per month.278 Currently the per-performance rate is

276 Google for Nonprofits: Reach more donors online with Google Ad Grants, Google, https://www.google.com/nonprofits/offerings/google-ad-grants/ (last visited Jan. 10, 2020). 277 “Slack for Nonprofits,” Slack (2019), https://slack.com/help/articles/204368833-Slack-for-Nonprofits. Other examples include Aplos Accounting, which offers additional discounts for organizations with less than $50,000 in annual revenues, and Salesforce, which offers 10 free subscriptions to its Lightning Enterprise Edition regardless of organization size. Aplos Pricing: Try Aplos for Free, Aplos Software (2019), https://www.aplos.com/pricing; Get 10 Donated Subscriptions of the World’s #1 Cloud Engagement Application, Salesforce (2019), https://www.salesforce.org/nonprofit_product/nonprofit-editions-pricing/. 278 In re Determination of Royalty Rates and Terms for Ephemeral Recording and Webcasting Digital Performance of Sound Recordings (Web IV); Final Rule, 81 Fed. Reg. 26316, 26316 (May 2, 2016), https://www.govinfo.gov/content/pkg/FR-2016-05-02/pdf/2016-09707.pdf (“Web IV Determination”).


$0.0018.279 As described by the Copyright Royalty Judges in the Web IV determination, this fee schedule “results in noncommercial webcasters paying a lower average per-play rate than

a commercial webcaster (that pays at the commercial rate for every performance).”280

Most noncommercial webcasters owe only the $500 minimum fee per channel and do not pay excess royalty fees. The average per-performance rate for noncommercial webcasters that use fewer than 159,140 ATH per station per month and therefore only owe minimum fees is, in

most cases, substantially lower than the average per-performance rate for non-subscription

commercial webcasters with equivalent usage.281 For a music-focused noncommercial webcaster close to the threshold of 159,140 ATH per month, the average per-performance rate

is approximately $0.000022,282 or approximately 1.2 percent of the statutory royalty rate for

commercial webcasters.283 In other words, under the existing statutory royalty rate system, music-focused noncommercial webcasters close to the monthly threshold are already receiving a discount of roughly 99 percent off the commercial per-performance rate.
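A minimal sketch of the arithmetic in footnotes 282 and 283, under the same assumption of 12 recordings per hour:

```python
# Current statutory parameters for noncommercial webcasters.
annual_minimum_fee = 500.00       # dollars per station or channel per year
monthly_ath_threshold = 159_140   # aggregate tuning hours per month before excess fees apply
commercial_rate = 0.0018          # current per-performance rate (dollars)
recordings_per_hour = 12          # assumed conversion factor (see Ploeger WRT)

# A music-focused noncommercial webcaster right at the monthly threshold.
monthly_performances = monthly_ath_threshold * recordings_per_hour  # 1,909,680
monthly_fee = annual_minimum_fee / 12                               # about $41.67

effective_rate = monthly_fee / monthly_performances                 # about $0.000022 per performance
discount = 1 - effective_rate / commercial_rate                     # roughly 99 percent

print(f"effective per-performance rate: ${effective_rate:.6f}")
print(f"discount vs. commercial rate:   {discount:.1%}")
```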

Only 4.2 percent of all noncommercial webcasters pay any amount of excess royalties.284 As summarized in Appendix 17 to my Written Direct Testimony, the webcasters that account for the majority of excess royalty payments are relatively large non-profits with millions of dollars

in revenues.285 Noncommercial webcasters that exceed the 159,140 ATH threshold and owe

279 37 C.F.R. § 380.10(a)(2). 280 Web IV Determination, at 26392 n.208. 281 The only exception is webcasters with very low usage such that the $500 minimum fee per channel is spread over relatively few performances. 282 To estimate this, I assume a conversion factor of 12 recordings per hour. See Ploeger WRT, at ¶¶ 40-42. This implies that 159,140 ATH translates to roughly 1.9 million performances (calculated as 159,140 hours × 12 recordings/hour = 1,909,680). The annual minimum fee of $500 equates to $41.67 per month (the price of 159,140 ATH), implying an effective rate of $0.000022 per performance (calculated as $41.67 / 1,909,680 performances = $0.000022). 283 $0.000022 / $0.0018 = 1.21%. A noncommercial webcaster that transmits fewer than 159,140 ATH per month, and therefore only pays the $500 annual minimum fee, would have to use less than 1.21 percent of the available ATH for its effective per-performance rate to exceed the commercial rate of $0.0018. 284 Tucker WDT, at Appendix 18. 285 Tucker WDT, at Appendix 17.

per-performance royalties on their excess usage still receive an overall discount relative to the commercial rate because of the steep discount they receive on the first 159,140 ATH of usage per month. As the usage goes up, the average per-performance discount declines, but never reaches zero.

I understand that SoundExchange’s rate proposal continues the existing payment structure, with increased minimum annual fees of $1,000 per station or channel and excess fees of

$0.0028 per performance. Because this proposal follows a similar payment structure where noncommercial webcasters receive a steep discount on the first 159,140 ATH of usage per month, noncommercial webcasters would continue to pay average per-performance rates lower than the commercial rate under SoundExchange’s proposed royalty rate increase.

E. The argument that noncommercial webcasters should not pay excess royalties because they are too unpredictable to finance with donations is not based on data or economics

Dr. Steinberg argues that the per-performance royalty payments for usage in excess of 159,140 ATH per month under the current system should be “replaced by tiered flat fees or tiered and capped flat fees” on the basis that “predictable payment obligations are important to [noncommercial webcasters] because they can finance them through regular on-air fundraising

drives with accurate campaign goals.”286 The costs associated with excess royalties under the statutory license are both predictable and controllable, however. Excess royalties are simply a function of the number of performances transmitted, which are easily tracked over time.

Noncommercial webcasters can monitor their current listenership and spending on excess royalties, presumably predict future usage based on annual trends and seasonal variations in

286 Steinberg Amended WDT, at 28.


previous years, and influence spending levels by choosing to play more or less music, playing

longer recordings,287 and managing access to streams.288

F. The CBI and NPR settlements do not provide support for NRBNMLC’s rate proposal

Dr. Steinberg amended his written direct testimony to add a discussion of SoundExchange’s

settlement agreements with CBI and NPR.289 However, neither of those agreements provides

support for NRBNMLC’s rate proposal in this proceeding.

I understand that the NRBNMLC has proposed “that noncommercial broadcasters pay: (a) a flat, $500 annual fee for each channel or station streaming digital audio transmissions up to 1,909,680 ATH in a year (159,140 ATH multiplied by 12 months/year); and (b) an additional $500 annual fee for each channel or station streaming digital audio transmissions for each

additional 1,909,680 ATH in the same year.”290 That is, as I understand it, the NRBNMLC’s proposal would maintain the $500 annual fee for small noncommercial webcasters and charge

larger noncommercial webcasters in similar increments for additional usage. However, neither the CBI nor the NPR settlement agreement appears to support the NRBNMLC’s proposed fee structure or proposed royalty levels.
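A minimal sketch of how I read that proposed fee schedule (the function below is an illustration of the quoted per-block structure, not language from the proposal itself):

```python
import math

def nrbnmlc_proposed_annual_fee(annual_ath: float) -> float:
    """Illustrative reading of the NRBNMLC proposal: a flat $500 per channel or station
    for each 1,909,680-ATH block of annual usage, with a minimum of one block."""
    blocks = max(1, math.ceil(annual_ath / 1_909_680))
    return 500.0 * blocks

# Hypothetical annual usage levels (ATH per channel).
for ath in (1_000_000, 2_500_000, 6_000_000):
    print(f"{ath:>9,} ATH -> ${nrbnmlc_proposed_annual_fee(ath):,.0f}")
# 1,000,000 ATH -> $500; 2,500,000 ATH -> $1,000; 6,000,000 ATH -> $2,000
```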

287 Testimony of Steven Cutler, Executive Vice President, Business Development and Corporate Strategy, iHeartMedia, Inc., Oct. 7, 2014, in the matter of Determination of Royalty Rates and Terms for Ephemeral Recording and Digital Performance of Sound Recordings (Web IV), at ¶ 13. 288 Steinberg Amended WDT, at 14 (discussing potential for limiting access); Wheeler WDT, ¶¶ 23-25 (describing station group’s process of deciding which stations to webcast), 27-28 (describing promotion strategy for webcasts); Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019, at 2 (“the current ATH threshold is causing noncommercial broadcasters approaching this threshold to limit their streaming activities to avoid the obligation to pay usage fees”). Dr. Steinberg describes such management of access as a “harmful” problem, because it constrains a webcaster’s ability to pursue its mission. Steinberg Amended WDT, at 13. However, that is not an economic conclusion, and does not imply that the price of those goods and services should be lower than their fair market value. 289 Steinberg Amended WDT, at 16-21. 290 Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019, at 3.


As I understand it, under the relevant regulations, a station subject to the CBI settlement qualifies as a “Noncommercial Educational Webcaster” because, among other things, it “[i]s directly operated by, or is affiliated with and officially sanctioned by, and the digital audio transmission operations of which are staffed substantially by students enrolled at, a domestically accredited primary or secondary school, college, university or other post- secondary degree-granting educational institution,” and it “[t]akes affirmative steps not to

make total transmissions in excess of 159,140 Aggregate Tuning Hours on any individual channel or station in any month, if in any previous calendar year it has made total transmissions in excess of 159,140 Aggregate Tuning Hours on any individual channel or station in any

month.”291

The statutory rate for each Noncommercial Educational Webcaster is governed by 37 C.F.R. Part 380 Subpart C. I understand that, under 37 C.F.R. Part 380 Subpart C, for 2016 to 2020,

Each Noncommercial Educational Webcaster that did not exceed 159,140 total ATH for any individual channel or station for more than one calendar month in the immediately preceding calendar year and does not expect to make total transmissions in excess of 159,140 Aggregate Tuning Hours on any individual channel or station in any calendar month during the applicable calendar year shall pay an annual, nonrefundable minimum fee of $500 … for each of its individual channels, … for each calendar year.292

On September 23, 2019, SoundExchange and CBI announced that they reached a partial settlement in this proceeding concerning royalty rates and terms for eligible nonsubscription

transmissions made by Noncommercial Educational Webcasters over the internet during the

period 2021-2025.293 According to the Joint Motion to Adopt Partial Settlement between

291 37 C.F.R. § 380.21. 292 37 C.F.R. § 380.22(a). 293 NRBNMLC Ex. 20, Joint Motion to Adopt Partial Settlement between SoundExchange and CBI, Sept. 23, 2019, at 1-2.


SoundExchange and CBI, the settlement agreement with CBI generally adopted the terms

relevant for Noncommercial Educational Webcasters in Web IV.294 One exception is that the settlement specified that eligible Noncommercial Educational Webcasters would owe annual

minimum fees of $550 in 2021, increasing by $50 each year to $750 in 2025.295

While Dr. Steinberg suggests that “the CBI settlement rates are above the upper bound of a

reasonable rate for webcast rights” because CBI was motivated to avoid litigation costs,296 it is

not clear why CBI would be more motivated to avoid litigation costs than SoundExchange.297 Either way, Dr. Steinberg has not explained how this factor would account for the difference between the terms of the CBI settlement and those in NRBNMLC’s rate proposal.

As with SoundExchange’s agreement with CBI, the terms of SoundExchange’s settlement agreement with NPR do not provide support for NRBNMLC’s rate proposal. On September 23, 2019, SoundExchange, NPR, and the Corporation for Public Broadcasting (“CPB”) announced that they reached a partial settlement in this proceeding concerning royalty rates

and terms during the period 2021-2025.298 As I understand it, SoundExchange’s NPR and CPB settlement provides for a number of NPR affiliated public radio stations to collectively stream

up to 360 million “Music ATH” in 2021, growing to 400 million “Music ATH” in 2025.299 In

exchange, the agreement requires CPB300 to pay SoundExchange an annual lump sum payment

294 NRBNMLC Ex. 20, Joint Motion to Adopt Partial Settlement between SoundExchange and CBI, Sept. 23, 2019, at 2, 6. 295 NRBNMLC Ex. 20, Joint Motion to Adopt Partial Settlement between SoundExchange and CBI, Sept. 23, 2019, at 2, 7. 296 Steinberg Amended WDT, at 16. 297 I understand that SoundExchange incurs additional litigation costs for each party participating in the rate-setting proceedings. 298 NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019. 299 NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019, at 7-8. 300 CPB is a private, nonprofit entity that was founded by Congress and is funded by the federal government. Among other things, CPB provides funding for NPR. In particular, I understand that, historically, CPB has paid sound recording royalties for NPR. NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019, at 2.


of $800,000.301 Neither the fee structure nor the royalty levels in the NPR agreement appear to support NRBNMLC’s rate proposal in this proceeding. As I understand it, neither SoundExchange nor NRBNMLC has filed a rate proposal including an industry-wide lump sum payment as the statutory royalty rate for religious broadcasters or other noncommercial

webcasters.302

Further, Dr. Steinberg dismisses the idea that this lump-sum payment might reflect a discount attributable to the real advantage of protection from bad debt that comes with being paid in

advance.303 Dr. Steinberg’s rationale is that:

[S]tations named by CPB as participants in the NPR agreement have unique access to relatively stable funding through tax dollars allocated as grants by CPB. Indeed, qualification to receive funding from the CPB is a requirement for originating public radio stations to participate in the NPR settlement agreement. CPB support is substantial, with $69.31 million budgeted for direct grants to local public radio stations in FY 2018.304

301 NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019, at 7. According to data from SoundExchange, [ ]. SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx. 302 Dr. Steinberg proposes various adjustments to the NPR rate “[i]f the Judges determine that a lump sum similar to the NPR agreement is a reasonable fee structure for certain NCE webcasters (such as religious broadcasters),” but those adjustments, including scaling the NPR agreement lump sum payment based on a differential ATH cap, appear to assume a linear relationship between the lump sum amount and the ATH cap. Steinberg Amended WDT, at 17. Such a relationship is not obvious from the agreement, and may even be contradicted by the terms of the agreement. 303 Steinberg Amended WDT, at 19. 304 Steinberg Amended WDT, at 19 (internal citations omitted).


However, Dr. Steinberg fails to note that this allegedly stable source of government income is

only a small proportion of NPR revenues and is subject to congressional review.305 Like other

noncommercial broadcasters, NPR relies heavily on donations for its funding.306

SoundExchange’s settlement agreement with NPR “continues the structure of previous

settlements between the parties, while increasing the payment to be made by CPB.”307 I understand that those previous settlement agreements date back to 2001, and informed the

Judges’ determination of rates for other noncommercial webcasters in Web II.308 However, even then, the Judges found in their written determination that the original NPR agreement did “not provide clear evidence of a per station rate that could be viewed as a proxy for one

that a willing buyer and a willing seller would negotiate today.”309 Dr. Steinberg does not provide any reason to believe that the new NPR agreement is more informative.

305 Public Radio Finances, NPR (last visited Jan. 10, 2020), https://www.npr.org/about-npr/178660742/public-radio-finances; Matthew Ingram, Trump Budget Has Public Broadcasting in a Fight for its Life, Fortune (Mar. 16, 2017), https://fortune.com/2017/03/16/trump-budget-public-broadcasting/; Callum Borchers, Trump’s budget will probably slash public media, but the biggest losers won’t be PBS and NPR, Wash. Post (Mar. 15, 2017), https://www.washingtonpost.com/news/the-fix/wp/2017/03/15/trumps-budget-will-likely-slash-public-media-but-the-biggest-losers-wont-be-pbs-and-npr/; Joe Concha, Trump proposes eliminating federal funding for PBS, NPR, The Hill (Feb. 12, 2018), https://thehill.com/homenews/media/373434-trump-proposes-eliminating-federal-funding-for-pbs-npr. 306 Public Radio Finances, NPR (last visited Jan. 10, 2020), https://www.npr.org/about-npr/178660742/public-radio-finances. 307 NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019, at 2. 308 Web II Determination, 72 Fed. Reg., at 24097-100. 309 Web II Determination, 72 Fed. Reg., at 24098.


REBUTTAL APPENDIX 1

FRACTION OF USERS WITH LIMITED, REDUCED, OR NO EXPOSURE TO THE TREATMENT OR FOR WHOM THE EFFECT CANNOT BE FULLY ASCERTAINED

REBUTTAL APPENDIX 2

DISTRIBUTION OF LISTENING ACTIVITY PER REGISTERED USER JUNE 4 – AUGUST 31, 2019

REBUTTAL APPENDIX 3

NON-COMMERCIAL BROADCASTERS TOP FIVE HIGHEST EXCESS ROYALTIES OWED ROYALTIES AND FINANCIAL DATA 2018¹

REBUTTAL APPENDIX 4

NON-COMMERCIAL BROADCASTERS TOP FIVE HIGHEST EXCESS ROYALTIES OWED ROYALTIES AND FINANCIAL DATA USING PROPOSED ROYALTY RATES OF $1,000 PLUS $0.0028 PER PLAY 2018¹

REBUTTAL APPENDIX 5

PANDORA MEDIA, INC. INCOME STATEMENTS STANDARD & POOR'S CAPITAL IQ AND SEC FILINGS 2012 – Q3 2019

REBUTTAL APPENDIX 6

PANDORA MEDIA, INC. INCOME STATEMENTS, PER USER PER MONTH BASIS STANDARD & POOR'S CAPITAL IQ AND SEC FILINGS 2012 – Q3 2019

REBUTTAL APPENDIX 7

PANDORA MEDIA, INC. USER, ADVERTISING, AND SUBSCRIPTION METRICS STANDARD & POOR'S CAPITAL IQ AND SEC FILINGS 2012 – Q3 2019

REBUTTAL APPENDIX 8

IHEARTMEDIA WORLDWIDE INCOME STATEMENT SEC FILINGS 2010 – Q3 2019

REBUTTAL APPENDIX 9

INCREMENTAL DOCUMENTS RELIED UPON

Bates Numbered Documents: [ ], NAB00002794. SoundExchange Ex. 206, [ ], PANWEBV_00005332. SoundExchange Ex. 210, [ ], PANWEBV_00005093. SoundExchange Ex. 254, [ ], NAB00002238. SoundExchange Ex. 288, [ ], SXMWEBV_00005444. SoundExchange Ex. 321, [ ], NAB00004542. SoundExchange Ex. 377, Deloitte Insights: Technology, Media, and Telecommunications Predictions (2019), NAB00001993. SoundExchange Ex. 378, The Infinite Dial: 2018, NAB00002166. SoundExchange Ex. 380, [ ], NAB00002609. SoundExchange Ex. 381, [ ], NAB00002613. SoundExchange Ex. 382, [ ], NAB00002659. SoundExchange Ex. 385, [ ], NAB00002858. SoundExchange Ex. 386, [ ], NAB00004036. SoundExchange Ex. 387, BIA Advisory Services: Market Assessment and Opportunities for Local Radio: 2018- 2022 (Apr. 2018), NAB00004126. SoundExchange Ex. 388, [ ] (May 16, 2018), NAB00004413. SoundExchange Ex. 389, [ ], NAB00004537. SoundExchange Ex. 390, NAB00005547. SoundExchange Ex. 391, NAB00006441. SoundExchange Ex. 394, [ ], NRBNMLC_WEBV_00000823. SoundExchange Ex. 395, [ ], PANWEBV_00004469. SoundExchange Ex. 396, Morgan Stanley, Revival: 5th Annual Music & Radio Survey (Jan. 10, 2019), PANWEBV_00006244. SoundExchange Ex. 397, Tech Survey 2019 Jacobs Media: Radio’s Survival Kit (2019), PANWEBV_00006670. SoundExchange Ex. 398, MusicWatch: How US Customers Listen to Music, audiocensus Q4 2018, PANWEBV_00007062. SoundExchange Ex. 399, [ ], PANWEBV_00008435. SoundExchange Ex. 400, [ ], PANWEBV_00009100. SoundExchange Ex. 401, [ ], PANWEBV_00009182. SoundExchange Ex. 402, [ ], PANWEBV_00004024. SoundExchange Ex. 403, [ ] (July 2019), SXMWEBV_00005224. SoundExchange Ex. 58, [ ], PANWEBV_00003357. SoundExchange Ex. 61, [ ], PANWEBV_00004139. SoundExchange Ex. 62, [ ], PANWEBV_00004249. SoundExchange Exhibit 205, [ ], PANWEBV_00004571. SoundExchange Exhibit 207, [ ], SXMWEBV_00004833. SoundExchange Exhibit 208, [ ], PANWEBV_00006711. SoundExchange Exhibit 209, [ ], PANWEBV_00006865. [ ], NRBNMLC_WEBV_00000270.


Articles, Books, and Publications: Anderson, Eric and Duncan Simester, “Long-Run Effects of Promotion Depth on New Versus Established Customers: Three Field Studies,” Marketing Science, Vol. 23, No. 1 (Winter 2004): 4-20. Anderson, Simon P., et al., “Push‐ Me Pull‐ You: Comparative Advertising in the OTC Analgesics Industry,” RAND Journal of Economics, Vol. 47, No. 4 (Nov. 2016): 1029-1056. Baker, Andrew and Naveen Naveen, “Chapter 15: Word-of-mouth processes in marketing new products: recent research and future opportunities” in Handbook of Research on New Product Development, Eds. Peter Golder and Debanjan Mitra, Edward Elgar Publishing, 2018. Banerjee, Abhijit and Esther Duflo, “An Introduction to the ‘Handbook of Field Experiments,’” Aug. 2016, available at https://www.povertyactionlab.org/sites/default/files/documents/handbook_intro.pdf. Bapna, Ravi and Akhmed Umyarov, “Do Your Online Friends Make You Pay? A Randomized Field Experiment on Peer Influence in Online Social Networks,” Management Science, Vol. 61, No. 8 (Aug. 2015): 1902-1920. Burtch, Gordon, Anindya Ghose, and Sunil Wattal, “The Hidden Cost of Accommodating Crowdfunder Privacy Preferences: A Randomized Field Experiment,” Management Science, Vol. 61, No. 5 (May 2015): 949-962. Castro, Luis and Carlos Scartascini, “Tax Compliance and Enforcement in the Pampas Evidence From a Field Experiment,” Journal of Economic Behavior & Organization, Vol. 116 (2015): 65-82. Catalini, Christian and Catherine Tucker, “When Early Adopters Don't Adopt,” Science, Vol. 357, No. 6347 (July 2017): 135-136. Chandrasekaran, Deepa and Gerard J. Tellis, “Chapter 14: A summary and review of new product diffusion models and key findings” in Handbook of Research on New Product Development, Eds. Peter Golder and Debanjan Mitra, Edward Elgar Publishing, 2018. Chassang, Sylvain, et al., “Accounting for Behavior in Treatment Effects: New Applications for Blind Trials,” PLoS ONE, Vol. 10, No. 6 (June 2015). Deaton, Angus and Nancy Cartwright, “Understanding and Misunderstanding Randomized Controlled Trials,” Social Science & Medicine, Vol. 210 (Aug. 2018): 2-21. Dewan, Sanjeev, Yi-Jen Ho, and Jui Ramaprasad, “Popularity or Proximity: Characterizing the Nature of Social Influence in an Online Music Community,” Information Systems Research, Vol. 28, No. 1 (Mar. 2017): 117-136. Harrison, Glenn and John List, “Field Experiments,” Journal of Economic Literature, Vol. 42, No. 4 (Dec. 2004). Harrison, Glenn, “Cautionary Notes on the Use of Field Experiments to Address Policy Issues,” Oxford Review of Economic Policy, Vol. 30, No. 4 (2014): 753-763. Hohnhold, Henning, Deirdre O’Brien, and Diane Tang, “Focusing on the Long-term: It’s Good for Users and Business,” 2015, https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/43887.pdf. Kahneman, Daniel and Amos Tversky, “Prospect Theory: An Analysis of Decision Under Risk,” Econometrica, Vol. 47, No. 2 (Mar. 1979): 263-292. Lambrecht, Anja and Catherine Tucker, “Field Experiments” in Handbook of Marketing Analytics, Eds. Natalie Mizik and Dominique Hanssens, Edward Elgar Publishing, 2018. Lambrecht, Anja and Catherine Tucker, “Paying with Money or Effort: Pricing When Customers Anticipate Hassle,” Journal of Marketing Research, Vol. 49, No. 1 (2012): 66-82. Levitt, Steven and John List, “Was There Really a Hawthorne Effect at the Hawthorne Plant? An Analysis of the Original Illumination Experiments,” American Economic Journal: Applied Economics, Vol. 3, No. 1 (Jan. 2011): 224-238. 
Lewis, Randall and David Reiley, “Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!,” Quantitative Marketing and Economics, Vol. 12, No. 3 (Sept. 2014): 235- 266. Lewis, Randall and Justin Rao, “The Unfavorable Economics of Measuring the Returns to Advertising,” The Quarterly Journal of Economics, Vol. 130, No. 4 (Nov. 2015): 1941-1973. Oestreicher-Singer, Gal and Lior Zalmanson, “Content or Community? A Digital Business Strategy for Content Providers in the Social Age,” Management Information Systems Quarterly, Vol. 37, No. 2 (June 2013): 591-616.


Articles, Books, and Publications, Continued: Rochet, Jean-Charles and Jean Tirole, “Platform Competition in Two-Sided Markets,” Journal of the European Economic Association, Vol. 1, No. 4 (June 2003): 990-1029. Senat, Marie-Victoire, et al., “Endoscopic Laser Surgery versus Serial Amnioreduction for Severe Twin-to-Twin Transfusion Syndrome,” New England Journal of Medicine, Vol. 351, No. 2 (July 2004): 136-144. Slemrod, Joel, Marsha Blumenthal, and Charles Christian, “Taxpayer Response to an Increased Probability of Audit: Evidence from a Controlled Experiment in Minnesota,” Journal of Public Economics, Vol. 79 (2001): 455-483. Tucker, Catherine and Anja Lambrecht, “When Does Retargeting Work? Information Specificity in Online Advertising,” Journal of Marketing Research, Vol. 50, No. 5 (Oct. 2013): 561-576. Tucker, Catherine and Juanjuan Zhang, “Growing Two-sided Networks by Advertising the User Base: A Field Experiment,” Marketing Science, Vol. 29, No. 5 (Sept.-Oct. 2010): 805-814. Tucker, Catherine and Juanjuan Zhang, “How Does Popularity Information Affect Choices? A Field Experiment,” Management Science, Vol. 57, No. 5 (May 2011): 828-842. Tucker, Catherine, “Social Networks, Personalized Advertising, and Privacy Controls,” Journal of Marketing Research, Vol. 51, No. 5 (Oct. 2014): 546-562. Tucker, Catherine, “The Reach and Persuasiveness of Viral Video Ads,” Marketing Science, Vol. 34, No. 2 (Mar. 2015): 281-296. Waldfogel, Joel, “Digitization and Quality of New Media Products: The Case of Music” in Economic Analysis of the Digital Economy, Eds. Avi Goldfarb, Shane Greenstein, and Catherine Tucker, University of Chicago Press, Apr. 2015. Waldfogel, Joel, Digital Renaissance: What Data and Economics Tell Us about the Future of Popular Culture, Princeton University Press, 2018. Yanagizawa-Drott, David and Jakob Svensson, “Estimating Impact in Partial vs. General Equilibrium: A Cautionary Tale from a Natural Experiment in Uganda,” Aug. 2012, https://epod.cid.harvard.edu/sites/default/files/2018- 02/estimating_impact_in_partial_vs._general_equilibrium- _a_cautionary_tale_from_a_natural_experiment_in_uganda.pdf. Zeiler, Kathryn, “Cautions on the Use of Economics Experiments in Law,” Journal of Institutional and Theoretical Economics, Vol. 166, No. 1 (Mar. 2010): 178-193.

Data from Listener Suppression Experiments: [ ], PANWEBV_00008309-310. [ ], PANWEBV_00008188-8307. [ ], PANWEBV_00008067-8082; PANWEBV_00008084-8187. [ ], PANWEBV_00004982- 991. [ ], PANWEBV_00008312. [ ], PANWEBV_00008311. File Production Cross Reference, Pandora-Sirius_WebV_013_X-Ref.xls. File Production Cross Reference, Pandora-Sirius_WebV_014_X-Ref.xls. File Production Cross Reference, Pandora-Sirius_WebV_11 Cross Ref.xls. [ ], PANWEBV_00008083. [ ], PANWEBV_00008065. [ ], PANWEBV_00008308. [ ], PANWEBV_00008064. [ ], PANWEBV_00008066. [ ], PANWEBV_00008314-8433. [ ], PANWEBV_00008061.


Data from Listener Suppression Experiments, Continued: [ ], PANWEBV_00008062. [ ], PANWEBV_00008063. [ ], PANWEBV_00008434. [ ], PANWEBV_00008313.

Depositions: Deposition of Leonard Wheeler, Dec. 4, 2019. SoundExchange Ex. 231, Deposition of David H. Reiley, Jr., Ph.D., Dec. 16, 2019.

Testimonies and Direct Statements: Amended Written Direct Testimony of Richard Steinberg, Dec. 11, 2019. Amended Written Rebuttal Testimony of Michael Herring, Feb. 20, 2015. Corrected Written Direct Testimony of Carl Shapiro, Nov. 20, 2019. Corrected Written Direct Testimony of David Reiley, Nov. 26, 2019. Corrected Written Direct Testimony of Joseph J. Cordes, Dec. 20, 2019. Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019. Testimony of Steven Cutler in Web IV, Oct. 7, 2014. Written Direct Testimony of Aaron Harrison, Sept. 22, 2019. Written Direct Testimony of Andrew Gille, Sept. 17, 2019. Written Direct Testimony of Arpan Agrawal, Sept. 18, 2019. Written Direct Testimony of Carl Shapiro, Sept. 23, 2019. Written Direct Testimony of Catherine Tucker, Sept. 23, 2019. Written Direct Testimony of Christopher Phillips, Sept. 23, 2019. Written Direct Testimony of David Reiley, Sept. 23, 2019. Written Direct Testimony of Dominique M. Hanssens, Sept. 23, 2019. Written Direct Testimony of Dr. Gregory K. Leonard, Sept. 20, 2019. Written Direct Testimony of James Russell Williams III, Sept. 23, 2019. Written Direct Testimony of Jennifer D. Burkhiser, Sept. 23, 2019. Written Direct Testimony of Jennifer Witz, Sept. 23, 2019. Written Direct Testimony of Joel Waldfogel, Sept. 23, 2019. Written Direct Testimony of Jonathan Orszag, Sept. 23, 2019. Written Direct Testimony of Joseph C. Emert, Oct. 6, 2014. Written Direct Testimony of Joseph J. Cordes, Sept. 21, 2019. Written Direct Testimony of Leonard Wheeler, Sept. 19, 2019. Written Direct Testimony of Professor John R. Hauser, Sept. 20, 2019. Written Direct Testimony of Richard Steinberg, Sept. 23, 2019. Written Direct Testimony of Robert Pittman, Sept. 18, 2019. Written Direct Testimony of Robert Willig, Sept. 23, 2019. Written Direct Testimony of Stephan McBride in Web IV, Oct. 7, 2014. Written Direct Testimony of Steven Blatter, Nov. 28, 2011. Written Direct Testimony of Steven Blatter, Sept. 23, 2019. Written Direct Testimony of Steven R. Peterson, Sept. 23, 2019. Written Direct Testimony of Steven W. Newberry, Oct. 6, 2014. Written Direct Testimony of Steven W. Newberry, Sept. 20, 2019. Written Direct Testimony of T. Jay Fowler, Sept. 18, 2019. Written Direct Testimony of Timothy Westergren, Oct. 6, 2014. Written Direct Testimony of Tom Poleman, Sept. 19, 2019.


Testimonies and Direct Statements, Continued: Written Direct Testimony of Waleed Diab, Sept. 18, 2019. Written Rebuttal Testimony of Travis Ploeger, Jan. 10, 2020.

Earnings Calls Transcripts: iHeartMedia Inc. Q3 2019 Earnings Call Transcript, Nov. 7, 2019.

Financial Filings: Family Stations, Inc. Form 990 for the year ended December 31, 2018. iHeartMedia, Inc. Form 10-Q for the Fiscal Quarter ended September 30, 2019. Sirius XM Holdings, Inc. Form 10-Q for the Fiscal Quarter ended September 30, 2019.

Legal Documents: NRBNMLC Ex. 20, Joint Motion to Adopt Partial Settlement between SoundExchange and CBI, Sept. 23, 2019. NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019. Pandora Media, LLC’s Response and Objections to Request No. 8 from SoundExchange Inc.’s Second Set of Requests for Production of Documents, Dec. 13, 2019. Sirius XM Radio Inc. and Pandora Media, LLC’s Responses and Objections to SoundExchange Inc.’s November 11, 2019 Interrogatories Directed to Sirius XM and Pandora, Nov. 18, 2019.

Statutes: 37 C.F.R. § 380.10(a)(2). 37 C.F.R. § 380.21. 37 C.F.R. § 380.22(a).

Websites: “2019 Conference on Digital Experimentation (CODE),” MIT Initiative on the Digital Economy, Nov. 1-2, 2019, http://ide.mit.edu/events/2019-conference-digital-experimentation-code. “An insider’s look at Family Radio and its leader Harold Camping,” The Mercury News, May 20, 2011, https://www.mercurynews.com/2011/05/20/an-insiders-look-at-family-radio-and-its-leader-harold-camping/. “Aplos Pricing: Try Aplos for Free,” Aplos Software, 2019, https://www.aplos.com/pricing. “Bezos calls Amazon experiment ‘a mistake’,” BizJournals, Sept. 28, 2000, https://www.bizjournals.com/seattle/ stories/2000/09/25/daily21.html. “Compare Office 365 Nonprofit plans: Large Nonprofits,” Microsoft, https://www.microsoft.com/en-us/microsoft- 365/nonprofit/office-365-nonprofit-plans-and-pricing?&activetab=tab:primaryr2. “Compare Office 365 Nonprofit plans: Small and Mid-sized Nonprofits,” Microsoft, https://www.microsoft.com/en- us/microsoft-365/nonprofit/office-365-nonprofit-plans-and-pricing?&activetab=tab:primaryr1. “Conference on Digital Experimentation (CODE),” Oct. 15, 2016, http://ide.mit.edu/events/conference-digital- experimentation-code-0. “End of the line for Christian radio network that predicted 2011 rapture,” The Denver Post, May 12, 2013, https://www.denverpost.com/2013/05/12/end-of-the-line-for-christian-radio-network-that-predicted-2011- rapture/. “Get 10 Donated Subscriptions of the World’s #1 Cloud Engagement Application,” Salesforce, 2019, https://www.salesforce.org/nonprofit_product/nonprofit-editions-pricing/. “Google for Nonprofits: Reach more donors online with Google Ad Grants,” Google, https://www.google.com/nonprofits/offerings/google-ad-grants/. “I’m already a subscriber. Do I get a discount on additional subscriptions?” SiriusXM, 2019, https://listenercare.siriusxm.com/app/answers/detail/a_id/3680/~/im-already-a-subscriber.-do-i-get-a-discount- on-additional-subscriptions%3F.


Websites, Continued: “iHeartMedia, Inc. Reports Results for 2019 Third Quarter,” BusinessWire, Nov. 7, 2019, https://www.businesswire.com/news/home/20191107005341/en/iHeartMedia-Reports-Results-2019-Quarter. “Microsoft Nonprofits,” Microsoft, https://www.microsoft.com/en-us/nonprofits/azure. “Pandora Premium for Families,” Pandora, 2019, https://www.pandora.com/upgrade/premium/family- plan?TID=PM:PSE:Google&gclid=Cj0KCQiAw4jvBRCJARIsAHYewPP_2y9rQDmMsvF09oDTAUp- fZkm7fU_ixzim2U6mjrK0ku4n2nx0eYaApywEALw_wcB. “Pandora Premium Student,” Pandora, 2019, https://www.pandora.com/upgrade/premium/student. “Pandora teams up with T-Mobile as an Un-carrier partner for unlimited ad-free music,” Pandora Blog, Aug. 15, 2018, http://blog.pandora.com/us/pandora-teams-up-with-t-mobile-as-an-un-carrier-partner-for-unlimited-ad- free-music/. “Public Radio Finances,” NPR, 2020, https://www.npr.org/about-npr/178660742/public-radio-finances. “Slack for Nonprofits,” Slack, 2019, https://slack.com/help/articles/204368833-Slack-for-Nonprofits. “Soundiiz General Features,” Soundiiz, https://soundiiz.com/features. “Spotify Premium,” Spotify, https://www.spotify.com/us/premium/. Alexei Oreskovic, “Twitter shares crash as user growth stalls,” Business Insider, Oct. 27, 2015, https://www.businessinsider.com/twitter-earnings-q3-2015-2015-10. Amber Neely, “How to transfer playlists from Spotify to Apple Music,” Apple Insider, Aug. 18, 2019, https://appleinsider.com/articles/19/08/18/how-to-transfer-playlists-from-spotify-to-apple-music. Ben Sisaro, “Adele Is Said to Reject Streaming for ‘25’,” The New York Times, Nov. 19, 2015, https://www.nytimes.com/2015/11/20/business/media/adele-music-album-25.html. Ben Sisaro, “Taylor Swift Announces World Tour and Pulls Her Music From Spotify,” The New York Times, Nov. 3, 2014, https://artsbeat.blogs.nytimes.com/2014/11/03/taylor-swift-announces-world-tour-and-pulls-her-music- from-spotify/. Bill Shaikan, “For the sixth year in a row, most Dodgers fans can’t watch their team on television,” Los Angeles Times, Mar. 8, 2019, https://www.latimes.com/sports/dodgers/la-sp-dodgers-20190308-story.html. Bob Smietana, “Christian radio group faces financial hard times,” U.S.A. Today, May 14, 2013, https://www.usatoday.com/story/news/nation/2013/05/14/family-radio-finances-world-did-not-end/2159621/. Callum Borchers, “Trump’s budget will probably slash public media, but the biggest losers won’t be PBS and NPR,” The Washington Post, Mar. 15, 2017, https://www.washingtonpost.com/news/the-fix/wp/2017/03/15/trumps- budget-will-likely-slash-public-media-but-the-biggest-losers-wont-be-pbs-and-npr/. Carmen Reinicke, “Spotify slips after not adding as many paid subscribers as hoped (SPOT),” Markets Insider, July 31, 2019, https://markets.businessinsider.com/news/stocks/spotify-earnings-2q-stock-price-reaction- disappointing-subscriber-growth-2019-7-1028403687. Catherine Tucker, “Network Effects Matter Less Than They Used To,” Harvard Business Review, June 22, 2018, https://hbr.org/2018/06/why-network-effects-matter-less-than-they-used-to. Cecilia Kang, “Taylor Swift has taken all her music off Spotify,” The Washington Post, Nov. 3, 2014, https://www.washingtonpost.com/news/business/wp/2014/11/03/taylor-swift-has-taken-all-her-music-off- spotify/. Charlotte Atler, “Taylor Swift Just Removed Her Music From Spotify,” TIME, Nov. 3, 2014, https://time.com/3554438/taylor-swift-spotify/. 
Claire Reilly, “Netflix quietly tests price hikes in Australia,” CNET, May 14, 2017, https://www.cnet.com/news/netflix-quietly-tests-weekend-price-increases-australia/. David Porter, “To Everything There is a Season,” 8tracks Blog, Dec. 26, 2019, https://blog.8tracks.com/. Dominic Rushe, “Myspace sold for $35m in spectacular fall from $12bn heyday,” The Guardian, June 30, 2011, https://www.theguardian.com/technology/2011/jun/30/myspace-sold-35-million-news. Don Reisinger, “Here’s the Latest Taylor Swift Apple Music Ad to Go Viral,” Fortune, Apr. 18, 2016, https://fortune.com/2016/04/18/taylor-swift-apple-music/.


Websites, Continued: Frank Pallotta and Brian Stelter, “Adele won’t allow ‘25’ album to be streamed,” CNN Business, Nov. 19, 2015, https://money.cnn.com/2015/11/19/media/adele-streaming/. Hannah Ellis-Petersen, “Taylor Swift takes a stand over Spotify music royalties,” The Guardian, Nov. 5, 2014, https://www.theguardian.com/music/2014/nov/04/taylor-swift-spotify-streaming-album-sales-snub. Hulu (@hulu), Twitter, Sept. 16, 2019, https://twitter.com/hulu/status/1173724121726738433. Hulu (@hulu), Twitter, Oct. 3, 2017, https://twitter.com/hulu/status/915283841098682368. Hulu (@hulu), Twitter, Oct. 3, 2017, https://twitter.com/hulu/status/915294183363170306. Hulu (@hulu), Twitter, Oct. 4, 2017, https://twitter.com/hulu/status/915648154854404097. Joanna Robinson, “Now That You’re Hooked, Netflix Is Looking to Raise Its Prices Again,” Vanity Fair, May 16, 2017, https://www.vanityfair.com/hollywood/2017/05/netflix-raising-prices-weekend-surge-pricing. Joe Concha, “Trump proposes eliminating federal funding for PBS, NPR,” The Hill, Feb. 12, 2018, https://thehill.com/homenews/media/373434-trump-proposes-eliminating-federal-funding-for-pbs-npr. Josh Levenson and Quentyn Kennemer, “Apple Music vs. Spotify: Which service is the streaming king?,” Digital Trends, Nov. 11, 2019, https://www.digitaltrends.com/music/apple-music-vs-spotify/. Kia Kokalitcheva, “Thanks to Adele, Pandora Says ‘Hello’ to a Stock Price Bump,” Fortune, Nov. 25, 2015, https://fortune.com/2015/11/25/adele-pandora/. Kristen Scholer, “Adele Says Hello to Pandora,” Wall Street Journal, Nov. 25, 2015, https://blogs.wsj.com/moneybeat/2015/11/25/pandora-up-as-adele-says-hello-to-the-streaming-service-bye-to- others/. Linda Rosencrance, “Testing reveals varied DVD prices on Amazon,” CNN, Sept. 7, 2000, https://www.cnn.com/2000/TECH/computing/09/07/amazon.dvd.discrepancy.idg/hs~index.html. Lisa Eadicicco, “Microsoft hired a man named Mac Book to star in its latest ad slamming Apple's laptops,” Business Insider, Aug. 1, 2019, https://www.businessinsider.com/microsoft-slams-apple-macbook-laptops-ad-2019-8. Marco Iansiti and Karim R. Lakhani “Managing Our Hub Economy,” Harvard Business Review, Sept.-Oct. 2017, https://hbr.org/2017/09/managing-our-hub-economy. Matthew Ingram, “Trump Budget Has Public Broadcasting in a Fight for its Life,” Fortune, Mar. 16, 2017, https://fortune.com/2017/03/16/trump-budget-public-broadcasting/. Michael Potluck, “Samsung Galaxy ad uses missing iPhone 11 camera feature as bait to switch,” 9to5Mac, Sept. 13, 2019, https://9to5mac.com/2019/09/13/samsung-iphone-11-missing-camera-feature/. Mike Ayers, “Adele’s ‘25’ Won’t Be Available on Spotify or Apple Music,” Wall Street Journal, Nov. 19, 2015, https://blogs.wsj.com/speakeasy/2015/11/19/adeles-25-wont-be-available-on-spotify-or-apple-music/. Neha Malara and Supantha Mukherjee, “Spotify shares surge after surprise profit, rise in paid users,” Reuters, Oct. 28, 2019, https://www.reuters.com/article/us-spotify-tech-results/spotify-shares-surge-after-surprise-profit-rise- in-paid-users-idUSKBN1X70X9. Nelson Aguilar, “Spotify vs. Apple vs. Pandora vs. Tidal vs. Deezer vs. Amazon,” Gadget Hacks, July 19, 2019, https://web.archive.org/web/20190721232028/https://smartphones.gadgethacks.com/news/best-music-streaming- services-spotify-vs-apple-vs-pandora-vs-tidal-vs-deezer-vs-amazon-0199737/. Nicola Menzie, “Family Radio Founder Harold Camping Repents, Apologizes for False Teachings,” The Christian Post, Oct. 
30, 2011, https://www.christianpost.com/news/family-radio-founder-harold-camping-repents- apologizes-for-false-teachings.html. Nigel Smith, “Adele’s new album 25 will not be streamed on Spotify,” The Guardian, Nov. 19, 2015, https://www.theguardian.com/music/2015/nov/19/adele-new-album-25-not-stream-spotify-apple-music. Pamela Engel, “Taylor Swift Pulls All Of Her Albums From Spotify,” Business Insider, Nov. 3, 2014, https://www.businessinsider.com/taylor-swift-pulled-all-of-her-albums-from-spotify-2014-11. Remi Rosmarin, “Here are the main differences between Amazon's two music streaming services, Prime Music and Amazon Music Unlimited,” Business Insider, June 26, 2019, https://www.businessinsider.com/prime-music-vs- amazon-music-unlimited.


Websites, Continued: Rick Paulas, “What Happened to Doomsday Prophet Harold Camping After the World Didn’t End?,” Vice, Nov. 7, 2014, https://www.vice.com/en_us/article/yvqkwb/life-after-doomsday-456. Ronny Kohavi, “Trustworthy Online Controlled Experiments and the Risks of Uncontrolled Observational Studies,” Microsoft, 2019, https://exp-platform.com/Documents/2019-08%20CausalDiscoveryExPRonnyK.pdf. Ryan Waniata and Parker Hall, “Spotify vs. Pandora: Which music streaming service is better for you?” Digital Trends, July 25, 2019, https://www.digitaltrends.com/music/spotify-vs-pandora/. Sally Kaplan, “Amazon is offering 4 months of access to its music-streaming service for $1 as a Cyber Monday deal — here’s how to take advantage,” Business Insider, Dec. 2, 2019, https://www.businessinsider.com/amazon- music-unlimited-deal. Seth Stevenson, “Mac Attack: Apple’s mean-spirited new ad campaign,” Slate, https://slate.com/business/2006/06/apple-s-mean-spirited-ad-campaign.html. Steve Knopper, “Taylor Swift Abruptly Pulls Entire Catalog From Spotify,” Rolling Stone, Nov. 3, 2014, https://www.rollingstone.com/music/music-news/taylor-swift-abruptly-pulls-entire-catalog-from-spotify-55523/. Thomas Germain, “Best Music Streaming Services,” Consumer Reports, Sept. 18, 2019, https://www.consumerreports.org/streaming-media/best-music-streaming-service-for-you/. Ty Pendlebury and Xiomara Blanco, “Best music streaming: Spotify, Apple Music and more, compared,” CNET, Nov. 24, 2019, https://www.cnet.com/how-to/best-music-streaming-service-of-2019-spotify-pandora-apple- music/. Victor Luckerson, “11 Wildly Popular Albums You Can’t Get on Spotify,” TIME, Mar. 29, 2016, https://time.com/4274430/spotify-albums/. Wendy Melillo, “Amazon Price Test Nets Privacy Outcry,” AdWeek, Oct. 2, 2000, https://www.adweek.com/brand- marketing/amazon-price-test-nets-privacy-outcry-30060/.

Other: Factiva. SoundExchange Ex. 375, Declaration of Collin R. Jones (Jan. 7, 2020). SoundExchange Ex. 376, David Reiley, “Suggestions for Further Reading, W241: Experiments and Causality,” 2015, https://docs.google.com/document/d/1IMsGTHmklhvetfJJfEm9dhoFM7bvb-YOkN_6mAM8kFM/edit#. SoundExchange Ex. 404, David Reiley, “Field Experiments,” 2015, https://docs.google.com/document/d/1BDUxgzEk1vWXMiV2nz pZzoa7Hyp8I9LBiYmqA9XtZpo/edit. SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx. SoundExchange Ex. 431, NAB Stipulation.


Exhibits Sponsored by Catherine Tucker

Exhibit No. Sponsored By Description Designation

SoundExchange Ex. 017 Catherine Tucker [ Restricted ] SoundExchange Ex. 018 Catherine Tucker [ Restricted ] SoundExchange Ex. 054 Catherine Tucker; Robert [ Restricted Willig ] SoundExchange Ex. 056 Catherine Tucker; Robert [ Restricted Willig ] SoundExchange Ex. 058 Catherine Tucker; Robert [ Restricted Willig; Jonathan Orszag ] SoundExchange Ex. 060 Catherine Tucker; Robert [ Restricted Willig ] SoundExchange Ex. 061 Catherine Tucker; Robert [ Restricted Willig ] SoundExchange Ex. 062 Catherine Tucker; Robert [ Restricted Willig; Jonathan Orszag ]

SoundExchange Ex. 065 Catherine Tucker; Jonathan [ Restricted Orszag; Robert Willig ] SoundExchange Ex. 069 Catherine Tucker Annual Music Study 2018 Report RIAA April Restricted 2019 SoundExchange Ex. 070 Catherine Tucker Top Statutory Stated Liability through June 2019 Restricted SoundExchange Ex. 071 Catherine Tucker Noncomms and CPD stated and allocated 2018 Public Final SoundExchange Ex. 191 Catherine Tucker [ Restricted ] SoundExchange Ex. 192 Catherine Tucker [ Restricted ] SoundExchange Ex. 193 Catherine Tucker [ Restricted ] SoundExchange Ex. 194 Catherine Tucker [ Restricted

] SoundExchange Ex. 195 Catherine Tucker [ Restricted ] SoundExchange Ex. 196 Catherine Tucker [ Restricted ] SoundExchange Ex. 197 Catherine Tucker [ ] Restricted SoundExchange Ex. 198 Catherine Tucker [ Restricted ] SoundExchange Ex. 199 Catherine Tucker [ Restricted ] SoundExchange Ex. 200 Catherine Tucker [ Restricted ] SoundExchange Ex. 205 Robert Willig; Catherine [ Restricted Tucker ]


SoundExchange Ex. 206 Robert Willig; Jonathan [ Restricted Orszag; Catherine Tucker ]

SoundExchange Ex. 207 Robert Willig; Catherine [ Restricted Tucker ]

SoundExchange Ex. 208 Robert Willig; Catherine [ Restricted Tucker; Jonathan Orszag ]

SoundExchange Ex. 209 Robert Willig; Jonathan [ Restricted Orszag; Catherine Tucker ]

SoundExchange Ex. 210 Robert Willig; Jonathan [ Restricted Orszag; Catherine Tucker ] SoundExchange Ex. 231 Catherine Tucker; Robert [ ] Restricted Willig; Gal Zauberman; Itamar Simonson SoundExchange Ex. 254 Jonathan Orszag; Catherine [ Restricted Tucker ] SoundExchange Ex. 288 Jonathan Orszag; Catherine [ Restricted Tucker ] SoundExchange Ex. 321 Jonathan Orszag; Catherine [ ] Restricted Tucker SoundExchange Ex. 374 Catherine Tucker Consent Motion of the National Religious Public Broadcasters Noncommercial Music License Committee to Submit Corrected Written Direct Testimony of Joseph J. Cordes SoundExchange Ex. 375 Catherine Tucker Declaration of Cullin R. Jones Public SoundExchange Ex. 376 Catherine Tucker Suggestions for Further Reading Public SoundExchange Ex. 377 Catherine Tucker Deloitte Insights: Technology, Media, and Restricted Telecommunications Predictions, 2019 SoundExchange Ex. 378 Catherine Tucker Edison: The Infinite Dial 2018 Restricted SoundExchange Ex. 379 Catherine Tucker National Association of Broadcasters Radio Board Restricted of Directors, Minutes, Oct. 25, 2016 SoundExchange Ex. 380 Catherine Tucker [ Restricted ] SoundExchange Ex. 381 Catherine Tucker [ ] Restricted SoundExchange Ex. 382 Catherine Tucker [ Restricted ] SoundExchange Ex. 383 Catherine Tucker [ Restricted ] SoundExchange Ex. 384 Catherine Tucker Coleman Insights Media Research: The Image Restricted Pyramid SoundExchange Ex. 385 Catherine Tucker Coleman Insights Media Research: National Restricted Marketplace SoundExchange Ex. 386 Catherine Tucker [ ] Restricted SoundExchange Ex. 387 Catherine Tucker [ Restricted ]


SoundExchange Ex. 388 Catherine Tucker [ Restricted

] SoundExchange Ex. 389 Catherine Tucker [ Restricted ] SoundExchange Ex. 390 Catherine Tucker [ Restricted ]

SoundExchange Ex. 391 Catherine Tucker [ Restricted

] SoundExchange Ex. 392 Catherine Tucker [ ] Restricted SoundExchange Ex. 393 Catherine Tucker Family Stations, Inc Form 8879-EO Public SoundExchange Ex. 394 Catherine Tucker NRBNMLC Board Members Restricted SoundExchange Ex. 395 Catherine Tucker [ Restricted ] SoundExchange Ex. 396 Catherine Tucker Morgan Stanley - Revival: 5th Annual Music & Restricted Radio Survey SoundExchange Ex. 397 Catherine Tucker Jacobs Media: Radio's Survival Kit Restricted SoundExchange Ex. 398 Catherine Tucker MusicWatch: How US Consumers Listen to Music Restricted (Audiocensus Q4 2018) SoundExchange Ex. 399 Catherine Tucker [ ] Restricted SoundExchange Ex. 400 Catherine Tucker [ ] Restricted SoundExchange Ex. 401 Catherine Tucker, Itamar [ ] Restricted Simonson SoundExchange Ex. 402 Catherine Tucker [ Restricted ] SoundExchange Ex. 403 Catherine Tucker [ Restricted ] SoundExchange Ex. 404 Catherine Tucker Dr. Reiley Course Syllabus Public SoundExchange Ex. 430 Catherine Tucker [ Restricted

]


EXHIBIT B

Before the UNITED STATES COPYRIGHT ROYALTY JUDGES Washington, D.C.

In re

Determination of Rates and Terms for Digital Performance of Sound Recordings and Making of Ephemeral Copies to Facilitate those Performances (Web V)

Docket No. 19-CRB-0005-WR (2021-2025)

CORRECTED WRITTEN REBUTTAL TESTIMONY OF
Catherine Tucker
Sloan Distinguished Professor of Management Science
MIT Sloan at the Massachusetts Institute of Technology

July, 2020

TABLE OF CONTENTS

I. Introduction and Assignment
II. Summary of Conclusions
III. Dr. Reiley’s Label Suppression Experiments Provide an Inaccurate Guide to the Effect of the Removal of a Label, Rendering Use of These Estimates by Professor Shapiro Flawed
   A. A field experiment that does not inform subjects about the treatment is unrealistic and cannot measure the true effect of a treatment that would be known to users
      1. A field experiment needs to measure the “treatment” of interest
      2. In reality, the removal of [ ] catalog would be known by many Pandora users
      3. Many of the users in the Label Suppression Experiments were unlikely to be aware of the removal of recordings from the suppressed record company
      4. Dr. Reiley’s statements about the merits of “blinded” experiments do not apply in this context
   B. The Label Suppression Experiments do not help us understand competitive effects
      1. Field experiments cannot be used to measure competitive effects
      2. Streaming services are competitive because it is easy for users to switch
      3. Competitors could respond to a reduction in Pandora’s catalog
   C. Notwithstanding that the Label Suppression Experiments did not measure the treatment of interest, the experiments as conducted do not provide precise estimates of the effect of 100 percent label suppression
      1. The Label Suppression Experiments were underpowered
      2. The Label Suppression Experiments were deliberately limited in scope because the researchers anticipated that the consequences of a larger experiment could negatively affect Pandora’s business
      3. The Label Suppression Experiments struggled to attain precision on estimates because of the inclusion of many users who received little or no suppression treatment
      4. Unlike other work by Dr. Reiley, Dr. Reiley’s analysis of the Label Suppression Experiments does not use data on the intensity of treatment
   D. The Label Suppression Experiments are not useful for estimating true long-run effects
IV. Professor Shapiro Presents Insufficient Analysis to Conclude that No Label is a “Must-Have”
   A. Professor Shapiro’s estimates rely heavily on the Label Suppression Experiments
   B. Professor Shapiro’s ad hoc corrections to the estimates from the Label Suppression Experiments do not result in a conservative application of those estimates
   C. Professor Shapiro ignores the additional effects of losing access to content from [ ] on Pandora’s underlying business model
   D. Professor Shapiro improperly applies the results of the Label Suppression Experiments to estimate a reasonable royalty for subscription webcasters
   E. Ultimately, Professor Shapiro’s application of the results of the Label Suppression Experiments suffers from compounding errors
V. The Submission by the National Association of Broadcasters (“NAB”) Misses the Importance of Simulcasting to Their Broadcasters
VI. The Religious Broadcasters’ Arguments for Why They Should Pay Less Are Not Based on Economics
   A. Family Radio is not representative of noncommercial webcasters
   B. Statutory royalty payments make up a very small proportion of noncommercial webcasters’ costs
   C. Though small non-profits are given discounts in some cases, those discounts may not be extended proportionally to larger non-profits
   D. Noncommercial webcasters already receive a discounted rate
   E. The argument that noncommercial webcasters should not pay excess royalties because they are too unpredictable to finance with donations is not based on data or economics
   F. The CBI and NPR settlements do not provide support for NRBNMLC’s rate proposal


I. Introduction and Assignment

I previously submitted written direct testimony in this matter on September 23, 2019 (“Written

Direct Testimony”).1

I have been asked to address certain issues raised in testimony submitted on behalf of the various services, including in:

• Dr. David Reiley’s testimony,2 on behalf of Pandora, regarding the design and analysis of a series of field experiments (the “Label Suppression Experiments”) conducted to assess the effect on users’ listening if Pandora lost access to the entire catalog of a particular record company;

• Professor Carl Shapiro’s testimony,3 on behalf of Pandora, regarding the use of the Label Suppression Experiments in determining reasonable royalty rates for non-interactive services for the 2021-2025 time period;

• Dr. Gregory Leonard’s testimony,4 on behalf of the National Association of Broadcasters (“NAB”), regarding the role of simulcasting for terrestrial broadcasters;

• Dr. Joseph Cordes’ testimony,5 on behalf of the National Religious Broadcasters Noncommercial Music License Committee (“NRBNMLC”), regarding reasonable royalty rates for noncommercial webcasters; and

• Dr. Richard Steinberg’s testimony,6 on behalf of NRBNMLC, regarding reasonable royalty rates for noncommercial webcasters.

1 Written Direct Testimony of Catherine Tucker, Sept. 23, 2019 (“Tucker WDT”). Since submitting my Written Direct Testimony, Sirius XM and iHeartMedia reported financial results for Q3 2019. These results are summarized in Rebuttal Appendices 5 to 8. 2 Written Direct Testimony of David Reiley, Sept. 23, 2019 (“Reiley WDT”); Corrected Written Direct Testimony of David Reiley, Nov. 26, 2019 (“Reiley Corrected WDT”). Dr. Reiley is a Principal Scientist at Pandora. 3 Written Direct Testimony of Carl Shapiro, Sept. 23, 2019 (“Shapiro WDT”); Corrected Written Direct Testimony of Carl Shapiro, Nov. 20, 2019 (“Shapiro Corrected WDT”). 4 Written Direct Testimony of Dr. Gregory K. Leonard, Sept. 20, 2019 (“Leonard WDT”). 5 Written Direct Testimony of Joseph J. Cordes, Sept. 21, 2019 (“Cordes WDT”); Corrected Written Direct Testimony of Joseph J. Cordes, Dec. 20, 2019 (“Cordes Corrected WDT”). 6 Written Direct Testimony of Richard Steinberg, Sept. 22, 2019 (“Steinberg WDT”); Amended Written Direct Testimony of Richard Steinberg, Dec. 11, 2019 (“Steinberg Amended WDT”).


I am being compensated for my services in this matter at my customary hourly rate of $1,200. Certain employees of Analysis Group have assisted me in working on this report. Analysis Group is being compensated for its time in this matter at hourly rates ranging between $225 and $775. In addition, I receive compensation based on a proportion of the total

billing of Analysis Group. A copy of my CV is attached to my Written Direct Testimony.7 My qualifications remain unchanged from my previous submission. A list of the materials I have

relied upon to date in developing my opinions contained in this report is attached as Rebuttal

Appendix 9.8

II. Summary of Conclusions

I have concluded that:

First, the Label Suppression Experiments conducted by Dr. Reiley result in misleading and unreliable estimates of the effect of Pandora’s loss of content from a record company on its

ad-supported radio service. Most users in the Label Suppression Experiments would not have been aware of the suppression treatment, while in reality the absence of music from [ ] would be widely known to users due to publicity from third-party industry commentators, other users, and competitors. The Label Suppression Experiments do not measure competitive effects or provide insights into the influence of competitors’ responses on consumer behavior. The Label Suppression Experiments were also underpowered and achieved only partial suppression of recordings from the tested record companies, and

therefore cannot provide reliable estimates of the effect of Pandora’s full loss of a record company’s content. Last, the Label Suppression Experiments do not provide any useful guide to the long-run effects of Pandora’s loss of content from [ ]. These and

7 Tucker WDT, at Appendix 19. 8 I may amend my testimony with any new information as additional evidence becomes available, and may modify my analysis and resulting conclusions.


other errors render Dr. Reiley’s analysis and conclusions not useful for purposes of measuring the effects of the removal of a record company from Pandora’s ad-supported noninteractive service. I discuss this in detail in Section III.

Second, because Professor Shapiro’s analysis of reasonable royalty rates relies heavily on the results of the Label Suppression Experiments, errors in these experiments render Professor Shapiro’s reasonable royalty estimates flawed and unreliable. These errors are compounded by

Professor Shapiro’s unfounded and ad hoc assumptions surrounding the applicability of the estimates from the experiments and how to correct the errors within the experiments. In addition, Professor Shapiro’s analysis ignores the additional effects of losing access to content from [ ] on Pandora’s underlying business model beyond just the direct effect on listener hours, further understating the effect on Pandora’s business. I discuss this in detail in Section IV.

Third, while NAB witnesses suggest that webcasting is a small and unprofitable aspect of their business, their arguments ignore the role that simulcasting plays in protecting the NAB’s members’ radio businesses. In particular, evidence from their own executives suggests that simulcasts enhance and complement radio stations’ core businesses, including by helping radio businesses to retain listeners in the face of emerging digital technologies. I discuss this in detail in Section V.

Fourth, in their written direct testimonies on behalf of the NRBNMLC, Dr. Steinberg and Dr. Cordes argue that noncommercial webcasters should pay lower statutory rates than commercial

webcasters. However, the argument that noncommercial webcasters should not pay excess royalties because they are too unpredictable to finance with donations is not based on data or economics. Dr. Steinberg and Dr. Cordes ignore that the average per-performance rate paid by noncommercial webcasters is already lower than the rates paid by commercial webcasters, and would remain lower under SoundExchange’s proposed rates. Last, though Dr. Steinberg amended his written direct testimony to add a discussion of SoundExchange’s settlement


agreements with CBI and NPR, neither of these settlement agreements provides support for NRBNMLC’s rate proposal. I discuss this in detail in Section VI.

III. Dr. Reiley’s Label Suppression Experiments Provide an Inaccurate Guide to the Effect of the Removal of a Label, Rendering Use of These Estimates by Professor Shapiro Flawed

The “Label Suppression Experiments” refer to a series of experiments that, as described by Dr. Reiley, were “conducted to assess whether selectively suppressing user access to music from particular record companies has an impact on consumer listening hours, the extent of any

impact, and whether the impact varies by record company.”9 Following instructions from Professor Shapiro, Pandora’s economic expert, Dr. Reiley ran five experimental treatments, each designed to suppress music from one of the following record companies: [

]10

Dr. Reiley asserts that the Label Suppression Experiments address “[w]hat effect, if any, there would be on users’ listening if Pandora stopped playing the entire catalog of a particular record

company on Pandora’s ad-supported service.”11 Dr. Reiley’s research objective is based on Professor Shapiro’s instruction that the “goal of these experiments is to measure the responses of Pandora listeners if a Pandora advertising-supported statutory service were to lose access to

the music of a given record company.”12

9 Reiley Corrected WDT, at 1-2. 10 Reiley Corrected WDT, at 7. As I discuss in Section III.A, Dr. Reiley’s label suppression treatment is unrealistic and does not measure the true effect of Pandora’s loss of content from [ ] because, among other things, many of the users in the treatment groups were unlikely to have been aware of the suppression treatment. Further, as discussed in Section III.C, while Dr. Reiley indicates that he intended the label suppression treatment to completely suppress recordings from the tested record company, it appears that few, if any, users received this treatment as intended. 11 Reiley Corrected WDT, at 6-7. 12 Reiley Corrected WDT, at 26.


Dr. Reiley finds that the label suppression treatment resulted in [

] of the five treatment groups relative to the control groups.13 For the [ ], Dr. Reiley estimates a reduction in listening hours of [ ] percent for the [ ] treatment group and an increase in listening hours of [ ]

percent for the [ ] treatment group as compared to the control groups.14 Dr. Reiley obtained [

]. Dr. Reiley states that “none of these estimates [

].”15

In this section, I explain why the Label Suppression Experiments result in misleading and unreliable estimates of the effect of the removal of a record company from Pandora’s ad-supported radio service. Most users in the Label Suppression Experiments would not have known that [ ] was being suppressed, but in reality the absence of recordings from [ ] would be known to users because of the amount of publicity it

would receive from third-party industry commentators, other users, and competitors. The Label Suppression Experiments do not measure how competitors would react or how those competitive reactions would affect user behavior. The Label Suppression Experiments were also underpowered and achieved only partial suppression of recordings from the tested record companies, and therefore cannot provide precise estimates of the effect of a full loss of a record company’s content. Finally, the Label

Suppression Experiments—which ran for a period of only three months—do not provide any useful guide to the long-run effects of Pandora’s loss of content from [

13 Reiley Corrected WDT, at 11-12. 14 Reiley Corrected WDT, at 11-12. On its face, the fact that [ ] calls into question the validity and reliability of the Label Suppression Experiments. Dr. Reiley testified that [

] SoundExchange Ex. 231, Deposition of David H. Reiley, Jr., Ph.D., Dec. 16, 2019 (“SoundExchange Ex. 231 (Deposition of David Reiley)”), at 125. As discussed below in Section III.A.3, this result is consistent with evidence that many listeners in the treatment group experienced few suppressed recordings and were likely unaware of the suppression treatment. 15 Reiley Corrected WDT, at 11-12.


]. These and other errors render Dr. Reiley’s analysis and conclusions not useful for the purpose of measuring the effects of the removal of a record company from Pandora’s ad-supported noninteractive service.

A. A field experiment that does not inform subjects about the treatment is unrealistic and cannot measure the true effect of a treatment that would be known to users

1. A field experiment needs to measure the “treatment” of interest

A field experiment, often referred to as a randomized controlled trial (“RCT”), is a controlled study in which a firm or organization randomizes whether or not a user receives a specific

“treatment.”16 The random assignment of participants to a treatment group that receives the treatment of interest and a control group that does not receive the treatment ensures that no unobservable characteristics of the participants are reflected in the assignment, and therefore that any difference between treatment and control groups reflects the effect of the treatment

itself.17 Done correctly, this allows the researcher to address causal questions, such as the

expected effect of taking a specific course of action. Reflecting the value of field experiments in digital economics, I have conducted and analyzed many field experiments during the course

of my research.18 I co-authored the chapter on how to conduct field experiments in marketing

16 See, e.g., Abhijit Banerjee & Esther Duflo, An Introduction to the ‘Handbook of Field Experiments,’ (Aug. 2016), available at https://www.povertyactionlab.org/sites/default/files/documents/handbook_intro.pdf; Glenn W. Harrison, Cautionary Notes on the Use of Field Experiments to Address Policy Issues, 30 Oxford Review of Economic Policy 753-763, 753 (2014); see also, Reiley Corrected WDT, at 5. 17 Abhijit Banerjee & Esther Duflo, An Introduction to the ‘Handbook of Field Experiments,’ at 1 (Aug. 2016), available at https://www.povertyactionlab.org/sites/default/files/documents/handbook_intro.pdf. 18 See, e.g., Catherine Tucker & Juanjuan Zhang, Growing Two-sided Networks by Advertising the User Base: A Field Experiment, 29 Marketing Science 805-814 (Sept.-Oct. 2010); Catherine Tucker & Juanjuan Zhang, How Does Popularity Information Affect Choices? A Field Experiment, 57 Management Science 828-842 (May 2011); Catherine Tucker & Anja Lambrecht, When Does Retargeting Work? Information Specificity in Online Advertising, 50 Journal of Marketing Research561-576 (Oct. 2013); Catherine Tucker, Social Networks, Personalized Advertising, and Privacy Controls, 51 Journal of Marketing Research 546-562 (Oct. 2014); Anja Lambrecht and Catherine Tucker, Paying with Money or with Effort: Pricing When Customers Anticipate Hassle, 49 Journal of Marketing Research 66-82 (2012); Catherine Tucker, The Reach and Persuasiveness of Viral Video Ads, 34 Marketing Science 281-296 (Mar. 2015); Christian Catalini and Catherine Tucker, When Early Adopters Don’t Adopt, 357 Science, 135-136 (July 2017).


for the Handbook of Marketing Analytics.19 I am also a regular keynote speaker at the

Conference on Digital Experimentation.20
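
To make the logic of randomization concrete, the comparison underlying any such experiment can be written in standard potential-outcomes notation (the notation below is illustrative only and does not appear in Dr. Reiley’s testimony):

\[
\widehat{\mathrm{ATE}} \;=\; \bar{Y}_{\mathrm{treatment}} - \bar{Y}_{\mathrm{control}},
\qquad
\mathbb{E}\big[\widehat{\mathrm{ATE}}\big] \;=\; \mathbb{E}\big[Y(1) - Y(0)\big],
\]

where \(Y(1)\) and \(Y(0)\) denote a given user’s outcome (for example, listening hours) with and without the treatment. The second equality holds only because random assignment makes treatment status independent of users’ characteristics, and the difference in group means is informative only about whatever treatment was actually administered. This is why the definition of the treatment itself is so consequential.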

However, while there are many potential benefits of field experiments, there are also pitfalls that must be avoided. Any properly designed field experiment can provide an estimate of something, but those results may not be relevant for the policy question or hypothesis of interest. As stated by researchers at Princeton,

[W]hether, and in what ways, an RCT result is evidence depends on exactly what the hypothesis is for which the result is supposed to be evidence, and that what kinds of hypotheses these will be depends on the purposes to be served. This should in turn affect the design of the trial itself.21

A field experiment that provides an estimate of the wrong thing is uninformative for the researcher’s question or, worse, may be misleading if the researcher assumes they are measuring the effect of interest. Any ad hoc attempts to adjust the estimates of the poorly

designed field experiment to approximate the variable that should have been measured in the first place will likely result in unreliable and/or biased estimates of what the researcher or firm

was trying to measure. [ ], Dr. Reiley testified that he [

]22

19 Anja Lambrecht & Catherine Tucker, Field Experiments in Handbook of Marketing Analytics (Natalie Mizik & Dominique Hanssens eds., Edward Elgar Publishing 2018). 20 “2019 Conference on Digital Experimentation (CODE),” MIT Initiative on the Digital Economy (Nov. 1-2, 2019), http://ide.mit.edu/events/2019-conference-digital-experimentation-code; “Conference on Digital Experimentation (CODE),” MIT Initiative on the Digital Economy (Oct. 14-15, 2016), http://ide.mit.edu/events/conference-digital- experimentation-code-0. 21 See, e.g., Angus Deaton & Nancy Cartwright, Understanding and Misunderstanding Randomized Controlled Trials, 210 Social Science & Medicine 2-21, 10 (Aug. 2018) (emphasis in original). 22 SoundExchange Ex. 231 (Deposition of David Reiley), at 45:17-25 ([

]).


In this case, Dr. Reiley and Professor Shapiro explain that the Label Suppression Experiments are meant to capture the effect on Pandora listeners if Pandora’s ad-supported service no longer

offered music from a particular record company.23 The right experiment, therefore, is one that measures the effect of the loss of a record company’s catalog on Pandora’s ad-supported

service.24 A properly designed experiment would administer a treatment that accurately reflects

the characteristics of such a situation in the real world.25 As I explain in Section III.A.2, one

important characteristic of this situation is that Pandora’s loss of content from [ ] would be known to many users in the real world, in part because competitors and third parties would be motivated to publicize this change.

The Label Suppression Experiments, however, do not measure the effect of suppressing recordings from a record company when users are aware of that absence. At best, the Label Suppression Experiments measure only the short-run effect on listening hours of removing certain recordings without informing users that this is happening.

Many researchers have noted the importance of context on participants’ behavior in experiments. Harrison and List (2004) cautioned that “[i]t is not the case that abstract, context- free experiments provide more general findings if the context itself is relevant to the

performance of subjects.”26 Others have shown that information about the treatment can influence participants’ behavior. For example, one study found evidence of systematic

23 Dr. Reiley’s research objective is based on Professor Shapiro’s instruction that the “goal of these experiments is to measure the responses of Pandora listeners if a Pandora advertising-supported statutory service were to lose access to the music of a given record company.” Reiley Corrected WDT, at 26. 24 In economics, we refer to this as measuring a “treatment effect” of interest. This idea of a treatment effect is something that Dr. Reiley has emphasized in his own work. Randall Lewis & David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266 (Sept. 2014). 25 In addition, a properly designed experiment would need to carefully consider potential spillover effects on the Plus and Premium tiers. 26 Glenn Harrison & John List, Field Experiments, 42 Journal of Economic Literature 1009-1055, 1028 (Dec. 2004) (emphasis in original).


differences in behavior of participants in a medical trial depending on the information they

received about the probability of receiving the treatment as opposed to the placebo.27 The concept of “ecological validity” captures the idea that the degree to which an experiment’s

design resembles the real world influences the generalizability of its results.28

2. In reality, the removal of [ ] catalog would be known by many Pandora users

Dr. Reiley chose to make the Label Suppression Experiments “blind” by not informing participants in the treatment group that they would no longer have access to music from the

suppressed record company on Pandora’s ad-supported service.29 In reality, however, the removal of [ ] catalog from Pandora’s service would be made public by Pandora, its competitors, and/or third parties and would be known to many users. Therefore, measuring the effects of removing recordings from [ ] on users’ listening behavior when those users are under the impression they are receiving the same

Pandora ad-supported service and music catalogs they have always received, as Dr. Reiley did, is not useful for measuring the real-world impact of losing access to [ ] catalog. Dr. Reiley agreed that, [

]30

27 Sylvain Chassang, et al., Accounting for Behavior in Treatment Effects: New Applications for Blind Trials, 10 PLoS ONE 1, 4-5 (June 2015). 28 See, e.g., Kathryn Zeiler, Cautions on the Use of Economics Experiments in Law, 166 Journal of Institutional and Theoretical Economics 178-193, 181-82 (Mar. 2010). 29 Reiley Corrected WDT, at 4. 30 SoundExchange Ex. 231 (Deposition of David Reiley), at 162:6-163:4 ([

]).


Consumers are aware of, or can learn, a substantial amount of information about their

streaming service of choice and the breadth of the recordings and music catalogs it offers.31 Many third-party reviews and articles discuss the breadth of music offered by competing streaming music services. For example, Consumer Reports highlights and compares the size of major streaming services’ music libraries in its article advising consumers how to choose

between available streaming options.32 Similarly, a 2019 article from Digital Trends compares

Spotify and Apple Music on several dimensions, including their music libraries. The article attributes Spotify’s success to the size of its catalog, explaining that “Spotify first took its dominant position on the strength of its impressive 30 million-plus song catalog,” but notes that Apple Music now “touts around 45 million songs, which is superior to Spotify’s current 35 million-plus figure, and also outdoes newer contenders like Amazon Prime Music and Jay-

Z’s Tidal.”33 Although Amazon Prime Music is not representative of other streaming services

as it is bundled with Amazon Prime,34 third-party discussions of Amazon Prime Music have

focused on its relatively small catalog of approximately two million recordings, often characterizing its limited selection as a drawback of the service. For example, a June 2019 article by Business Insider describes the “pretty meager” recording selection of Amazon Prime Music as “[t]he most notable difference” between it and Amazon Music Unlimited, warning

that “you may feel let down by Prime Music’s small selection.”35 The article notes that Amazon

31 For example, TIME published an article in 2016 detailing “11 Wildly Popular Albums You Can’t Get on Spotify.” Victor Luckerson, 11 Wildly Popular Albums You Can’t Get on Spotify, TIME (Mar. 29, 2016), https://time.com/4274430/spotify-albums/. 32 Thomas Germain, Best Music Streaming Services, Consumer Reports, Sept. 18, 2019, https://www.consumerreports.org/streaming-media/best-music-streaming-service-for-you/. See also, Ty Pendlebury & Xiomara Blanco, Best music streaming: Spotify, Apple Music and more, compared, CNET (Nov. 24, 2019), https://www.cnet.com/how-to/best-music-streaming-service-of-2019-spotify-pandora-apple-music/. 33 Josh Levenson & Quentyn Kennemer, Apple Music vs. Spotify: Which service is the streaming king?, Digital Trends (Nov. 11, 2019), https://www.digitaltrends.com/music/apple-music-vs-spotify/. 34 Amazon Prime offers members numerous retail and other benefits. 35 Remi Rosmarin, Here are the main differences between Amazon’s two music streaming services, Prime Music and Amazon Music Unlimited, Business Insider (June 26, 2019), https://www.businessinsider.com/prime-music-vs-amazon-music-unlimited.


Music Unlimited, in contrast, offers “50 million songs” and “fills all the holes” in Prime

Music’s library.36

In addition, if Pandora lost access to [ ] catalog of sound recordings, third-party industry commentators and competitors (as described in Section III.B.3) would have incentives to publicly promote this event as it would represent a meaningful difference between Pandora’s service and competing services. Further, artists whose recordings were

removed from Pandora’s catalog would similarly have incentives to publicize this change and encourage consumers to listen to other music streaming services instead.

When a single artist, Taylor Swift, removed her recording catalog from Spotify in November

2014,37 it was covered in at least 260 news stories that week, including on the websites associated with Time, The Guardian, Rolling Stone, The New York Times, The Washington

Post, and others.38 The return of Taylor Swift’s catalog to Spotify on June 9, 2017 was covered

in approximately 215 news stories that week.39

36 Similarly, Professor Shapiro notes that Amazon Prime Music differs from standalone interactive subscription services because it has a much smaller music catalog (less than 10 percent of recordings) and “customers do not expect to find all their favorite artists and recordings on Amazon Prime as they do with a standalone interactive subscription service.” Shapiro Corrected WDT, at 37-38. 37 Hannah Ellis-Petersen, Taylor Swift takes a stand over Spotify music royalties, The Guardian (Nov. 5, 2014), https://www.theguardian.com/music/2014/nov/04/taylor-swift-spotify-streaming-album-sales-snub; Pamela Engel, Taylor Swift Pulls All Of Her Albums From Spotify, Business Insider (Nov. 3, 2014), https://www.businessinsider.com/taylor-swift-pulled-all-of-her-albums-from-spotify-2014-11. 38 I estimated the number of news articles discussing Taylor Swift’s decision to remove her music catalog from Spotify by performing a Factiva search on November 14, 2019 for articles published between November 3, 2014 and November 10, 2014 containing “Taylor Swift + Spotify + (Remove* or Drop*).” See also Charlotte Alter, Taylor Swift Just Removed Her Music From Spotify, TIME (Nov. 3, 2014), https://time.com/3554438/taylor-swift-spotify/; Steve Knopper, Taylor Swift Abruptly Pulls Entire Catalog From Spotify, Rolling Stone (Nov. 3, 2014), https://www.rollingstone.com/music/music-news/taylor-swift-abruptly-pulls-entire-catalog-from-spotify-55523/; Ben Sisario, Taylor Swift Announces World Tour and Pulls Her Music From Spotify, N.Y. Times (Nov. 3, 2014), https://artsbeat.blogs.nytimes.com/2014/11/03/taylor-swift-announces-world-tour-and-pulls-her-music-from-spotify/; Cecilia Kang, Taylor Swift has taken all her music off Spotify, Wash. Post (Nov. 3, 2014), https://www.washingtonpost.com/news/business/wp/2014/11/03/taylor-swift-has-taken-all-her-music-off-spotify/. 39 Factiva search on November 14, 2019 for articles published between June 9, 2017 and June 16, 2017 containing “Taylor Swift + Spotify.”


Similarly, Adele’s choice not to release her “25” album on streaming services such as Spotify and Apple Music in November 2015 was widely reported, appearing in over 480 news articles across websites associated with major news publications such as The Guardian, The New York

Times, CNN, and The Wall Street Journal in the week following the announcement.40 The availability of Adele’s “25” album on Pandora’s noninteractive streaming service was also covered by several news outlets, with some noting that Pandora’s ability to offer the new album

while it was absent from competing services increased the company’s stock price by up to five

percent.41

If Pandora stopped playing music from [ ] or artist, similar publicity would result. This publicity would increase consumer awareness of Pandora’s loss of access to [ ] music catalog, which would influence consumer behavior in ways that cannot be measured by the Label Suppression Experiments.

Further, publicity of Pandora’s diminished catalog is likely to be amplified by listeners’ discussions on social media, further contributing to users’ awareness of the removal of a record company’s catalog from Pandora’s ad-supported service and influencing their listening behavior. Several research studies have demonstrated that social media and peer influence can have a substantial impact on user behavior towards music. Studies have shown that online

40 Factiva search on November 14, 2019 for articles published between November 19, 2015 and November 26, 2015 containing “Adele + (Spotify or Apple).” See also Nigel Smith, Adele’s new album 25 will not be streamed on Spotify, The Guardian (Nov. 19, 2015), https://www.theguardian.com/music/2015/nov/19/adele-new-album-25-not-stream-spotify-apple-music; Ben Sisario, Adele Is Said to Reject Streaming for ‘25’, N.Y. Times (Nov. 19, 2015), https://www.nytimes.com/2015/11/20/business/media/adele-music-album-25.html; Frank Pallotta & Brian Stelter, Adele won’t allow ‘25’ album to be streamed, CNN Business (Nov. 19, 2015), https://money.cnn.com/2015/11/19/media/adele-streaming/; Mike Ayers, Adele’s ‘25’ Won’t Be Available on Spotify or Apple Music, Wall Street Journal (Nov. 19, 2015), https://blogs.wsj.com/speakeasy/2015/11/19/adeles-25-wont-be-available-on-spotify-or-apple-music/. 41 Kia Kokalitcheva, Thanks to Adele, Pandora Says ‘Hello’ to a Stock Price Bump, Fortune (Nov. 25, 2015), https://fortune.com/2015/11/25/adele-pandora/; Kristen Scholer, Adele Says Hello to Pandora, Wall Street Journal (Nov. 25, 2015), https://blogs.wsj.com/moneybeat/2015/11/25/pandora-up-as-adele-says-hello-to-the-streaming-service-bye-to-others/. I understand that because Pandora was operating under the statutory license at the time, Adele had no choice but to make “25” available on Pandora.


music listening is socially driven and is influenced by the opinions of the community as a

whole, as well as the opinion of the user’s immediate social network friends.42 A study of users of a music-listening social network, Last.fm, found that the opinions and behavior of peers substantially influence listening behavior and upgrade decisions, with “peer influence caus[ing] more than a 60% increase in odds of buying the service due to the influence coming

from an adopting friend.”43 Together, this research suggests that social media commentary can

influence listeners’ behavior in response to Pandora’s loss of content by, for example, making them more likely to reduce their listening time on Pandora or switch away from Pandora altogether.

[

42 Sanjeev Dewan, Yi-Jen Ho, & Jui Ramaprasad, Popularity or Proximity: Characterizing the Nature of Social Influence in an Online Music Community, 28 Information Systems Research 117-136 (Mar. 2017); see also Gal Oestreicher-Singer & Lior Zalmanson, Content or Community? A Digital Business Strategy for Content Providers in the Social Age, 37 Management Information Systems Quarterly 591-616 (June 2013). 43 Ravi Bapna & Akhmed Umyarov, Do Your Online Friends Make You Pay? A Randomized Field Experiment on Peer Influence in Online Social Networks, 61 Management Science 1902-1920, 1902, 1904 (Aug. 2015). 44 SoundExchange Ex. 210, [ ], PANWEBV_00005093, at 00005105; see also SoundExchange Ex. 206, [ ], PANWEBV_00005332, at 00005338 [ ]), 800005339 ([ ]; SoundExchange Ex. 205, [ ], PANWEBV_00004571, at 00004605 [

]; SoundExchange Ex. 207, [ ], SXMWEBV_00004833, at 00004863 (stating that on Pandora, [ ]); SoundExchange Ex. 208, [ ], PANWEBV_00006711, at 00006727 [ ]; SoundExchange Ex. 209, [ ], PANWEBV_00006865, at 00006889 [ ].


] This further suggests that listeners’ awareness of the missing content in the real world would likely affect their behavior.

Indeed, when a treatment or experiment is actually publicized, there are often insights that would not have been available while users were blind to the experiment. For example,

Amazon’s random consumer price experiment conducted in September 2000, during which it charged consumers different prices for the same DVD, caused outrage when it was made

public.46 Similarly, consumers responded negatively to Netflix’s pricing tests in Australia in May 2017 under the mistaken belief that the company was experimenting with “surge pricing”

during popular usage times, such as weekends.47 In both cases, the experiments themselves would not have captured the outrage that customers felt in response to the pricing adjustments if the field tests had been conducted in a manner such that the users did not find out.

In sum, because of their “blind” design, Dr. Reiley’s Label Suppression Experiments cannot capture any of these effects specifically associated with listeners’ awareness of the removal of recordings from [ ] from Pandora’s ad-supported service. As such, Dr. Reiley’s Label Suppression Experiments provide an inaccurate and downward biased estimate of even the short run effect of losing access to [ ] content.

45 SoundExchange Ex. 206, [ ], PANWEBV_00005332, at 00005340. 46 Wendy Melillo, Amazon Price Test Nets Privacy Outcry, AdWeek (Oct. 2, 2000), https://www.adweek.com/brand-marketing/amazon-price-test-nets-privacy-outcry-30060/; Bezos calls Amazon experiment ‘a mistake’, BizJournals (Sept. 28, 2000), https://www.bizjournals.com/seattle/stories/2000/09/25/daily21.html; Linda Rosencrance, Testing reveals varied DVD prices on Amazon, CNN (Sept. 7, 2000), https://www.cnn.com/2000/TECH/computing/09/07/amazon.dvd.discrepancy.idg/hs~index.html. 47 Claire Reilly, Netflix quietly tests price hikes in Australia, CNET (May 14, 2017), https://www.cnet.com/news/netflix-quietly-tests-weekend-price-increases-australia/; Joanna Robinson, Now That You’re Hooked, Netflix Is Looking to Raise Its Prices Again, Vanity Fair (May 16, 2017), https://www.vanityfair.com/hollywood/2017/05/netflix-raising-prices-weekend-surge-pricing.


3. Many of the users in the Label Suppression Experiments were unlikely to be aware of the removal of recordings from the suppressed record company

[

].48 [

].49 Indeed, many aspects of the experiments obscured the label suppression treatment, making it less likely that users in the treatment group would become

aware of the removal of recordings from the suppressed record company.50 First, public information about Pandora’s music library revealed no change in its music catalog during the experiment. Second, many users’ listening patterns suggest that many users in the treatment group were unlikely to have many recordings suppressed during the experimental period. Third, many users in the treatment groups were exposed to recordings from the suppressed

record company either through errors in the implementation of the experiment, Premium Access, subscription tiers, or “miscellaneous provider tracks” in Pandora’s system. Dr. Reiley does not provide any evidence that listeners might have independently ascertained that certain recordings were suppressed. Even if they identified or suspected a pattern, they may have misattributed this to Pandora’s song selection algorithm or other factors rather than label

suppression. As such, the Label Suppression Experiments do not provide a useful predictor of

48 SoundExchange Ex. 231 (Deposition of David Reiley), at 109:24-110:5 ([

]). 49 SoundExchange Ex. 231 (Deposition of David Reiley), at 160:18-25 ([

50 Professor Shapiro also recognizes that users would not have been aware of losing access to certain material and that this lack of knowledge could affect the results, noting: “in the Label Suppression Experiments, listeners were presumably not aware of the blackout, and they might react more strongly if they were aware.” Shapiro Corrected WDT, at 19.


what would happen in reality if recordings from [ ] were removed from Pandora’s ad-supported service.

a. Public information suggested that Pandora was supplying tracks from all record companies

As Pandora’s ad-supported service is noninteractive in nature, there is always some degree of uncertainty around which track will be played next. Pandora’s ad-supported service relies on

algorithms to select recordings for each user based on her listening habits and expressed preferences. This element of unpredictability inherently obscures the label suppression treatment and makes it more difficult for users to detect the absence of the missing record company’s catalog over a short period of time.

Moreover, beyond the many reasons that users were likely to believe that Pandora was still offering sound recordings from [ ], even if some listeners who received the label suppression treatment noticed a difference in the recordings played on their

Pandora stations, surmised that the difference might reflect a modification in Pandora’s service or music catalog, and attempted to investigate this difference, by Dr. Reiley’s design, they would have found no mention of any change in Pandora’s music catalog on its website or any other public sources during the experiment. Articles published during this period suggested

that Pandora’s music library continued to offer “over 40 million songs.”51 For example, a July 2019 Digital Trends article noted that Spotify’s and Pandora’s “libraries are very comparable,

and there aren’t any notable artists who appear on one service and not the other.”52

Similarly, as only a small subset of Pandora listeners were in the treatment groups, it is unlikely that friends and family members of users in the treatment groups also received the label

51 Nelson Aguilar, Spotify vs. Apple vs. Pandora vs. Tidal vs. Deezer vs. Amazon, Gadget Hacks (July 19, 2019), https://web.archive.org/web/20190721232028/https://smartphones.gadgethacks.com/news/best-music-streaming-services-spotify-vs-apple-vs-pandora-vs-tidal-vs-deezer-vs-amazon-0199737/. 52 Ryan Waniata & Parker Hall, Spotify vs. Pandora: Which music streaming service is better for you?, Digital Trends (July 25, 2019), https://www.digitaltrends.com/music/spotify-vs-pandora/.


suppression treatment. As a result, treatment users who noticed a reduction in variety or the absence of certain artists on Pandora’s service and attempted to confirm these suspicions by consulting other users likely would have found that their friends and family members experienced no change during the experiment.

Therefore, even if listeners detected some change during the experimental period and attempted to search for information about Pandora’s music library online or from friends or

family members, they would have been led to believe that Pandora still offered a full catalog of music, including recordings from the suppressed record company. This may have prevented users from becoming aware of the label suppression or modifying their behavior in response to the label suppression during the experiment. In contrast, in the real world, discussion in third-party reporting and friends comparing experiences would reinforce users’ perceptions of the missing content.

b. Many of the users in the experiment spent very little time listening to Pandora’s ad-supported service and were unlikely to have many recordings suppressed in the short timeframe of the experiment

Given the listening patterns of users in the treatment group, the average user likely would not have been aware that a record company’s catalog was being suppressed. [

53 Dr. Reiley testified that the Label Suppression Experiments were started partway through the day on June 3, but emphasized June 4 as the first full day of treatment when reporting the results. I refer to the experimental period as the 89 days between June 4 (the first full day of treatment) and August 31, 2019. Reiley Corrected WDT, at 9. 54 Tucker WRT, Appendix 1.


] It is not surprising that users were not able to identify (without being told) that the service they were listening to had suppressed less than one track per day on

average during the experimental period.
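As a purely illustrative back-of-the-envelope calculation (the catalog share and listening figures above are redacted, so the inputs below are hypothetical and are not the actual values), the expected number of suppressed tracks a light listener would encounter is simply the suppressed record company's share of spins multiplied by the tracks heard per day:

    # Hypothetical inputs -- not the redacted figures from the experiments.
    label_share_of_spins = 0.05     # assumed share of spins from the suppressed record company
    tracks_per_day = 12             # assumed tracks heard per day by a light listener
    experiment_days = 89            # June 4 through August 31, 2019

    suppressed_per_day = label_share_of_spins * tracks_per_day
    suppressed_total = suppressed_per_day * experiment_days

    print(round(suppressed_per_day, 2))   # 0.6 tracks per day under these assumptions
    print(round(suppressed_total, 1))     # roughly 53 tracks over the experimental period

Under assumptions of this order of magnitude, a typical treated listener would miss well under one track per day, consistent with the observation above that the suppression would be difficult to notice without being told.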

However, as discussed above in Section III.A.2, many of these users would become aware of Pandora’s loss of content from [ ] in reality, and would respond differently with this awareness.

c. Many of the users in the treatment groups in the experiment were exposed to recordings from the suppressed record company, further obscuring the treatment

In addition, Dr. Reiley identifies a number of ways in which users in the experiments did not receive the treatment as intended and still had access to recordings from the suppressed record company’s catalog. There were at least four main ways in which users in the treatment groups could have heard recordings from the suppressed record company during the experimental

period: (1) through recordings attributed to “miscellaneous providers” rather than the

55 Tucker WRT, Appendix 2. 56 Shapiro Corrected WDT, at 26, 30. 57 [ ]. These numbers reflect account-level information. To the extent that multiple individual users share the same account, this could mean that individuals listen to [ ] and would be even less likely to notice the label suppression treatment over the three-month experimental period. 58 For example, [ ] accounts for approximately [ ] percent of plays on Pandora, implying that, on average, [ ]. Shapiro Corrected WDT, at 26, 30.


suppressed record company; (2) through Premium Access sessions; (3) through paid subscription tiers; and (4) through errors in the implementation of the experiment.

First, Dr. Reiley explains that there are a number of legacy “miscellaneous provider” tracks that are “not yet tied to our current direct license agreements [but] continue to be spun on the

Pandora service because of the long history of user data associated with those tracks.”59 Dr. Reiley acknowledges that some of these “miscellaneous provider” tracks “might actually be

tracks from the record company we were attempting to suppress—resulting in something less

than a full 100% suppression.”60 Dr. Reiley observes that the “‘miscellaneous provider’ legacy spin share in the [ ] treatment group approximately doubled, from about [ ] in the

control group to approximately [ ] in the treatment group”61 and notes that these spins included some recordings from the suppressed company:

In the [ ] treatment group, our analysis identified six out of the 60 “miscellaneous provider” tracks examined (10%) that appear to be covered by our [ ] license (and thus should have been suppressed). In the [ ] suppression group, they identified nine out of 60 tracks (15%) that appear to be covered by our [ ] license.62

Dr. Reiley considers this magnitude to be unimportant because only a small proportion of spins ([ ]) should have been suppressed but were not

because of this issue.63

However, Dr. Reiley fails to recognize the fact that miscellaneous provider tracks could have played a disproportionate role in preventing those in the treatment group from realizing that anything had changed. If Pandora were still delivering miscellaneous provider tracks from

59 Reiley Corrected WDT, at 17. 60 Reiley Corrected WDT, at 18. 61 Reiley Corrected WDT, at 18. 62 Reiley Corrected WDT, at 19. 63 Reiley Corrected WDT, at 19.


Adele on a user’s “Adele” channel,64 and furthermore, if they were delivering the type of Adele

tracks that had elicited the most user response,65 such as receiving “thumbs up,” then it would have been less apparent to an Adele fan that Pandora was no longer delivering other recordings by Adele. Even playing one track from a suppressed record company, particularly if it was of the type that had elicited the most user response, could prevent a user from concluding that they no longer had access to the suppressed record company. According to the data produced

by Dr. Reiley, [ ] in the [ ] treatment groups

received spins from miscellaneous providers during the experimental period.66

Second, users may have been exposed to recordings from the suppressed record company through Premium Access listening sessions, during which they could gain access to on-demand functionality for a limited period in exchange for viewing a video advertisement. Dr. Reiley describes that “[b]ecause interactive plays in the ‘Premium Access’ sessions fall outside the statutory license, we did not suppress music played in that feature or in the tracks that ‘auto-

play’ in a Premium Access session after an interactive spin.”67 While Dr. Reiley asserts that continued access to the suppressed record company via Premium Access listening sessions “should not have a significant impact on the results of the Label Suppression Experiments,” he again fails to recognize that these spins could have played a substantial role in preventing users

from detecting a change in the catalog of music available on their Pandora service.68 According to Dr. Reiley’s data, approximately 14 percent of users in the [ ] treatment group and 20 percent of users in the [ ] treatment group received at least one spin from the

64 For the purposes of this example, I am assuming that all Adele tracks are owned by the same record company. 65 Dr. Reiley describes that the likely explanation for the increase in spins from miscellaneous providers was that its “playlist algorithms, confronted with the inability to play [ ] tracks, had to turn to other tracks that our data suggested the user would enjoy—and the body of ‘miscellaneous’ legacy tracks with the deepest history of usage data were a natural place for our algorithms to look for substitutes.” Reiley Corrected WDT, at 18. 66 Rebuttal Appendix 1. This excludes users who, according to Dr. Reiley’s data, listened to zero tracks on the ad- supported radio service during the experimental period. 67 Reiley Corrected WDT, at 8. 68 Reiley Corrected WDT, at 21.


suppressed record company during Premium Access sessions during the 89-day experimental

period.69

Third, users who upgraded to Plus or Premium subscription tiers during the experiment no longer received the suppression treatment. Despite being analyzed as part of the treatment group, these users received recordings from the suppressed record company as normal. According to Dr. Reiley’s produced data, [

]70

Fourth, many users were exposed to recordings from the suppressed record company due to technical issues with the implementation of the experiment. Dr. Reiley describes that users may have received recordings from the suppressed record company due to various software upgrades on June 13-16, 2019 and June 26, 2019 or due to “routine changes and updates in

ownership information for recordings” on other days during the experimental period.71 As

shown in Rebuttal Appendix 1, [ ] were erroneously exposed to at least one spin from the suppressed record company on Pandora’s ad-supported radio service

while the experiment was running.72
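For concreteness, the following is a minimal sketch of how the share of treatment-group users exposed to at least one spin from the suppressed record company, through any of the four channels described above, might be computed from a spin-level log; the column names and toy data are hypothetical and are not drawn from Dr. Reiley's produced data.

    import pandas as pd

    # Hypothetical spin-level log for treatment-group users.
    spins = pd.DataFrame({
        "user_id": [1, 1, 2, 3, 3, 4],
        "channel": ["radio", "misc_provider", "radio",
                    "radio", "premium_access", "radio"],
        # True if the spin was a recording from the suppressed record company.
        "suppressed_label": [False, True, False, False, True, False],
    })

    # Flag each user who heard at least one spin from the suppressed company.
    exposed_by_user = spins.groupby("user_id")["suppressed_label"].any()

    share_exposed = exposed_by_user.mean()
    print(f"{share_exposed:.0%} of treatment users heard at least one suppressed-label spin")

Each such exposure, regardless of the channel, works against the treatment being delivered as designed and further obscures the suppression from the affected user.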

4. Dr. Reiley’s statements about the merits of “blinded” experiments do not apply in this context

Dr. Reiley argues that conducting experiments in a “blind” manner represents the “best method

for determining the causal impact of the manipulated experience.”73 He states that:

“Blind” means experimental subjects are unaware of their assignment to treatment versus control conditions, so that their

69 Tucker WRT, Appendix 1. 70 Tucker WRT, Appendix 1. 71 Reiley Corrected WDT, at 19-20. 72 Tucker WRT, Appendix 1. 73 Reiley Corrected WDT, at 3.


behavior is influenced only by the different policies adopted by Pandora for each group in the experiment, and not by any communication about the experiment itself (such as listeners changing their behavior in an attempt to influence the results of the experiment, or because of the perceived unfairness of their receiving a different treatment from other customers). We prefer, for scientific reasons, not to bring attention to the fact that consumers are receiving different experiences from each other, so in general they are not aware that they are part of an experiment.74

However, these reasons for making the experiment “blind” underscore why the experiment did not measure the correct effect here. In randomized field experiments, researchers sometimes worry that subjects may change their behavior simply because they are aware that they are participating in an experiment, particularly if they are aware of the purpose of the experiment

and what the researchers are looking for, rather than because of the treatment itself.75 Researchers refer to these effects as “Hawthorne effects,” named after a study of different ways of improving productivity at a factory in which the researcher noticed that productivity increased in both the control and treatment conditions simply because the workers knew they

were being studied.76 As a result, researchers sometimes prefer to keep participants “blind” to the fact that they are part of an experiment and the purpose of that experiment.

In this case, however, Dr. Reiley’s concerns about Hawthorne effects are misplaced and do not justify his choice not to inform users about the label suppression treatment. As Dr. Reiley explained in deposition, [

74 Reiley Corrected WDT, at 4. 75 Glenn Harrison & John List, Field Experiments, 42 Journal of Economic Literature 1034 (Dec. 2004). 76 Glenn Harrison & John List, Field Experiments, 42 Journal of Economic Literature 1034 (Dec. 2004).


].77 However, recent re-analysis of this data has questioned whether Hawthorne

effects even existed in the original Hawthorne experiments.78

Indeed, researchers such as myself actually emphasize that there is likely to be an empirical distinction between users who are aware of the precise treatment and those who are not. For example, my co-author and I ran a field experiment where we used popularity information to reorder results on a webpage. In some treatment conditions, we made it clear that we had used

popularity information to reorder the results by giving data on the popularity of the links. In another condition, we simply reordered the links by popularity without alerting the users to the fact that we had done so. We found that users had a greater response to the “treatment” of re-

ranking by popularity when they were aware that was what we had done.79 In fact, the effect of knowledge of a treatment is so profound that many field experiments are designed to

measure the effect of knowledge of a treatment rather than the treatment itself.80 For example, several field experiments on tax non-compliance have examined the effect of informing

participants about the probability of audit on compliance behavior without changing the actual

probability of audit.81

The purpose of the Label Suppression Experiments is to estimate the real-world effects if Pandora stopped offering recordings by [ ]—an event which, as established in Section III.A.2 above, would be publicized by various sources and would be

77 SoundExchange Ex. 231 (Deposition of David Reiley), at 113:9-16 ([

]). 78 Steven Levitt & John List, Was There Really a Hawthorne Effect at the Hawthorne Plant? An Analysis of the Original Illumination Experiments, 3 American Economic Journal: Applied Economics 224-238 (Jan. 2011). 79 Catherine Tucker & Juanjuan Zhang, How Does Popularity Information Affect Choices? A Field Experiment, 57 Management Science 828-842 (May 2011). 80 Gordon Burtch, Anindya Ghose, & Sunil Wattal, The Hidden Cost of Accommodating Crowdfunder Privacy Preferences: A Randomized Field Experiment, 61 Management Science 949-962 (May 2015). 81 See, e.g., Luis Castro & Carlos Scartascini, Tax Compliance and Enforcement in the Pampas Evidence From a Field Experiment, 116 Journal of Economic Behavior & Organization 65-82, 66, 69 (2015); see also Joel Slemrod, Marsha Blumenthal, & Charles Christian, Taxpayer Response to an Increased Probability of Audit: Evidence from a Controlled Experiment in Minnesota, 79 Journal of Public Economics 155-483 (2001).


known by many users. Therefore, while there may be reasons to keep users blind to the existence or purpose of the experiment, a properly-designed experiment in this instance must make users aware of the treatment—i.e., the fact that music from a given record company is no longer available—to understand how they would react in reality. Otherwise, Dr. Reiley is simply measuring the effect of an experimental treatment that is meaningfully different from the conditions that users would experience in the real world. To the extent that Dr. Reiley

worried that users would alter their behavior if they knew they were participating in an experiment, he could have made users aware of the absence of recordings from [ ] while keeping them “blind” to the experiment itself and its purpose.

Dr. Reiley acknowledges the possibility that if users had been aware of the experiment, they may have intentionally “chang[ed] their behavior in an attempt to influence the results of the

experiment.”82 This is telling as it suggests that Dr. Reiley expected that participants would have been concerned about the results of the experiment if they understood the nature and

purpose of that experiment. If Dr. Reiley believes that suppression of [ ]

has a “relatively small impact on listening hours” as he claims,83 then his concerns that users would alter their behavior to drive the experiment’s results are unfounded.

Similarly, Dr. Reiley indicates that another reason for keeping the field experiment secret was to avoid the “perceived unfairness” of one group of customers receiving a different treatment

from other customers.84 This suggests that Dr. Reiley believed that users care about having recordings from [ ] and would prefer to have access to those recordings,

such that it could be perceived as “unfair” if one group of users has access and another does not. Dr. Reiley anticipated that if users had known about the treatment, they may have responded

82 Reiley Corrected WDT, at 4. 83 Reiley Corrected WDT, at 16. 84 Reiley Corrected WDT, at 4.


differently than they did in the experiment when they were unaware of the treatment. [

].85

B. The Label Suppression Experiments do not help us understand competitive effects

In general, blinded field experiments are not known by competitors and therefore cannot be used to measure competitive responses to the tested treatment. If Pandora were to stop offering music from [ ], its competitors could respond in a number of ways. For example, competitors would have incentives to publicize the resulting hole in Pandora’s catalog and emphasize that their own service compares favorably on the breadth and depth of its catalog. Competitors may also target existing Pandora users by running advertising campaigns or offering promotional prices emphasizing this gap to entice users to leave Pandora and switch to their own service. In addition, competitors could respond to Pandora’s

diminished catalog by changing their own offerings, introducing new offerings, and/or changing their prices. In deposition, Dr. Reiley [

].86 The Label Suppression Experiments, however, cannot measure the impact of these potential competitive responses as they were deliberately conducted in secret and not visible to competitors.

85 See also SoundExchange Ex. 231 (Deposition of David Reiley), at 162:23-163:4 ([

]). 86 SoundExchange Ex. 231 (Deposition of David Reiley), at 163:5-164:4 ([

]).


1. Field experiments cannot be used to measure competitive effects

A field experiment cannot be used to measure competitive effects, and therefore cannot be used to make market share predictions. Indeed, I highlight this as a major caveat of the use of field experiments in my teaching at MIT. As field experiments are small in scale and difficult to detect, competitors are typically unaware of what is occurring and consequently do not respond. If Pandora were to implement the “experiment” at a commercial scale in the real

world, competitors presumably would find out about it, and might react.87 Competitive

reactions could include pricing changes, launching new promotional campaigns, or even new products.

2. Streaming services are competitive because it is easy for users to switch

Competitive reactions are important because, as I discuss in my own research and teaching,

streaming services have become increasingly competitive.88 Although there may be reasons

why users might refrain from switching between paid streaming services in the short term,89

these reasons are less prominent for ad-supported services. There are also solutions that reduce some of the nonfinancial costs associated with switching, such as by enabling users to transfer

playlists between certain services.90 As a result, users can move readily between services.

Furthermore, it is also easy for a user to have multiple accounts on different services. For example, a user (such as myself) easily might have a Spotify account, a Google Music account,

87 Professor Shapiro recognizes but does not adjust for this deficiency of the Reiley experiments, noting: “the experiments do not account for the strategic responses of Pandora, the record company, and perhaps other industry participants, to the blackout. This factor is ambiguous; I make no additional adjustment based on this factor.” Shapiro Corrected WDT, at 19-20. 88 Catherine Tucker, Network Effects Matter Less Than They Used To, Harvard Business Review (June 22, 2018), https://hbr.org/2018/06/why-network-effects-matter-less-than-they-used-to. 89 Written Direct Testimony of Aaron Harrison, Sept. 22, 2019, at 14-16. 90 There are even apps that allow users to transfer playlists between Pandora and Spotify. “Soundiiz General Features,” Soundiiz, https://soundiiz.com/features; see also Amber Neely, How to transfer playlists from Spotify to Apple Music, Apple Insider (Aug. 18, 2019), https://appleinsider.com/articles/19/08/18/how-to-transfer-playlists-from-spotify-to- apple-music.


an Amazon music account, and a Pandora account. [

].91 In the language of platform economics, we call this behavior “multihoming”— where users have multiple homes on different platforms—and economics has shown that

multihoming on both sides of a platform can lead to more intense competition.92 To understand this concept, it is useful to consider the example of Lyft and Uber. It is easy for passengers to

multihome by downloading both apps to their smartphones. This means that Uber and Lyft have to compete on price to avoid passengers just using the other service exclusively. When users multihome streaming services, those services have to compete for user engagement to

decrease their churn.93

91 See, e.g., SoundExchange Ex. 62, [ ], PANWEBV_00004249, at 00004314 ([ ]); SoundExchange Ex. 395, [ ], PANWEBV_00004469, at 00004499 ([ ]); see also SoundExchange Ex. 58, [ ], PANWEBV_00003357, at 00003407 ([ ]); SoundExchange Ex. 402, [ ], PANWEBV_00004024, at 00004054 ([ ]); SoundExchange Ex. 400, [ ], PANWEBV_00009100, at 00009139 ([

]); SoundExchange Ex. 288, [ ], SXMWEBV_00005444, at 00005447 ([ ]). 92 Jean-Charles Rochet & Jean Tirole, Platform Competition in Two-Sided Markets, 1 Journal of the European Economic Association 990-1029 (June 2003); see also, Marco Iansiti & Karim R. Lakhani, Managing Our Hub Economy, Harvard Business Review (Sept.–Oct. 2017), https://hbr.org/2017/09/managing-our-hub-economy (“If multihoming is common, the market is less likely to tip to a single player, preserving competition and diffusing value capture.”). 93 SoundExchange Ex. 206, [ ], PANWEBV_00005332, at 00005335 ([ ]); SoundExchange Ex. 205, [ ] (Jan. 2019), PANWEBV_00004571, at 00004604 ([ ]); see also SoundExchange Ex. 288, [ ], SXMWEBV_00005444, at 00005447([

]); SoundExchange Ex. 401, [ ], PANWEBV_00009182, at 00009185, 00009213 ([


Competitive pressures can also be seen in the increasing similarity of service offerings across different providers. For example, as I discussed in my Written Direct Testimony, Pandora has recently started to emulate Spotify’s personalized offerings. In September 2018, Pandora introduced “The Drop” on its premium subscription service. This feature, much like Spotify’s

“Release Radar” feature, curates new releases based on the individual user’s listening history.94 Pandora also introduced dozens of personalized playlists tailored to moods, activities, and

favorite genres, similar to Spotify’s popular curated playlists.95

In addition, competitive pressures are also reflected in the prevalence of discounts offered by streaming services to attract users. For example, Spotify has run promotions where it offered

new users three months on its Premium subscription service for $0.99 per month.96 In 2018, Pandora partnered with T-Mobile to offer a promotion where it offered T-Mobile customers a

year-long subscription to Pandora Plus for free.97 Amazon has also offered discounts on its

service.98 Several services, including Spotify, Apple Music, SiriusXM, and Pandora, offer

discounted student and family plans.99

]). 94 Tucker WDT, at 15-17. 95 Tucker WDT, at 16-17. 96 Hilda Scott, “Act Fast: Spotify Premium Just $1 for 3 Months,” Tom’s Guide, May 16, 2019, https://www.tomsguide.com/us/best-spotify-premium-deals,news-30097.html. 97 Pandora teams up with T-Mobile as an Un-carrier partner for unlimited ad-free music, Pandora Blog (Aug. 15, 2018), http://blog.pandora.com/us/pandora-teams-up-with-t-mobile-as-an-un-carrier-partner-for-unlimited-ad-free- music/. 98 See, e.g., Sally Kaplan, Amazon is offering 4 months of access to its music-streaming service for $1 as a Cyber Monday deal — here’s how to take advantage, Business Insider (Dec. 2, 2019), https://www.businessinsider.com/ amazon-music-unlimited-deal. 99 Pandora Premium Student, Pandora (2019), https://www.pandora.com/upgrade/premium/student; Pandora Premium for Families, Pandora (2019), https://www.pandora.com/upgrade/premium/family-plan?TID= PM:PSE:Google&gclid=Cj0KCQiAw4jvBRCJARIsAHYewPP_2y9rQDmMsvF09oDTAUp-fZkm7fU_ ixzim2U6mjrK0ku4n2nx0eYaApywEALw_wcB; Spotify Premium, Spotify, https://www.spotify.com/us/premium/ (last visited January 10, 2020); Apple Music, Apple, https://www.apple.com/apple-music/ (last visited January 10, 2020); I’m already a subscriber. Do I get a discount on additional subscriptions?, SiriusXM (2019), https://listenercare.siriusxm.com/app/answers/detail/a_id/3680/~/im-already-a-subscriber.-do-i-get-a-discount-on- additional-subscriptions%3F.


3. Competitors could respond to a reduction in Pandora’s catalog

As discussed in Section III.A.2 above, if Pandora were to lose access to [ ] music catalog, its competitors would have incentives to publicly promote this event to consumers. This type of competitive reaction is common in the broader content streaming industry. For example, Hulu has advertised that its service offered content that was

unavailable on Netflix, including popular shows such as Seinfeld and 30 Rock.100 Similarly, Spectrum, a cable company, advertises that they are the exclusive broadcaster of Los Angeles

Dodgers games.101

More generally, it is common for competitors to promote each other’s weaknesses, as demonstrated by examples in other industries. For example, Apple and Samsung have highlighted features missing from the other’s smartphones for several years. Just recently, in September 2019, Samsung launched an advertising campaign coinciding with the release of the iPhone 11 “trying to tempt iPhone users to pick up the Galaxy Note 10 by highlighting

camera features the new iPhones don’t have.”102 Similarly, Apple and Microsoft have a long history of advertising each other’s weaknesses in the desktop and laptop space. A recent advertisement by Microsoft compares its Surface laptop to an Apple MacBook, highlighting

that the Microsoft Surface lasts longer, is faster, and has a better touch screen.103

100 Hulu (@hulu), Twitter (Sept. 16, 2019), https://twitter.com/hulu/status/1173724121726738433; Hulu (@hulu), Twitter (Oct. 4, 2017), https://twitter.com/hulu/status/915648154854404097; Hulu (@hulu), Twitter (Oct. 3, 2017), https://twitter.com/hulu/status/915294183363170306; Hulu (@hulu), Twitter (Oct. 3, 2017), https://twitter.com/hulu/ status/915283841098682368. 101 Bill Shaikan, For the sixth year in a row, most Dodgers fans can’t watch their team on television, L.A. Times (Mar. 8, 2019), https://www.latimes.com/sports/dodgers/la-sp-dodgers-20190308-story.html. 102 Michael Potluck, Samsung Galaxy ad uses missing iPhone 11 camera feature as bait to switch, 9to5Mac (Sept. 13, 2019), https://9to5mac.com/2019/09/13/samsung-iphone-11-missing-camera-feature/. 103 Lisa Eadicicco, Microsoft hired a man named Mac Book to star in its latest ad slamming Apple’s laptops, Business Insider (Aug. 1, 2019), https://www.businessinsider.com/microsoft-slams-apple-macbook-laptops-ad-2019-8; see also Seth Stevenson, Mac Attack: Apple’s mean-spirited new ad campaign, Slate (June 19, 2006) https://slate.com/business/2006/06/apple-s-mean-spirited-ad-campaign.html.


In addition, the affected record company and its artists may also be motivated to promote Pandora’s diminished catalog in an effort to encourage consumers to use alternative streaming services that offer their music instead. For example, Taylor Swift appeared in several advertisements promoting Apple Music in 2016 when Apple Music was the only service

streaming her music catalog.104

Negative advertisements from competitors, like those observed in other industries, and publicity from record companies and artists could substantially affect Pandora’s brand and reputation among consumers. Indeed, research has shown that comparative advertising can be

effective in damaging a targeted brand.105 Dr. Reiley’s Label Suppression Experiments cannot capture such effects.

C. Notwithstanding that the Label Suppression Experiments did not measure the treatment of interest, the experiments as conducted do not provide precise estimates of the effect of 100 percent label suppression

1. The Label Suppression Experiments were underpowered

In the case of the Label Suppression Experiments, Dr. Reiley uses a sample size of

approximately 15,000 listeners for each of the treatment groups for [ ].106 According to Dr. Reiley, this sample size was selected “based on the tradeoff between statistical power (more data gives more precise estimates) and the potential business impact of

exposing large groups of listeners to a full suppression [ ].”107 Dr. Reiley calculated, ex ante, that this sample size would give him an “80% probability of detecting a

104 Don Reisinger, Here’s the Latest Taylor Swift Apple Music Ad to Go Viral, Fortune (Apr. 18, 2016), https://fortune.com/2016/04/18/taylor-swift-apple-music/. 105 Simon P. Anderson, et al., Push‐Me Pull‐You: Comparative Advertising in the OTC Analgesics Industry, 47 RAND Journal of Economics 1029-1056 (Nov. 2016). 106 Reiley Corrected WDT, at 9. [ ]. 107 Reiley Corrected WDT, at 9.


statistically significant change in listening hours, relative to the control, if the true change were

4% in the [ ] suppression treatments.”108
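As a rough illustration of this type of ex ante power calculation, the sketch below simulates repeated experiments with 15,000 listeners per group and a true 4 percent reduction in mean listening hours, and reports the fraction of simulations in which a standard two-sample t-test detects the effect at the 5 percent level. The lognormal distribution used for listening hours is an assumption chosen only to mimic a skewed outcome, not a reconstruction of Dr. Reiley's calculation; the resulting power depends heavily on that assumed variance.

    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(1)
    n_per_group = 15_000
    true_change = 0.04        # assumed 4% reduction in mean listening hours
    n_sims = 500

    detections = 0
    for _ in range(n_sims):
        # Hypothetical skewed distribution of listening hours (lognormal).
        control = rng.lognormal(mean=1.0, sigma=1.2, size=n_per_group)
        treated = rng.lognormal(mean=1.0, sigma=1.2, size=n_per_group) * (1 - true_change)
        _, p_value = stats.ttest_ind(treated, control, equal_var=False)
        detections += p_value < 0.05

    print(f"simulated power: {detections / n_sims:.2f}")

Under the particular dispersion assumed here, the simulated power falls below 80 percent, illustrating how sensitive the ex ante calculation is to the variance of listening hours.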

As established above in Section III.A, the Label Suppression Experiments are not useful for measuring the real-world impact of losing access to [ ] catalog because, among other things, the participants were “blind” to the suppression and many users were unlikely to experience many suppressed recordings during the short experimental period.

Given these flaws, the Label Suppression Experiments could have measured, at best, a small effect on listener hours in the short timeframe in which the experiment was conducted. In reality, Pandora’s loss of access to [ ] catalog would have a much larger effect on listening hours.

However, the Label Suppression Experiments were underpowered to detect small effects over the three-month experimental period. Dr. Reiley’s experimental design does not adequately account for the highly skewed distribution of listening patterns in the population, meaning that

a large number of users in the treatment groups were likely to have very few recordings actually suppressed. Further, [

].109
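For reference, a standard approximation, stated here in general form rather than as a reconstruction of Dr. Reiley's power analysis, gives the minimum detectable effect (MDE) of a two-group experiment with n users per group, outcome standard deviation sigma, significance level alpha, and target power 1 - beta as

    \mathrm{MDE} \approx (z_{1-\alpha/2} + z_{1-\beta}) \, \sigma \, \sqrt{2/n}

With alpha = 0.05, power of 80 percent, and n = 15,000 per group, this is approximately 2.8 × sigma × 0.0115, or about 0.032 sigma, so a 4 percent change in mean listening hours is detectable with the targeted power only if the standard deviation of listening hours is no more than roughly 1.2 times its mean. A highly skewed listening distribution, together with the inclusion of many users who received little or no effective suppression, pushes in the opposite direction.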

Dr. Reiley’s previous work has highlighted the difficulty of establishing ex ante an appropriate sample size for a field experiment. In his paper about experiments in online brand advertising, Dr. Reiley and his co-author commented, “[w]e began this research project expecting that an experiment with more than a million customers would give precise statistical results—we now

108 Reiley Corrected WDT, at 9. 109 SoundExchange Ex. 231 (Deposition of David Reiley), at 138:23-24: ([ ]), 141:12-21 ([

]).


think otherwise.”110 Dr. Reiley and his co-authors observed that “an economically significant (i.e., profitable) effect of advertising could easily fail to be statistically significant even in a

clean experiment with hundreds of thousands of observations per treatment.”111

A different study by two of Dr. Reiley’s co-authors found that because clicks on online advertisements are a rare event and individual-level sales are highly volatile, “informative advertising experiments can easily require more than ten million person-weeks” to detect the

effect of advertising.112 The authors found that identifying a five percent return on investment for a campaign would require a sample so large that “the total U.S. population and the

advertiser’s annual advertising budget are binding constraints in most cases.”113

Consistent with this, Dr. Reiley’s published studies on advertising effects required samples many times larger than the sample sizes he selected for the Label Suppression Experiments. For example, to measure consumer sensitivity to audio advertising on Pandora’s service, Dr. Reiley conducted a randomized experiment over 21 months and included approximately 1.8

million Pandora users in each treatment group.114 Similarly, Dr. Reiley conducted an experiment to measure causal effects of online advertising for a major retailer with a treatment

group of approximately 1.3 million individuals.115 These sample sizes are approximately 86 to 2,300 times larger than the sample of approximately 15,000 users for each of the [

110 Randall Lewis & David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266, 238 (Sept. 2014). 111 Randall Lewis & David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266, 238 (Sept. 2014). 112 Randall Lewis & Justin Rao, The Unfavorable Economics of Measuring the Returns to Advertising, 130 Quarterly Journal of Economics 1941-1973, at Abstract, at 1941 (Nov. 2015). 113 Randall Lewis & Justin Rao, The Unfavorable Economics of Measuring the Returns to Advertising, 130 Quarterly Journal of Economics 1941-1973, 1964 (Nov. 2015). 114 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio (Apr. 21, 2018), https://ssrn.com/abstract=3166676, at Abstract, 3-4. 115 Randall Lewis and David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266, 239, 247 (Sept. 2014).


] treatment groups in the Label Suppression Experiments.116 Indeed, Dr. Reiley testified in deposition that [

]117

Further, as noted by Dr. Reiley, the samples used in the Label Suppression Experiments were considerably smaller than those used by Pandora’s witness in the Web IV proceedings, Dr. McBride, in a series of similar “Steering Experiments” that [

].118 Dr. McBride’s treatment groups for [ ] included approximately 5 to 8 percent of Pandora listeners, approximately 250 to 400 times

larger than those used in the Label Suppression Experiments.119 In deposition, Dr. Reiley noted that Dr. McBride’s [

]120 Dr. Reiley also

testified that [

]121

116 Calculated as 1.3 million / 15,000 = 86.67 and 35 million / 15,000 = 2,333.33. 117 SoundExchange Ex. 231 (Deposition of David Reiley), at 36. 118 Written Direct Testimony of Stephan McBride, Oct. 7, 2014, In re Determination of Royalty Rates and Terms for Ephemeral Recording and Digital Performance of Sound Recordings (Web IV) (“McBride Web IV WDT”). 119 Calculated as 0.05 / 0.0002 = 250 and 0.08 / 0.0002 = 400. Reiley Corrected WDT, at 9; McBride Web IV WDT, at 7. 120 SoundExchange Ex. 231, (Deposition of David Reiley), at 124:7-15 ([

]). 121 SoundExchange Ex. 231 (Deposition of David Reiley), at 124:17-125:9 ([


2. The Label Suppression Experiments were deliberately limited in scope because the researchers anticipated that the consequences of a larger experiment could negatively affect Pandora’s business

The Label Suppression Experiments were designed to minimize potential publicity of the removal of recordings from Pandora’s catalog by restricting the sample to approximately

15,000 users for each of the [ ].122 This was done because of “the potential business impact of exposing large groups of listeners to a full

suppression of [ ].”123 As Dr. Reiley described in his deposition, [

]124 In other words, [ ]. This is at odds with Dr. Reiley’s conclusions that “a near-total suppression of spins of any single record company does not lead to large decreases in the

number of listeners or the number of hours they spend listening to Pandora.”125

To the extent that Dr. Reiley argues that he was wary of a potential business impact simply because he was not aware ex ante that the effect of the label suppression treatment would be

]). 122 Reiley Corrected WDT, at 9. 123 Reiley Corrected WDT, at 9. 124 SoundExchange Ex. 231 (Deposition of David Reiley), at 33:20-34:7 ([

]). Dr. Reiley further indicated that [ ] SoundExchange Ex. 231 (Deposition of David Reiley), at 36:19-24 ([

]). 125 Reiley Corrected WDT, at 15.


small, he could have rerun the experiment on a larger sample after observing the effect size if

he believed that these results would hold on a larger scale.126,127
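
For illustration, the same standard formula used in the sketch above can be applied to a pilot’s results to size a follow-up experiment; the figures below, other than the 15,000-user group size, are hypothetical.

    from scipy.stats import norm

    def followup_n_per_group(pilot_effect, pilot_se, pilot_n_per_group, alpha=0.05, power=0.80):
        # Back out the outcome standard deviation implied by the pilot's standard error
        # (for an equal-sized two-group comparison, se ~ sd * sqrt(2 / n)), then apply
        # the standard two-sample sample-size formula at the pilot's estimated effect.
        sd = pilot_se * (pilot_n_per_group / 2) ** 0.5
        z = norm.ppf(1 - alpha / 2) + norm.ppf(power)
        return 2 * (z * sd / pilot_effect) ** 2

    # Hypothetical pilot: a 0.5 percentage-point drop with a 0.6 point standard error,
    # observed with 15,000 users per group.
    print(f"~{followup_n_per_group(0.5, 0.6, 15_000):,.0f} users per group for the follow-up")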

3. The Label Suppression Experiments struggled to attain precision on estimates because of the inclusion of many users who received little or no suppression treatment

Due to the design of Dr. Reiley’s experiment, many of the users in the treatment group were rarely, if ever, exposed to the treatment. As shown in Rebuttal Appendix 1, [

126 Further, Dr. Reiley could have stopped the experiment if the effect was larger than anticipated. For example, a medical experiment studying potential treatments for twin-to-twin transfusion syndrome was stopped earlier than planned because the treatment proved so effective. Marie-Victoire Senat, et al., Endoscopic Laser Surgery versus Serial Amnioreduction for Severe Twin-to-Twin Transfusion Syndrome, 351 New England Journal of Medicine 136- 144 (July 2004). 127 Indeed, Dr. Reiley testified that [

]. SoundExchange Ex. 231, (Deposition of David Reiley), at 130:10-18 ([

]). 128 Tucker WRT, Appendix 1. 129 Tucker WRT, Appendix 1.


130 Tucker WRT, Appendix 1. [

]. Shapiro Corrected WDT, at 26, 30. 131 Tucker WRT, Appendix 1. [

]. Shapiro Corrected WDT, at 26, 30. 132 Tucker WRT, Appendix 1. 133 Tucker WRT, Appendix 1. 134 Tucker WRT, Appendix 1.


135 Tucker WRT, Appendix 1. 136 Tucker WRT, Appendix 1. 137 Tucker WRT, Appendix 1. [

]. 138 SoundExchange Ex. 231 (Deposition of David Reiley), at 67:25-68:10 ([

]). Dr. Reiley also testified that [ ]. SoundExchange Ex. 231 (Deposition of David Reiley), at 64:13-65:9 ([

]).


139 SoundExchange Ex. 231 (Deposition of David Reiley), at 68:11-23 ([

]). As Dr. Reiley testified in deposition, [

]. Like Dr. Reiley, I am unable to observe such errors. To the extent that such errors exist, this would call into question the validity of the results. SoundExchange Ex. 231 (Deposition of David Reiley), at 49:25-50:8 ([

), 58:20-24 ([ ]). 140 Tucker WRT, Appendix 1. 141 As noted in Section III.A, the “treatment” administered by Dr. Reiley is unrealistic and does not reflect important characteristics of Pandora’s loss of content from [ ] in the real world. 142 Tucker WRT, Appendix 1.


]

Overall, Dr. Reiley estimates that approximately [ ] of total spins in the [ ] treatment group were from the suppressed record company as compared to approximately [ ] of spins in the control group, meaning that the Label Suppression Experiments

achieved approximately [ ] suppression rather than the intended 100 percent.144 Dr. Reiley argues that [

]145

However, this claim is unsubstantiated and ignores literature suggesting that people can behave differently at the extremes of a distribution. For example, researchers have found substantial nonlinearities around zero price, such that significantly more participants choose the cheaper

143 Tucker WRT, Appendix 1. [

]. 144 Reiley Corrected WDT, at 22. 145 SoundExchange Ex. 231 (Deposition of David Reiley), at 140:5-16 ([

]).


option when it is offered for free than when it is set at a low positive price.146 Further, research on decision-making under uncertainty has demonstrated that people tend to respond differently to outcomes that are merely probable as compared to those that are certain, and tend to

overweight low probability events.147 Correspondingly, users may respond differently when they receive some recordings from the suppressed record company than when they receive no recordings from the suppressed record company.
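
As a purely illustrative back-of-the-envelope calculation (in Python, with numbers that are not taken from the Label Suppression Experiments), the sketch below shows how incomplete delivery of the suppression dilutes the measured intention-to-treat effect, and how large a rescaling would be needed to recover a full-suppression effect if, contrary to the caveat above, the response were assumed to scale linearly with the delivered dose.

    # Hypothetical numbers only; none comes from the Label Suppression Experiments.
    true_effect_full_suppression = 0.06   # assumed drop in hours under a complete blackout
    achieved_suppression = 0.70           # assumed share of intended suppression actually delivered

    # If response scaled linearly with the delivered dose, the measured
    # intention-to-treat effect would be attenuated toward zero:
    measured_itt = true_effect_full_suppression * achieved_suppression
    print(f"measured ITT effect: {measured_itt:.3f} vs. full-suppression effect: {true_effect_full_suppression:.3f}")

    # Recovering the full-suppression effect then requires rescaling by the achieved
    # suppression rate, and the linearity assumption itself is questionable, since
    # behavior at a complete blackout may differ from behavior at a partial one.
    print(f"rescaled estimate: {measured_itt / achieved_suppression:.3f}")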

4. Unlike other work by Dr. Reiley, Dr. Reiley’s analysis of the Label Suppression Experiments does not use data on the intensity of treatment

Unlike other published work by Dr. Reiley, Dr. Reiley’s analysis of the Label Suppression Experiments does not use data on the actual treatment administered—i.e., the number of recordings suppressed for each user. Dr. Reiley testified in deposition that [

].148 Given that the Label Suppression Experiments attempt to measure the effect of suppressing tracks on user behavior, [

]. While Dr. Reiley said that the Label Suppression Experiments were conducted in accordance with how Pandora regularly conducts experiments, it is clear that this departs from procedures used in Dr. Reiley’s published work, including a study conducted on Pandora users.

146 Kristina Shampanier, Nina Mazar, & Dan Ariely, Zero as a Special Price: The True Value of Free Products, 26 Marketing Science 742-757 (Nov.-Dec. 2007). 147 Daniel Kahneman & Amos Tversky, Prospect Theory: An Analysis of Decision Under Risk, 47 Econometrica 263- 292 (Mar. 1979). 148 SoundExchange Ex. 231 (Deposition of David Reiley), at 155:19-25 ([

]), 198:21-199:8 ([

]).


In 2018, Dr. Reiley and his co-authors published a study of the long-run effect of increasing

ad-load on consumer listening behavior on Pandora’s service (the “Ad-Load Experiments”).149 In these experiments, Dr. Reiley and his co-authors collected data on the number of ads shown to each user (i.e., the “intensity of treatment”). The major focus of the empirical work, and the method by which the authors increased the precision of their estimates, involved estimating a

specification that measured the effect of the intensity of the treatment on listening behavior.150

The authors explained the importance of using this analytical approach, noting that this “guarantees that we exploit only the experimental differences between groups to identify the causal effects, removing all within-group variation that might yield spurious correlation—for example, urban listeners might both receive more ads and listen fewer hours, on average, than

rural listeners, even though this correlation is not at all causal.”151

In another experiment published in 2014, Dr. Reiley and a co-author examined the effects of a

nationwide retailer’s online advertising campaign on Yahoo! on offline sales.152 The authors

collected data on the number of ads from the campaign shown to each user in the treatment group, allowing them to identify the amount of treatment received by each individual. The authors used this information in their empirical analysis to estimate the treatment effect on the treated (i.e., the individuals in the treatment group who were exposed to at least one ad from the relevant campaign) and to examine how the treatment effect varied with the number of ads

seen.153

149 Jason Huang, David Reiley, & Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 1 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 150 Jason Huang, David Reiley, & Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 9-10 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. These estimates are also cited in the abstract and conclusion of the paper. 151 Jason Huang, David Reiley, & Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 8 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 152 Randall Lewis and David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266 (Sept. 2014). 153 Randall Lewis and David Reiley, Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!, 12 Quantitative Marketing and Economics 235-266, 246-248, 260 (Sept. 2014).


In contrast, in the Label Suppression Experiments, the treatment is the number of recordings suppressed. While Dr. Reiley collected data on the number of spins from the suppressed record company that were erroneously delivered to each user, [ ]. As such, he is unable to use the intensity of treatment in his analysis, as he has done in previous studies.
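
To illustrate the kind of intensity-of-treatment adjustment described above, the sketch below (in Python, on simulated data; it is not Dr. Reiley’s or the published papers’ specification) scales an intention-to-treat difference in means by the experimentally induced difference in average intensity, so that only variation from random assignment identifies the per-spin effect.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 10_000
    treated = rng.integers(0, 2, size=n)               # random assignment to suppression
    baseline_hours = rng.lognormal(1.0, 1.0, size=n)   # skewed distribution of listening
    # Intensity of treatment: suppressed spins scale with how much a treated user
    # listens; control users receive no suppression.
    suppressed_spins = treated * rng.poisson(baseline_hours * 2.0)
    # Hypothetical outcome: each suppressed spin reduces listening slightly, plus noise.
    hours = baseline_hours - 0.01 * suppressed_spins + rng.normal(0, 0.5, size=n)

    # Intention-to-treat comparison of group means:
    itt = hours[treated == 1].mean() - hours[treated == 0].mean()
    # Scale by the difference in average intensity between groups (a Wald-style
    # ratio), so only the experimental variation identifies the per-spin effect.
    intensity_gap = suppressed_spins[treated == 1].mean() - suppressed_spins[treated == 0].mean()
    print(f"ITT difference in mean hours: {itt:.3f}")
    print(f"implied effect per suppressed spin: {itt / intensity_gap:.4f}")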

D. The Label Suppression Experiments are not useful for estimating true long-run effects

Dr. Reiley conducted the Label Suppression Experiment over an 89-day experimental period, from June 4 to August 31, 2019. That is insufficient to assess the anticipated real-world effect of losing access to content from [ ] that Dr. Reiley was attempting to measure. The main effect of such a change to the Pandora service likely would be to deter potential future Pandora ad-supported users, and increase the likelihood that existing Pandora ad-supported users would leave the service at some point in the future to switch to a

rival service such as Spotify or Apple Music. This is a long-run process.

That is, in part, because it would likely take time for many current and potential Pandora listeners to learn about the compromised quality of Pandora’s webcasting service. Consumers who had noticed that the service offered less than before may have thought this was just temporary and may have been waiting for things to return to normal, given the short length of the experiment. Consumer learning can lead to substantial differences in the measured effect

of a treatment over time.154

154 See, e.g., Ronny Kohavi, Trustworthy Online Controlled Experiments and the Risks of Uncontrolled Observational Studies, Microsoft, at 15 (2019), https://exp-platform.com/Documents/2019-08%20CausalDiscoveryExPRonnyK.pdf (noting the importance of designing experiments to address “customer lifetime value, not immediate revenue.”). Indeed, the large and significant increase in thumbs-down activity observed in the [ ] treatment group may be a leading indicator of future reduced listening behavior, providing potential evidence that long-run effects are larger than short-run effects. Given Dr. Reiley’s small sample sizes and the short timeframe of the experiment, this type of effect may not have been detectable for the other tested record companies, [ ]. Reiley Corrected WDT, at 14-15. Research suggests that short-term measures of


As such, the effects of Pandora’s loss of content on its business would likely develop over time. These long-run effects would be considered by both parties in a negotiation between a willing buyer and willing seller. Further, because the Web V proceeding will determine the statutory rate for a five-year time period, it is important to consider the potential cost of not having a license over a comparable period.

Because they were conducted for only 89 days, Dr. Reiley’s Label Suppression Experiments cannot measure the long-run effects of Pandora’s loss of content from [ ]. As Dr. Reiley testified in deposition, [

]155 Dr. Reiley explained:

[

].156

Academic research has highlighted the importance of measuring long-run effects and has cautioned against extrapolating observed short-run effects without understanding long-run processes. For example, a 2004 study examined the effect of price promotions on purchasing behavior over a two-year period and found that deeper price discounts increased future purchases by first-time customers (a positive long-run effect) but reduced future purchases by established customers (a negative long-run effect). The authors warned that “[i]f firms focus solely on short-run elasticities, or they fail to distinguish between first-time and established

consumer satisfaction can provide useful proxies for long-term effects of a treatment. Henning Hohnhold, Deirdre O’Brien, & Diane Tang, Focusing on the Long-term: It’s Good for Users and Business (2015), http://static.googleusercontent.com/media/research.google.com/en//pubs/archive/43887.pdf. 155 SoundExchange Ex. 231 (Deposition of David Reiley), at 39:7-11 ([

]). 156 SoundExchange Ex. 231 (Deposition of David Reiley), at 121:5-11.


customers, then prices may be set incorrectly.”157 Another study, conducted by three Google data scientists in 2015 and identified as recommended reading by Dr. Reiley for his “Field Experiments” class at UC Berkeley, similarly warns that “the short-term effect is not always

predictive of the long-term effect.”158

Dr. Reiley’s prior work has also highlighted the importance of directly measuring long-run effects. For example, in his co-written paper on the long-run effect of advertising on listener-

hours (the Ad-Load Experiments), Dr. Reiley emphasized the importance of distinguishing

between long-run and short-run effects and why the measurement of short-run effects cannot be

extrapolated to the long run.159 In that paper, Dr. Reiley noted “how important it is that we ran the experiment for over a year… the treatment effect grows over the course of an entire

157 Eric Anderson & Duncan Simester, Long-Run Effects of Promotion Depth on New Versus Established Customers: Three Field Studies, 23 Marketing Science 4-20, at 1, 5 (Winter 2004). 158 Henning Hohnhold, Deirdre O’Brien, & Diane Tang, Focusing on the Long-term: It’s Good for Users and Business, at 1 (2015), https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/43887.pdf; SoundExchange Ex. 404, David Reiley, Field Experiments (2015), https://docs.google.com/document/d/ 1BDUxgzEk1vWXMiV2nzpZzoa7Hyp8I9LBiYmqA9XtZpo/edit; SoundExchange Ex. 376, David Reiley, Suggestions for Further Reading, W241: Experiments and Causality (2015), https://docs.google.com/document/d/ 1IMsGTHmklhvetfJJfEm9dhoFM7bvb-YOkN_6mAM8kFM/edit#. Similarly, another recommended reading for Dr. Reiley’s course warns that partial equilibrium effects may differ from full equilibrium effects (i.e., when the treatment is rolled out to everyone and individuals have time to adjust their behavior in response). Dr. Reiley described “Sometimes experiments only manage to estimate ‘partial equilibrium’ effects instead of ‘general equilibrium’ effects…That is, sometimes the treatment has one effect when only a few people are being treated (as in an experiment), but when everyone is being treated (as a policy is rolled out to everyone), the total effects are quite different because, for example, market prices change.” David Yanagizawa-Drott & Jakob Svensson, Estimating Impact in Partial vs. General Equilibrium: A Cautionary Tale from a Natural Experiment in Uganda (Aug. 2012), https://epod.cid.harvard.edu/sites/default/files/2018-02/estimating_impact_in_partial_vs._general_equilibrium-_a_ cautionary_tale_from_a_natural_experiment_in_uganda.pdf. More generally, see, e.g., Deepa Chandrasekaran & Gerard J. Tellis, “Chapter 14: A summary and review of new product diffusion models and key findings” in Handbook of Research on New Product Development (Peter Golder and Debanjan Mitra eds., Edward Elgar Publishing, 2018); Andrew Baker & Naveen Naveen, “Chapter 15: Word-of-mouth processes in marketing new products: recent research and future opportunities” in Handbook of Research on New Product Development (Peter Golder and Debanjan Mitra eds., Edward Elgar Publishing, 2018). 159 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 10 (Apr. 21, 2018), https://ssrn.com/abstract=3166676 (“How much does it matter that we conducted a long-run rather than a short-run experiment? To see how our estimates change with longer exposure to the treatment, we run a 2SLS regression for each month of the experiment as if that month were the final one… The estimated effects of a 1% increase in ad load, on hours and days active, respectively, start out at around - 0.02% and -0.025%, slowly increasing to effects of -0.070% and -0.076%. Had we run an experiment for just a month or two, we could have underestimated the true long-run effects by a factor of 3.”).


year, stabilizing for the most part only after 12-15 months of treatment.”160 Dr. Reiley and his co-authors concluded that “[h]ad we run an experiment for just a month or two, we could have

underestimated the true long-run effects by a factor of 3.”161

In fact, Dr. Reiley’s Ad-Load Experiment indicates that a three-month treatment period, with measurement for only 28 days, may not have been able to measure a statistically significant effect. While the results of the Ad-Load Experiment suggest that the long-run effects on

listening hours are large and robust, it is not clear that the experiment would have returned a result that was statistically different from zero if the treatment effect had only been calculated

for the first 28 days of the experiment.162
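
As a rough illustration of the calculation described in the accompanying footnote (in Python, using the approximate magnitudes read from that discussion rather than exact figures from the paper), the sketch below shows that the implied z-statistics straddle the conventional 1.96 threshold.

    from scipy.stats import norm

    # Approximate first-month magnitudes discussed in the footnote: an effect on
    # listening hours below roughly 0.5 percent, with standard errors of roughly
    # 0.22 to 0.24 percent.
    for effect in (0.3, 0.4, 0.5):
        for se in (0.22, 0.24):
            z = effect / se
            p = 2 * (1 - norm.cdf(z))
            print(f"effect={effect}%, se={se}%: z={z:.2f}, two-sided p={p:.3f}")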

Importantly, whereas in his prior study, Dr. Reiley observed that “the true long-run effects” were larger than short-run effects “by a factor of 3,” Dr. Reiley did not make a claim, either in his prior published work or in his testimony in this matter, that this relationship is a general

rule or that it can be generalized to completely different situations.163 In fact, Dr. Reiley testified

at deposition that [

]164 He explained that [

160 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 7 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 161 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 10 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 162 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 7-8 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. Specifically, based on Figure 4, the effect on listening hours in the first month of the treatment appears to be less than 0.5 percent. Table 4 indicates that the standard errors measured during the final month of the treatment were approximately 0.22 to 0.24 percent. Assuming these standard errors reasonably approximate (or understate) the standard errors in the first month of treatment, it is not clear that the effect on listener hours in the first month of treatment is statistically significantly different from zero. 163 Jason Huang, David Reiley, and Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 10 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 164 SoundExchange Ex. 231 (Deposition of David Reiley), at 122:15-23 ([

]).


]165

IV. Professor Shapiro Presents Insufficient Analysis to Conclude that No Label is a “Must-Have”

In his Corrected Written Direct Testimony, Professor Shapiro uses results from the Label Suppression Experiments as an input to a model estimating the royalty rates that noninteractive services would be willing to pay to digitally perform record companies’ sound recordings.

Professor Shapiro uses the [ ] experiment (which shows the largest estimated impact of

the label suppression treatment on listening hours) for [ ].166 Professor Shapiro uses the upper bound of the 95 percent confidence interval from the [ ] experiment (which he claims is to account for the fact that participants “were presumably not aware of the blackout” and for “certain imperfections in the experimental

implementation”) and multiplies by a factor of three (which he claims is to address the short- term nature of the Label Suppression Experiments), to estimate the record company’s opportunity cost of licensing its music to a statutory webcaster, as well as each party’s gains

from trade from a licensing deal.167 These estimates are inputs into Professor Shapiro’s bargaining

model, which he uses to estimate a reasonable royalty rate.168

165 SoundExchange Ex. 231 (Deposition of David Reiley), at 121:20-122:14 ([

]). Dr. Reiley indicated that [ ] This does not appear to be the case, as demonstrated by the examples of other long-run experiments discussed in this section. 166 Shapiro Corrected WDT, at 19, 22. 167 Shapiro Corrected WDT, at 16-20, 27, Appendix F. 168 Shapiro Corrected WDT, at 23-27, 29-30, Appendix F.


Because Professor Shapiro’s analysis of reasonable royalty rates relies heavily on the results of the Label Suppression Experiments, errors in these experiments render Professor Shapiro’s reasonable royalty estimates flawed and unreliable. These errors are compounded by Professor Shapiro’s misuse of the data, unfounded and ad hoc assumptions, and inappropriate extrapolations.

A. Professor Shapiro’s estimates rely heavily on the Label Suppression Experiments

Professor Shapiro uses results from the Label Suppression Experiments to estimate reasonable royalty rates paid by noninteractive services for digital performances of sound recordings.

Professor Shapiro uses a willing buyer, willing seller framework,169 in which, under one of his two approaches, the bargaining approach, he determines the reasonable royalty as the mid- point of (1) Pandora’s willingness to pay to obtain access to a record company’s music and (2)

a record company’s opportunity cost of licensing its music to a statutory webcaster.170 Both elements of that calculation—Pandora’s willingness to pay and the record company’s

opportunity cost—crucially depend on the estimated fraction of plays that Pandora would lose if it cannot play recordings from a particular record company, an input that comes directly out

of the Label Suppression Experiments.171 As Pandora’s estimated lost performances increase, Pandora’s willingness to pay and the record company’s opportunity cost also increase,

ultimately driving up Professor Shapiro’s calculated reasonable royalty.172

Professor Shapiro explained the importance of Pandora’s estimated loss in performances if it lost access to a record company’s catalog (a number that is derived from the Label Suppression Experiments) to determining the opportunity cost:

[A]ll else equal, the opportunity cost to a record company of licensing its music to a statutory webcaster is proportional to the

169 Shapiro Corrected WDT, at 3. 170 Shapiro Corrected WDT, at 23. 171 Shapiro Corrected WDT, at 18-20, 26 and Appendix F, p. 1-3. 172 Shapiro Corrected WDT, at Appendix F, p. 1-3.


share of listening hours that the statutory webcaster would lose if they were not able to play that record company’s music. This economic fact is fundamental for the setting of reasonable per- performance royalty rates using the opportunity cost methodology. To see why, suppose that we calculate the opportunity cost to a given record company of licensing its repertoire to a statutory webcaster as $0.0025 per performance, under the assumption that this record company is “must-have” for the statutory webcaster. Next, suppose that we then learn that in fact the statutory webcaster would lose only 20% of its listener hours (not 100%, as with a “must-have” label) if it were unable to play this record company’s music. Then the true opportunity cost for this record company would be only 20% as large as we had previously estimated, namely $0.0005 per performance, not $0.0025 per performance.173
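
The arithmetic in the passage just quoted can be restated in a few lines; the sketch below (in Python) reproduces the quoted scaling from $0.0025 to $0.0005 per performance and, using a willingness-to-pay figure that is hypothetical and is not Professor Shapiro’s input, shows how a midpoint-style rate would move with the assumed share of hours lost.

    # Proportional scaling of opportunity cost, using the figures in the quoted passage.
    must_have_opportunity_cost = 0.0025     # $ per performance if the label were "must-have"
    share_hours_lost = 0.20                 # 20% of listener hours lost, per the example
    opportunity_cost = must_have_opportunity_cost * share_hours_lost
    print(f"opportunity cost: ${opportunity_cost:.4f} per performance")   # $0.0005

    # Hypothetical midpoint calculation; the willingness-to-pay figure is illustrative
    # only, and both endpoints are assumed to scale with the share of hours lost.
    willingness_to_pay = 0.0035 * share_hours_lost
    print(f"implied midpoint rate: ${(willingness_to_pay + opportunity_cost) / 2:.5f} per performance")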

Ultimately, based on his application of the results on the Label Suppression Experiments, Professor Shapiro assumes that Pandora would lose at most [ ] of its listener hours

if it lost access to [ ].174 Based on this “new evidence,” he concludes that “no individual record company is even close to being ‘must-have’ for Pandora’s

advertising-supported webcasting service.”175

However, because this “new evidence” comes from flawed experiments, Professor Shapiro’s conclusions, which flow directly from the results of these experiments, are also flawed. This implies that Professor Shapiro’s reasonable royalty estimates are both unreliable and likely highly deflated.

B. Professor Shapiro’s ad hoc corrections to the estimates from the Label Suppression Experiments do not result in a conservative application of those estimates

Professor Shapiro appears to be aware of some of the deficiencies in the Label Suppression Experiments and performs some ad hoc corrections to attempt to deal with them. However, the

173 Shapiro Corrected WDT, at 15 (emphasis added). 174 Shapiro Corrected WDT, at 26. 175 Shapiro Corrected WDT, at 12.


ad hoc corrections are arbitrary and do not address the fundamental limitations with the Label Suppression Experiments.

For example, Professor Shapiro notes that one limitation of the Label Suppression Experiments is that listeners may not have been aware of the label suppression treatment:

[L]isteners were presumably not aware of the blackout, and they might react more strongly if they were aware. I account for this factor, and for certain imperfections in the experimental implementation that are discussed in Appendix E, by applying the upper end of the 95% confidence interval from the Label Suppression Experiments to obtain a range of negotiated rates.176

While Professor Shapiro presents the use of the upper bound of a 95 percent confidence interval as a conservative estimate of the real-world impact of Pandora’s failure to reach an agreement to license the catalog of a particular record company, he has not pointed to any evidence to support this claim. Dr. Reiley’s analysis is flawed and generates a biased estimate that likely vastly understates the effect of interest. Professor Shapiro’s ad hoc “correction” is untethered

to any valid procedure to produce reliable field experiment estimates.177 Far from producing a “conservative estimate,” Professor Shapiro produces an arbitrary estimate, which likely continues to understate the true effect of suppression.

Professor Shapiro also appears to be aware that the Label Suppression Experiments do not provide estimates of the long-run effects of label suppression that he requires, and he extrapolates the short-term results of the Label Suppression Experiments by assuming that the

176 Shapiro Corrected WDT, at 19. 177 The Label Suppression Experiments do not provide a useful guide to the effect on listening hours when users are aware of the missing recordings. Dr. Reiley and Professor Shapiro cannot ascertain how many users in the treatment groups, if any, independently realized that their Pandora ad-supported service was no longer playing recordings from the suppressed record company. Professor Shapiro’s use of the upper bound of the 95 percent confidence interval does not correct for the fact that the experiment did not measure the treatment of interest, and therefore his adjusted estimates do not provide insight into the true effect of the correct treatment.


long-term effects are approximately three times larger than the observed short-term effects.178 Professor Shapiro, however, provides no legitimate support for why this relationship, which was obtained from a different experiment involving a different treatment and a different experimental design, is applicable here. Professor Shapiro’s assumption is based on a previous paper co-authored by Dr. Reiley showing that the short-term effects of a treatment on listener

behavior can differ substantially from long-term effects.179 As discussed in Section III.D, Dr.

Reiley did not demonstrate that long-term effects are typically three times greater than short-term effects, or that the speed with which listeners react to label suppression experiments has

any relationship to the speed with which listeners react to advertising experiments.180 Indeed, Dr. Reiley presents no empirical basis to conclude that the relationship between time and treatment effect is linear rather than exponential. As such, Professor Shapiro has no way to know whether his ad hoc adjustment adequately captures the potential increase in effect size for the label suppression treatment over time.
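
To illustrate how sensitive such an extrapolation is to the assumed time path, the sketch below (in Python, with hypothetical adjustment paths that are not estimated from any data in this proceeding) compares the long-run multipliers implied by three different paths, each anchored to the same three-month effect.

    import numpy as np

    months = np.arange(1, 61)   # a five-year horizon, matching the license period at issue

    # Hypothetical paths for how a treatment effect might evolve, each rescaled so
    # that its month-3 value equals the observed short-run effect (normalized to 1).
    linear = months / 3.0
    saturating = (1 - np.exp(-months / 6.0)) / (1 - np.exp(-3 / 6.0))
    compounding = 1.05 ** (months - 3)

    for name, path in (("linear", linear), ("saturating", saturating), ("compounding", compounding)):
        print(f"{name}: month-60 effect is {path[-1]:.1f}x the month-3 effect")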

C. Professor Shapiro ignores the additional effects of losing access to content from [ ] on Pandora’s underlying business model

As I explain in my Written Direct Testimony, the value of services such as Pandora is driven

by the underlying unit economics of each customer.181 In cases when unit economics drive profits, it is important for companies to focus on all of the metrics that affect customer lifetime value, not just short-run changes, such as shifts in users or subscribers, that have an immediate

178 Reiley Corrected WDT, at 24-25; Shapiro Corrected WDT, at 19. 179 Jason Huang, David Reiley, & Nickolai Riabov, Measuring Consumer Sensitivity to Audio Advertising: A Field Experiment on Pandora Internet Radio, at 8, 11 (Apr. 21, 2018), https://ssrn.com/abstract=3166676. 180 [ ]. SoundExchange Ex. 231 (Deposition of David Reiley), at 122:15-23 ([

]). 181 Tucker WDT, at 12-13.


effect on revenues.182 These factors include the cost of acquiring a customer, the likelihood of retaining a customer, and the revenue generated from each customer. However, Professor Shapiro’s use of the Label Suppression Experiments only considers a loss of listening hours. He does not try to calculate the likely effects of Pandora’s failure to reach an agreement with a record company on other key profit drivers.
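
A stylized customer-lifetime-value calculation (in Python, with entirely hypothetical inputs that are not drawn from Pandora’s actual economics) illustrates how the factors listed above combine, and why a given loss can be larger on a lifetime basis than the change in any single metric.

    # Stylized per-customer lifetime value; every input below is hypothetical.
    def lifetime_value(monthly_revenue, monthly_retention, acquisition_cost, monthly_discount=0.01):
        # Discounted expected revenue from a retained customer, minus the cost to acquire them.
        x = monthly_retention / (1 + monthly_discount)
        return monthly_revenue * x / (1 - x) - acquisition_cost

    baseline = lifetime_value(monthly_revenue=2.00, monthly_retention=0.95, acquisition_cost=5.00)
    # A degraded catalog could plausibly lower revenue per user, lower retention,
    # and raise acquisition cost at the same time:
    degraded = lifetime_value(monthly_revenue=1.80, monthly_retention=0.93, acquisition_cost=6.00)
    print(f"baseline LTV: ${baseline:.2f}, degraded LTV: ${degraded:.2f}")
    print(f"lifetime-value decline: {1 - degraded / baseline:.0%}, versus a 10% decline in revenue alone")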

In reality, Pandora’s loss of access to [ ] music catalog would affect

other key metrics in addition to listening hours.183 First, the loss of recordings from [ ] likely would make it more expensive for Pandora to acquire customers, as Pandora would be competing with a degraded service against interactive services that offer a full repertoire of music from [ ]. As discussed above in Section III.A.2, Pandora’s competitors would have incentives to publicize the deficiencies in Pandora’s music catalog to consumers. This publicity may have the greatest effect on potential new customers, who are particularly likely to seek information on the relative strengths and

weaknesses of competing services.184 This implies that Pandora would likely find it harder to attract new users and may need to increase its spending on promotions and incentives to attract new customers.

Second, Pandora’s loss of access to [ ] catalog would likely make it harder for Pandora to retain customers. That is, Pandora would run the risk that, rather than

just reducing the amount of time spent listening to the service, some users may churn and never use the service again. This is especially true if Pandora’s diminished music library incentivizes

182 See, e.g., Ronny Kohavi, Trustworthy Online Controlled Experiments and the Risks of Uncontrolled Observational Studies, Microsoft, at 15 (2019), https://exp-platform.com/Documents/2019-08%20CausalDiscoveryExPRonnyK.pdf (noting the importance of designing experiments to address “customer lifetime value, not immediate revenue.”). 183 Dr. Reiley testified that [ ]. SoundExchange Ex. 231 (Deposition of David Reiley), at 166:8-16 ([

]). 184 Further, new users who have not built up a history of listening data or curated personalized stations over time may be quickly disappointed by the service’s failure to play recordings from the artist used to seed a new station if that artist’s music is associated with the suppressed record company.


customers to sign up for a competing service instead of continuing to use Pandora. As a result, Pandora may react to these circumstances by increasing its spending on promotional campaigns and/or by providing customers with other incentives to continue using its service. Furthermore, this is likely to be a long-run effect that cannot be captured by a 90-day experiment.

Third, Pandora’s loss of access to [ ] catalog may reduce advertisers’ incentives to advertise, and willingness to pay for advertising spots, on Pandora’s now degraded service. Advertisers might prioritize other competing services and some may stop

advertising on Pandora altogether.185 As a result, in addition to reducing listening hours,

Pandora’s diminished music library could also result in lower revenue per listening hour.186 Consistent with this, recently-shuttered music service 8tracks explained in a blog post that the

value of advertising spots on its service fell as its listener base declined.187

Fourth, the loss of access to [ ] content may affect Pandora’s ability to effectively upsell customers of its ad-supported service to its Plus and Premium offerings, as well as to Sirius XM’s offerings. Pandora’s executives have commented on the importance

of “funneling” its ad-supported users to more profitable paid subscriptions,188 emphasizing that ad-supported customers are a main source of subscribers for its Plus and Premium services

and are “virtually free of acquisition cost.”189 Dr. Reiley similarly testified [

]190 In addition, as discussed in my Written Direct

185 The loss of advertisers was one of the problems faced by Myspace as it lost users to rival Facebook. See, e.g., Dominic Rushe, Myspace sold for $35m in spectacular fall from $12bn heyday, The Guardian (June 30, 2011), https://www.theguardian.com/technology/2011/jun/30/myspace-sold-35-million-news. 186 This is measured by Pandora as RPM, or revenue per thousand listening hours. 187 David Porter, To Everything There is a Season, 8tracks Blog (Dec. 26, 2019), https://blog.8tracks.com/. 188 For example, Pandora executives described its plans to “leverage [its] existing audience to attract subscribers” as a central component of its “go-to-market strategy” for its Pandora Premium on-demand subscription tier. Pandora Media Inc. Q4 2016 Earnings Call Transcript, Feb. 9, 2017, at 3. 189 See, e.g., Pandora Media Inc. Q4 2016 Earnings Call Transcript, Feb. 9, 2017, at 3. 190 SoundExchange Ex. 231 (Deposition of David Reiley), at 40:6-15 ([


Testimony, Sirius XM’s acquisition of Pandora was, in large part, motivated by cross-selling and upselling opportunities based, in part, on Pandora’s large number of active ad-supported

listeners.191 A loss in listening hours may translate into a reduction in the spillover gains from

upsell opportunities, and is likely to have cascading effects on Pandora’s business.192

Losing access to [ ] catalog would influence not only Pandora’s ad- supported listener hours, but also Pandora’s unit economics and overall profitability in ways

not captured simply by direct losses of listener hours. The result of these additional effects on Pandora’s business, none of which is addressed by the Label Suppression Experiments or otherwise by Professor Shapiro, would be to decrease Pandora’s ability to attract, retain, and monetize customers and/or increase the costs of servicing them. As a result, not only would the real-life suppression of [ ] decrease listening hours and active listeners on Pandora’s ad-supported service, it also likely would have a substantial further effect on lifetime profit per user, and would threaten the viability of Pandora’s business more

severely than is suggested by just focusing on the effect of reduced listener hours.

Indeed, shifts in these variables (outside of listening hours) tend to produce profound shifts in the value of a business, as demonstrated by share price movements of other web-based companies. For example, Spotify’s share price fell five percent in July 2019 after Spotify reported that monthly active users and Premium subscribers missed expectations by 4.1 percent and 0.5

]). See also, SoundExchange Ex. 399, [ ], PANWEBV_00008435 ([

]. 191 Tucker WDT, at 45-46. 192 The founders of 8tracks identified this as one reason why they decided to shut down, explaining that “the steady decline in our free, ad-supported audience resulted in a smaller base of active listeners that might eventually be converted to 8tracks Plus, our ad-free subscription offering.” David Porter, To Everything There is a Season, 8tracks Blog (Dec. 26, 2019), https://blog.8tracks.com/.


percent, despite exceeding revenue expectations.193 In October 2019, Spotify’s share price

increased 12 percent when Spotify reported higher than expected subscribers and profits.194 Similarly, social network Twitter has experienced several steep declines in its share price driven by its failure to meet expectations of active user growth. For example, in July 2015, Twitter’s share price fell by as much as 13 percent on the day it announced earnings, a reaction that some third-party commentators largely attributed to falling short of forecasted monthly

active user numbers by approximately 1.2 percent.195 Professor Shapiro’s methodology, in contrast, assumes that Pandora’s reduction in revenues if it fails to reach an agreement with [

] is proportional to its loss of performances.196
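
The gap between a proportional-revenue assumption and one that also allows per-hour monetization to deteriorate can be illustrated in a few lines (in Python, with hypothetical percentages):

    # Hypothetical: a loss of listening hours combined with weaker per-hour ad revenue (RPM).
    hours_loss, rpm_loss = 0.05, 0.03
    proportional_assumption = hours_loss                       # revenue assumed to fall only with hours
    combined_loss = 1 - (1 - hours_loss) * (1 - rpm_loss)      # hours and RPM declines compound
    print(f"proportional assumption: {proportional_assumption:.1%}; combined revenue loss: {combined_loss:.1%}")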

D. Professor Shapiro improperly applies the results of the Label Suppression Experiments to estimate a reasonable royalty for subscription webcasters

Further compounding these errors, Professor Shapiro’s opportunity cost analysis for subscription webcasters uses “as a proxy the same percentage loss of listening hours for a

subscription webcasting service that was found in the Label Suppression Experiments

conducted on Pandora’s advertising-supported service.”197 In other words, Professor Shapiro assumes that the effect of the label suppression treatment would be the same for subscribers as for users of the free ad-supported Pandora service. This ignores differences between users of ad-supported services and subscription services that could influence the resulting effect of the

193 Carmen Reinicke, Spotify slips after not adding as many paid subscribers as hoped (SPOT), Markets Insider (July 31, 2019), https://markets.businessinsider.com/news/stocks/spotify-earnings-2q-stock-price-reaction-disappointing- subscriber-growth-2019-7-1028403687 (calculated as 232 / 242 – 1 = -4.1 percent and 108 / 108.5 – 1 = -0.5 percent). 194 Neha Malara & Supantha Mukherjee, Spotify shares surge after surprise profit, rise in paid users, Reuters (Oct. 28, 2019), https://www.reuters.com/article/us-spotify-tech-results/spotify-shares-surge-after-surprise-profit-rise-in-paid- users-idUSKBN1X70X9. 195 Alexei Oreskovic, Twitter shares crash as user growth stalls, Business Insider (Oct. 27, 2015), https://www.businessinsider.com/twitter-earnings-q3-2015-2015-10 (calculated as 320 / 324 – 1 = 1.23 percent). 196 Shapiro Corrected WDT, at Appendix F, at 2-3. 197 Shapiro Corrected WDT, at 27. Professor Shapiro assumes that the percentage loss of subscribers is equal to the percentage loss of listening hours measured by the Label Suppression Experiments. Shapiro Corrected WDT, at 30.


label suppression treatment on listening hours. For example, Dr. Reiley testified that [

]198 Dr. Reiley also noted that [

].199

Indeed, Professor Shapiro acknowledges that “a blackout by a given record company could in principle have a different impact on listener hours for a subscription webcasting service than

for an advertising-supported webcasting service.”200 [

].201 Similarly, as they are paying for the service rather than receiving it for free, subscribers may be more sensitive to changes in Pandora’s music catalog and may be more likely to switch away from

Pandora altogether rather than reducing their listening hours if they become dissatisfied with

the service.202 This is especially the case because there are many interactive services, offering a full catalog of music and a full range of functionality, to which Pandora is vulnerable to losing customers.

198 SoundExchange Ex. 231 (Deposition of David Reiley), at 41:9-19 ([

]). 199 SoundExchange Ex. 231 (Deposition of David Reiley), at 41:20-42:7 ([

]). 200 Shapiro Corrected WDT, at 27. 201 SoundExchange Ex. 210, Pandora September Engagement Update (Oct. 2019), PANWEBV_00005093, at 00005096. 202 Research has shown nonlinearities in behavior with respect to zero prices. Kristina Shampanier, Nina Mazar, Dan Ariely, Zero as a Special Price: The True Value of Free Products, 26 Marketing Science 742-757 (Nov.-Dec. 2007).


E. Ultimately, Professor Shapiro’s application of the results of the Label Suppression Experiments suffers from compounding errors

Ultimately, the reliability of Professor Shapiro’s estimates is compromised by compounding errors. As I discuss, there are large flaws with Dr. Reiley’s Label Suppression Experiments and analysis, which together mean that Dr. Reiley’s estimates do not provide insight into the true effect of the loss of content from [ ] on Pandora’s business. Professor Shapiro acknowledges that listeners “were presumably not aware of the blackout,

and they might react more strongly if they were aware” and notes that “the experiments measure the impact of the blackout for only three months, but the impact over a longer period

of time could well be larger.”203 These flaws all consistently suggest that Dr. Reiley is underestimating the true effect. Professor Shapiro’s arbitrary adjustments, however, do not correct these flaws. Therefore, his analysis does not reflect the true effect of a blackout, Pandora’s willingness to pay, or the record company’s opportunity cost.

Further, as discussed above, not only would the real-life suppression of [ ] decrease listening hours and active listeners on Pandora’s ad-supported service, it also likely would have a substantial effect on the lifetime value of each customer, and would threaten the viability of Pandora as a service more severely than is suggested by just focusing on the effect of reduced listener hours. Professor Shapiro’s analysis does not capture these effects.

V. The Submission by the National Association of Broadcasters (“NAB”) Misses the Importance of Simulcasting to Their Broadcasters

NAB witnesses suggest that simulcasting is a small and unprofitable aspect of their business. For example, Dr. Leonard states that “simulcast is an ‘add-on’ service that would not exist without the terrestrial radio broadcast. Indeed, the terrestrial broadcast (and the revenues

203 Shapiro Corrected WDT, at 19.


derived therefrom)—not the simulcast—are the primary driver of the radio station’s

business.”204

These arguments, however, ignore the role that simulcasting plays in the context of the broadcaster’s overall business. If these arguments were right—i.e., that simulcasting should be viewed as an independent and unprofitable line of business—broadcasters would not simulcast. Because broadcasters continue to simulcast, we know that simulcasting plays a more

complicated role in their overall businesses. As described in my initial report, industry participants are increasingly using digital music services as part of a wider economic

ecosystem.205 As such, digital music offerings can aid and complement other aspects of a broadcaster’s business, providing benefits beyond the direct profits generated from the service itself. For example, iHeartMedia has emphasized how its multi-platform strategy positions the

company to capture additional advertising revenue from digital advertising sectors.206 Similarly, industry analysts have noted that iHeartMedia’s digital platforms improve the

company’s ability to sell broadcast inventory.207

As discussed in my Written Direct Testimony, iHeartMedia has noted the importance of digital to its business in recent earnings releases. iHeartMedia continued to highlight the importance of its digital segment in its November 2019 earnings call discussing its Q3 2019 results. iHeartMedia’s financial results for the quarter suggest continued improvement, and

iHeartMedia executives emphasized that “[d]igital had another strong quarter”208 with digital

revenue up 33.4 percent from Q3 2018.209 [

204 Leonard WDT, at 20. 205 Tucker WDT, at 34-37. 206 Tucker WDT, at 80. 207 Tucker WDT, at 80. 208 iHeartMedia Inc. Q3 2019 Earnings Call Transcript, Nov. 7, 2019, at 3. 209 iHeartMedia, Inc. Reports Results for 2019 Third Quarter, BusinessWire (Nov. 7, 2019), https://www.businesswire.com/news/home/20191107005341/en/iHeartMedia-Reports-Results-2019-Quarter. See also Tucker WRT, Appendix 8.


].211

In addition to helping to retain listeners in the face of emerging digital technologies, simulcasts also affect the advertising side of the terrestrial radio business. [

].213

Consistent with this evidence that simulcasts enhance and complement a radio station’s core business, Mr. Leonard Wheeler, President and Owner of NAB member company Mel Wheeler,

210 SoundExchange Ex. 380, [ ], NAB00002609 ([ ]); SoundExchange Ex. 382, [ ], NAB00002659 ([ ]). 211 SoundExchange Ex. 398, [ ], PANWEBV_00007062, at 00007071. 212 SoundExchange Ex. 321, [ ], NAB00004542, at tab “Digital Rates.” See also, SoundExchange Ex. 389, [ ], NAB00004537. In fact, [ ]. See SoundExchange Ex. 391, NAB00006441, at tab “Summary;” SoundExchange Ex. 375, Declaration of Collin R. Jones, Jan. 7, 2020, at 2-3. 213 SoundExchange Ex. 386, [ ], NAB00004036, at 00004041.


Inc. (“Wheeler”), testified to the strategic importance of offering its radio content digitally.214 As described by Mr. Wheeler, Wheeler “feared that, without at least establishing some streaming presence, [it] would eventually lose listeners as they increasingly sought to listen to

content digitally.”215 Mr. Wheeler indicated that “Wheeler has no choice but to make our stations available digitally, to guard against the possibility that our traditional radio audience begins to tune-in, not from traditional AM/FM radios, but rather from their desktop computers,

cell phones and smart speakers.”216 At his deposition, Mr. Wheeler testified that [

]217 He testified that [

]218 Mr. Wheeler further testified that he simulcasts [

]219 This testimony is consistent with evidence that, even while it may not be a broadcaster’s core business, there is a business

214 Written Direct Testimony of Leonard Wheeler, Sept. 19, 2019 (“Wheeler WDT”). 215 Wheeler WDT, at 9. 216 Wheeler WDT, at 9. 217 Deposition of Leonard Wheeler, Dec. 4, 2019 (“SoundExchange Ex. 230 (Wheeler Deposition)”), at 28:1-7 ([

]), 98:9-12 ([ ]). 218 SoundExchange Ex. 230, (Wheeler Deposition), at 54:22-25. 219 SoundExchange Ex. 230, (Wheeler Deposition), at 56:5-8; see also at 170:5-9 (acknowledging that [

]). Despite this business need, Mr. Wheeler claimed that he was hesitant to embrace simulcasting due to “the exorbitant SoundExchange royalties that we must pay to simulcast.” Wheeler WDT, at 8. To make this point, Mr. Wheeler stated at his deposition that [ ]. SoundExchange Ex. 230 (Wheeler Deposition), at 158:4-24; see also SoundExchange Ex. 230 (Wheeler Deposition), at 26-27, 168, 170, 192, 202, 210. [ ]. SoundExchange Ex. 390, NAB00005547; SoundExchange Ex. 431, [ ]. [

].


need for simulcasting and webcasting services, and these services provide benefits for the broadcaster beyond current direct profit generation.

Mr. Wheeler’s testimony regarding the importance of simulcasting reflects the general trend towards digital that I explained in my initial written direct testimony. This trend is driven by the rise of mobile devices, smart speakers, and connected cars, which in turn reduces the share of listening that occurs in the traditional settings where people have listened to terrestrial radio. Mr. Wheeler’s testimony

confirms that this concern motivates broadcasters, as he believes simulcasting listenership [ ] due to [

]220 The data backs up this belief. Results of a 2018 Jacobs Media Tech Survey show that in 2013, [ ] of radio listening was digital while [ ] was on terrestrial broadcasts, compared to [ ] and [

] in 2018.221 The same 2018 survey also shows that [ ] of smart speaker owners

frequently use their smart speaker to listen to music on AM/FM radio.222 [

]223

Documents produced by NAB in discovery confirm that many listeners are shifting their radio listening to smart speakers and connected cars and thereby are switching from over-the-air broadcasts to simulcasts. [

220 SoundExchange Ex. 230, (Wheeler Deposition), at 171:15-18 ([

]). 221 SoundExchange Ex. 254, Tech Survey 2018 Jacobs Media: Radio Navigates the Digital Revolution (2018), NAB00002238, at 00002250. 222 SoundExchange Ex. 254, Tech Survey 2018 Jacobs Media: Radio Navigates the Digital Revolution (2018), NAB00002238, at 00002266. 223 SoundExchange Ex. 378, The Infinite Dial: 2018, NAB00002166, at 00002178 ([ ]; see also SoundExchange Ex. 61, MusicWatch Annual Music Study 2018: Report to Pandora Media (Apr. 2019), PANWEBV_00004139, at 00004214.


]224 To the radio industry, a smart speaker replacing a bedside or kitchen radio would threaten broadcast stations that did not have simulcasts that could be accessed on that smart speaker. These broadcasters would face increased competition from streaming services and other forms of music that could be accessed on that device. According to a 2017 study done by Edison Research, [

].225 Similarly, Infinite Dial research shows that [

]226

Internal documents produced in this proceeding from iHeartMedia and Pandora clearly show

that radio businesses view streaming services as competitors, and vice versa.227 [

224 SoundExchange Ex. 377, Deloitte Insights: Technology, Media, and Telecommunications Predictions (2019), NAB00001993, at 30-31; at 31 ([ ]); at 68 ([

]). 225 SoundExchange Ex. 387, BIA Advisory Services: Market Assessment and Opportunities for Local Radio: 2018- 2022 (Apr. 2018), NAB00004126, at 00004144. 226 SoundExchange Ex. 387, BIA Advisory Services: Market Assessment and Opportunities for Local Radio: 2018- 2022 (Apr. 2018), NAB00004126, at 00004141; at 00004140 ([

]). See also SoundExchange Ex. 403, [ ], SXMWEBV_00005224, at 00005224 ([ ]). 227 See, e.g., SoundExchange Ex. 385, [ , NAB00002858, at 00002861 ([ ]); SoundExchange Ex. 397, Tech Survey 2019 Jacobs Media: Radio’s Survival Kit (2019), PANWEBV_00006670, at 00006680 ([ ]).


].231

Pandora expert Dr. Waldfogel has generally recognized the importance of digitization in the

music industry, and has written extensively about the digital renaissance in music.232 His own writings suggest that this digital renaissance affects the role and popularity of terrestrial radio as online platforms grow. For example, Dr. Waldfogel has observed that “the past decade has

seen the emergence and growth in alternative institutions, including Internet radio… New

information channels are changing the pathways to commercial success.”233 In addition, Dr.

228 SoundExchange Ex. 388, [ ] NAB00004413, at 00004473. 229 SoundExchange Ex. 381, [ ], NAB00002613, at 00002629, 00002631. See also SoundExchange Ex. 385, [ ], NAB00002858, at 00002882-83, 00002887-93, 00002912, 00002914-15, 00002917-18 ([ ]). 230 SoundExchange Ex. 396, [ ], PANWEBV_00006244, at 00006253-55. 231 SoundExchange Ex. 396, [ ], PANWEBV_00006244, at 00006280. [

], PANWEBV_00006244, at 00006264. 232 Joel Waldfogel, Digital Renaissance: What Data and Economics Tell Us about the Future of Popular Culture (Princeton University Press, 2018). In his written testimony, Dr. Waldfogel highlights the “dramatic gains” in recorded music revenues in the U.S. between 2014 and 2018. Written Direct Testimony of Joel Waldfogel, Sept. 23, 2019, at 5-7. However, Dr. Waldfogel ignores the 15-year decline in the music industry prior to this period and the fact that, despite the recent growth, U.S. recorded music revenues are still substantially lower than the industry’s peak revenues in 1999. Tucker WDT, at Appendix 1. 233 Joel Waldfogel, “Digitization and Quality of New Media Products: The Case of Music” in Economic Analysis of


Waldfogel’s research has found that “a declining share of successful artists have traditional [radio] airplay, while a growing share are covered by online radio and critics,” highlighting the

waning influence of traditional radio.234

Further, Dr. Waldfogel’s published work emphasizes that this digital renaissance in music is

due to the emergence of interactive webcasting services.235 In his report, he also discusses the rise of digital streaming services, but fails to distinguish the role of interactive services from that of noninteractive services in leading this renaissance.236

VI. The Religious Broadcasters’ Arguments for Why They Should Pay Less Are Not Based on Economics

NRBNMLC’s written direct statement focuses on religious broadcasters in general, and on

Family Stations, Inc. (“Family Radio”) specifically.237 However, neither of those appears representative of the larger population of noncommercial webcasters for which the Judges must

set rates in this proceeding. As I described in my written direct testimony, the vast majority of noncommercial webcasters (96 percent at the parent company level) pay only the annual minimum fee, and for 2018, only 20 noncommercial webcasters paid some amount of statutory

royalties on usage in excess of 159,140 aggregate tuning hours (“ATH”) per month.238 The

latter group includes religious broadcasters with significant financial resources.239

the Digital Economy, at 410 (Avi Goldfarb, Shane Greenstein, and Catherine Tucker eds., University of Chicago Press, Apr. 2015). 234 Joel Waldfogel, “Digitization and Quality of New Media Products: The Case of Music” in Economic Analysis of the Digital Economy, at 411 (Avi Goldfarb, Shane Greenstein, and Catherine Tucker eds., University of Chicago Press, Apr. 2015). 235 Joel Waldfogel, Digital Renaissance: What Data and Economics Tell Us about the Future of Popular Culture 203- 04 (Princeton University Press, 2018). 236 Waldfogel WDT, at 5-7, 11-12. 237 See, e.g., Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019. 238 Tucker WDT, at 83-84, Appendix 18. 239 Tucker WDT, at Appendix 16.


In their written direct testimonies on behalf of the NRBNMLC, Dr. Steinberg and Dr. Cordes argue that noncommercial webcasters should pay lower statutory rates than commercial webcasters. Dr. Steinberg argues that statutory rates for noncommercial webcasters should be substantially lower because, among other reasons, noncommercial webcasters rely on donations and therefore are often “struggl[ing] to survive” and cannot afford large and

“unpredictable” royalty payment obligations.240 In addition, Dr. Steinberg and Dr. Cordes

point to evidence that “[f]or-profit firms are often willing to sell their products and services to nonprofit organizations at a substantial discount” as an indication that statutory royalty

obligations should be considerably lower for noncommercial webcasters.241

These arguments, however, ignore evidence that the average per-performance rate paid by noncommercial webcasters is already lower than the rates paid by commercial webcasters, [ ]. In addition, Dr. Cordes and Dr. Steinberg fail to recognize that, far from being large and unpredictable, statutory royalty

payments comprise a small portion of operating costs for many noncommercial webcasters. Dr. Steinberg and Dr. Cordes also ignore evidence that, while small non-profit organizations are given discounts when purchasing other goods and services, those discounts may not be extended proportionally to larger organizations, much like the webcasters that pay excess royalties at the commercial rate. Dr. Steinberg’s argument that per-performance excess royalty payments are too unpredictable to be financed by donations is not grounded in data or economics, and is inconsistent with evidence that excess royalties are both predictable and

controllable. Finally, Dr. Steinberg amended his written direct testimony to add a discussion of SoundExchange’s settlement agreements with College Broadcasters, Inc. (“CBI”) and National Public Radio (“NPR”). Neither provides support for NRBNMLC’s rate proposal.

240 Steinberg Amended WDT, at 8-9, 28; see also Cordes Corrected WDT, at 10. 241 Steinberg Amended WDT, at 20-21; see also Cordes Corrected WDT, at 10-11.


A. Family Radio is not representative of noncommercial webcasters

In the introductory memo to its written direct statement, NRBNMLC pointed to Family Radio as an example of a large, nonprofit webcaster for which “the increase in streaming rates and current onerous license reporting requirements have substantially harmed its ability to reach as

many listeners as it has in the past.”242 Ms. Burkhiser, Family Radio’s Director of Broadcast Regulatory Compliance and Issue Programming, described Family Radio’s recent financial struggles, noting that “Family Radio has experienced severe financial hardship in recent

years—made worse by increased streaming rates—that has forced it to make difficult decisions

to enable it to continue to offer its radio ministry to its listeners.”243 However, Family Radio is not representative of the broader population of noncommercial webcasters, or even other large religious broadcasters. NRBNMLC’s emphasis on Family Radio does not provide a valid basis to draw economic conclusions about noncommercial webcasters in general. In addition, Family Radio’s recent financial struggles appear largely related to unique, strategic programming

decisions, and unrelated to streaming rates.

First, as a large noncommercial religious broadcaster, Family Radio is not representative of noncommercial webcasters in general. In 2018, there were over 900 noncommercial

webcasters (at the statement of account level)244 and close to 500 at the parent company level.245 This is a diverse group of services, including many that do not offer religious programming or

are not FCC-licensed broadcasters.246 These include music-only services and services with an

express purpose of supporting artists.247 Any inferences about an appropriate statutory royalty

242 Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019, at 5-6. 243 Written Direct Testimony of Jennifer D. Burkhiser, Sept. 23, 2019 (“Burkhiser WDT”), at 13. 244 Written Direct Testimony of Jonathan Bender, Sept. 20, 2019 (“Bender WDT”), at 14. 245 Based on SoundExchange-provided data in SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx, at tab “Parent Level Summary.” 246 Written Rebuttal Testimony of Travis Ploeger, Jan. 10, 2020 (“Ploeger WRT”), ¶ 44. 247 Ploeger WRT, at ¶¶ 44-45.


determined by examining the business and finances of Family Radio, a large religious broadcaster, would not necessarily generalize to noncommercial webcasters more broadly.

Second, Family Radio is not representative of other large noncommercial religious broadcasters. Among other things, I understand that Educational Media Foundation (“EMF”) is a participant in this proceeding in its own right and [

].248 [

].251 As I showed in my Written Direct Testimony, the largest noncommercial broadcasters in terms of excess royalties owed in 2018 are well-resourced organizations with millions of

dollars in revenues.252 Since submitting my Written Direct Testimony, I have seen Family Radio’s IRS Form 990 for 2018 and have used this information to update my previous

calculations, as summarized in Rebuttal Appendices 3 and 4. As shown in Rebuttal Appendix

3, Family Radio operated at a loss in 2018.253 In contrast, other large noncommercial religious broadcaster webcasters generated millions of dollars more in revenue than they incurred in

expenses in 2018.254 For example, in 2018, EMF generated a surplus of [ ].255

248 SoundExchange Ex. 394, [ ], NRBNMLC_WEBV_00000823. 249 Calculated as [ ]. SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx, at tab “Pivot.” 250 Calculated as [ ]. SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx, at tab “Pivot;” Bender WDT, at 14. 251 Calculated as [ ]. SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx, at tab “Pivot;” Bender WDT, at 14. 252 Tucker WDT, at Appendix 16. 253 Tucker WRT, Appendix 3. See also, Family Stations, Inc. Form 990 for the year ended December 31, 2018, at 1, 12. 254 Tucker WRT, Appendix 3. 255 Tucker WRT, Appendix 3.


Third, Family Radio’s current situation is a result of unique circumstances that have been well documented in the popular media and include failed doomsday predictions accompanied by expensive advertising campaigns, as well as programming antagonistic to the organized church. In connection with predictions that the world would end on May 21, 2011 or October 21, 2011 (after a previous prediction that the world would end on September 6, 1994), Family Radio engaged in a national advertising campaign possibly costing as much as $100 million,

including advertisements on thousands of billboards, while donations fell sharply.256 These unique circumstances also make Family Radio a poor example for drawing any conclusions about noncommercial webcasters in general.

Fourth, even with Family Radio’s unique circumstances, as I explained in my Written Direct

Testimony, statutory royalties do not appear material to the finances of Family Radio.257 Family Radio’s IRS Form 990 for 2018 shows that Family Radio generated overall revenues of $5,422,789 in 2018 and provides a webcasting service with an average of 150,000 to 200,000

unique monthly streamers.258 In 2018, Family Radio paid only [ ] in statutory

royalties.259 This compares to $5,422,789 in revenues and $7,267,331 in program expenses, meaning that statutory royalties constituted only [ ] of its 2018 revenues and [

256 Rick Paulas, What Happened to Doomsday Prophet Harold Camping After the World Didn’t End?, Vice (Nov. 7, 2014), https://www.vice.com/en_us/article/yvqkwb/life-after-doomsday-456; Bob Smietana, Christian radio group faces financial hard times, U.S.A. Today (May 14, 2013), https://www.usatoday.com/story/news/nation/2013/05/14/family-radio-finances-world-did-not-end/2159621/; End of the line for Christian radio network that predicted 2011 rapture, Denver Post (May 12, 2013), https://www.denverpost.com/2013/05/12/end-of-the-line-for-christian-radio-network-that-predicted-2011-rapture/; Nicola Menzie, Family Radio Founder Harold Camping Repents, Apologizes for False Teachings, Christian Post (Oct. 30, 2011), https://www.christianpost.com/news/family-radio-founder-harold-camping-repents-apologizes-for- false-teachings.html; An insider’s look at Family Radio and its leader Harold Camping, Mercury News (May 20, 2011), https://www.mercurynews.com/2011/05/20/an-insiders-look-at-family-radio-and-its-leader-harold-camping/. 257 Tucker WDT, at Appendix 16. 258 Family Stations, Inc. Form 990 for the year ended December 31, 2018, at 1-2, 12. See also, Tucker WRT, Appendix 3. 259 Tucker WRT, Appendix 3.


] of its 2018 program expenses.260 [

].261

As a result, the situation of Family Radio should not be generalized to the overall noncommercial webcaster population. In fact, neither Dr. Steinberg nor Dr. Cordes appears to have relied on Ms. Burkhiser’s testimony in formulating their opinions on behalf of

NRBNMLC.262

B. Statutory royalty payments make up a very small proportion of noncommercial webcasters’ costs

Dr. Steinberg suggests that noncommercial webcasters cannot afford to pay more than they currently do because of their non-profit status. He attributes this to a free-rider problem, noting, “[a]nyone can consume the results of total donations (religious broadcasting and webcasting) whether they have personally contributed or not, so that there is a natural tendency to let others

donate while taking a free ride on the output.”263 Dr. Steinberg asserts that, as a result of this

problem, “with rare exceptions, donative nonprofits are bare-bones operations that often

struggle to survive.”264 He also notes that “[o]rganizational expenditures on mission consist of donations minus fees for the rights to webcast recordings (and other expenses, of course), so that donors would have to give more to accomplish the same outcome when royalty fees go

up.”265

However, as shown in my Written Direct Testimony, the vast majority of noncommercial

webcasters pay only the minimum fee per station per year.266 Of the noncommercial webcasters

260 Tucker WRT, Appendix 3. 261 Family Stations, Inc. Form 990 for the year ended December 31, 2018, at 10. 262 Steinberg Amended WDT, at 34-35 (listing works consulted, but omitting Ms. Burkhiser’s testimony); Cordes Corrected WDT, at Appendix B (same). 263 Steinberg Amended WDT, at 9. 264 Steinberg Amended WDT, at 9. 265 Steinberg Amended WDT, at 12. 266 Tucker WDT, at 83-84 and Appendix 18.


that pay royalties on usage in excess of the 159,140 ATH per month threshold, many generate substantial revenues and pay a relatively small amount in statutory royalties as compared to

their revenues.267 In my Written Direct Testimony, I examined the five largest noncommercial webcasters, which account for the vast majority of excess royalties paid and include Family Radio. These are not “bare-bones operations.” As shown in my written direct testimony, statutory royalties accounted for [ ] percent to [ ] percent of their total expenses, [ ]

percent to [ ] percent of their program expenses, and from [ ] percent to [ ] percent of

their revenue.268 The more recent financial information for Family Radio that is now available to me does not change this conclusion. Its 2018 Form 990 shows 2018 total expenses of

$9,776,918, program expenses of $7,267,331, and revenue of $5,422,789.269 As shown in Rebuttal Appendix 3, its 2018 statutory royalty payment of [ ] constituted [ ] percent of its total expenses, [ ] percent of its program expenses, and [ ] percent of its revenue,

falling within the ranges established by other top noncommercial webcasters.270
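Because the royalty figure itself is redacted in the public version, the percentages reported in Rebuttal Appendix 3 cannot be reproduced here. The short sketch below shows only the form of the calculation, using the public Form 990 figures quoted above; the royalty amount in it is a purely hypothetical placeholder.

```python
# Form of the Rebuttal Appendix 3 calculation for Family Radio, using the public
# 2018 Form 990 figures. The royalty amount is a hypothetical placeholder, since
# the actual figure is redacted in the public version.

TOTAL_EXPENSES_2018 = 9_776_918
PROGRAM_EXPENSES_2018 = 7_267_331
REVENUE_2018 = 5_422_789

hypothetical_royalty = 100_000   # placeholder only; not the actual figure

for label, base in [("total expenses", TOTAL_EXPENSES_2018),
                    ("program expenses", PROGRAM_EXPENSES_2018),
                    ("revenue", REVENUE_2018)]:
    print(f"Royalty as share of {label}: {hypothetical_royalty / base:.2%}")
```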

Furthermore, these firms, which would be most affected by a change in royalties, appear well positioned to pay increased statutory royalties. Based on the data, SoundExchange’s proposal to increase minimum fees to $1,000 per station or channel and excess fees to $0.0028 per performance would raise the royalty burden among the five largest noncommercial royalty

payers to at most [ ] percent of total expenses, [ ] percent of programming expenses, and

[ ] percent of revenues.271 This remains true with the more recent financial information for

267 Tucker WDT, at 84-85 and Appendix 16. 268 Tucker WDT, at 84-85 and Appendix 16. Excluding one-time revenues from, among other things, the sale of station licenses, property and equipment, Family Radio generated $5,523,080 in revenues in 2017. Its 2018 statutory royalty payment of [ ] amounts to approximately [ ] percent of these revenues, falling within the ranges established by other top noncommercial webcasters. Family Stations, Inc. (A California Not-For Profit Corporation) and its Affiliates, Consolidated Financial Statements, December 31, 2017 and December 31, 2016, at 5; Tucker WDT, at Appendix 16. 269 Tucker WRT, Appendix 3; see also, Family Stations, Inc. Form 990 for the year ended December 31, 2018, at 1, 10. 270 Tucker WRT, Appendix 3. 271 Tucker WRT, Appendix 4.


Family Radio that is now available to me. As shown in Rebuttal Appendix 4, Family Radio’s royalty payment for 2018 usage at SoundExchange’s proposed 2021 rates would have been [ ], which would have accounted for [ ] percent of its total expenses, [ ] percent of

its program expenses and [ ] percent of its revenue in 2018.272 This comparison undermines Dr. Steinberg’s conclusion that statutory royalties force noncommercial webcasters to divert a meaningful amount of donations away from other mission-related expenses.

C. Though small non-profits are given discounts in some cases, those discounts may not be extended proportionally to larger non-profits

Dr. Steinberg and Dr. Cordes both provide examples of “for-profit firms [that] offer lower prices to nonprofits in the form of discounts” as evidence that noncommercial webcasters

should receive lower rates than commercial webcasters.273 However, they do not mention that these discounts may be targeted to smaller non-profit organizations or may not be proportionally extended to large non-profits.

For example, Microsoft offers discounts for nonprofit organizations that vary with the size of the organization, with larger discounts for smaller organizations. Microsoft offers its Office 365 Business Premium product for $3.00 per user per month for “small & mid-sized nonprofits” and its Office 365 Nonprofit E3 product for $4.50 per user per month for “large

nonprofits.”274 Microsoft also offers $3,500 credits per year for its Azure cloud services regardless of nonprofit size, representing a larger proportional discount for smaller nonprofits

than for larger nonprofits.275 Similarly, Google offers “$10,000 USD of in-kind advertising every

272 Tucker WRT, Appendix 4. 273 Cordes Corrected WDT, at 10-11; Steinberg Amended WDT, at 20. 274 Compare Office 365 Nonprofit plans: Large Nonprofits, Microsoft, https://www.microsoft.com/en-us/microsoft- 365/nonprofit/office-365-nonprofit-plans-and-pricing?&activetab=tab:primaryr2 (last visited Jan. 10, 2020); Compare Office 365 Nonprofit plans: Small and Mid-sized Nonprofits, Microsoft, https://www.microsoft.com/en- us/microsoft-365/nonprofit/office-365-nonprofit-plans-and-pricing?&activetab=tab:primaryr1 (last visited Jan. 10, 2020). 275 Microsoft Nonprofits, Microsoft, https://www.microsoft.com/en-us/nonprofits/azure (last visited Jan. 10, 2020).


month for text ads” through its Google Ad Grants program, representing a larger proportional

discount for nonprofit organizations with smaller expenditures.276 As another example, Slack offers nonprofit organizations with 250 or fewer members a free upgrade to its Standard Plan,

while organizations larger than that receive a smaller discount.277
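A simple calculation illustrates the proportionality point: because a benefit such as the $3,500 Azure credit is a fixed dollar amount, it represents a far larger share of a small organization’s spending than of a large one’s. The budget figures in the sketch below are hypothetical and used only to show the arithmetic.

```python
# Illustration of the fixed-credit point: a $3,500 annual Azure credit is a much
# larger share of a small nonprofit's spending than of a large one's.
# The budget figures below are hypothetical.

AZURE_CREDIT = 3_500

for label, annual_it_budget in [("small nonprofit", 20_000),
                                ("large nonprofit", 2_000_000)]:
    print(f"{label}: credit equals {AZURE_CREDIT / annual_it_budget:.2%} of budget")
```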

Further, as I discuss in Section VI.D, all noncommercial webcasters already receive a discounted rate under the existing statutory royalty rate system and, consistent with some of

the examples I have discussed, smaller noncommercial webcasters receive greater average effective discounts than larger noncommercial webcasters.

D. Noncommercial webcasters already receive a discounted rate

The argument that noncommercial webcasters should get a discount based on their non-profit status ignores the fact that they are already effectively receiving a discount relative to commercial webcasters due to the structure of statutory royalty payments for noncommercial webcasters.

Those discounts are substantial under the existing royalty structure, and Dr. Steinberg and Dr. Cordes do not provide any reason to think they are too low.

As described in my Written Direct Testimony, noncommercial webcasters are governed by a two-part royalty schedule made up of: (1) a $500 minimum fee per station per year, and (2) a per-performance fee of $0.0017 in 2016, subject to CPI adjustments for 2017 to 2020, for

transmissions in excess of 159,140 ATH per month.278 Currently the per-performance rate is

276 Google for Nonprofits: Reach more donors online with Google Ad Grants, Google, https://www.google.com/nonprofits/offerings/google-ad-grants/ (last visited Jan. 10, 2020). 277 “Slack for Nonprofits,” Slack (2019), https://slack.com/help/articles/204368833-Slack-for-Nonprofits. Other examples include Aplos Accounting, which offers additional discounts for organizations with less than $50,000 in annual revenues, and Salesforce, which offers 10 free subscriptions to its Lightning Enterprise Edition regardless of organization size. Aplos Pricing: Try Aplos for Free, Aplos Software (2019), https://www.aplos.com/pricing; Get 10 Donated Subscriptions of the World’s #1 Cloud Engagement Application, Salesforce, (2019) https://www.salesforce.org/nonprofit_product/nonprofit-editions-pricing/. 278 In re Determination of Royalty Rates and Terms for Ephemeral Recording and Webcasting Digital Performance of


$0.0018.279 As described by the Copyright Royalty Judges in the Web IV determination, this fee schedule “results in noncommercial webcasters paying a lower average per-play rate than

a commercial webcaster (that pays at the commercial rate for every performance).”280

Most noncommercial webcasters owe only the $500 minimum fee per channel and do not pay excess royalty fees. The average per-performance rate for noncommercial webcasters that use fewer than 159,140 ATH per station per month and therefore only owe minimum fees is, in

most cases, substantially lower than the average per-performance rate for non-subscription

commercial webcasters with equivalent usage.281 For a music-focused noncommercial webcaster close to the threshold of 159,140 ATH per month, the average per-performance rate

is approximately $0.000022,282 or approximately 1.2 percent of the statutory royalty rate for

commercial webcasters.283 In other words, under the existing statutory royalty rate system, music-focused noncommercial webcasters close to the monthly threshold are already receiving a discount of roughly 99 percent off the commercial per-performance rate.
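The arithmetic behind these figures, set out in footnotes 282 and 283, can be reproduced in a few lines; the only assumption carried over is the 12-performances-per-hour conversion factor cited there.

```python
# Minimal sketch of the effective-rate calculation in footnotes 282-283.
# Assumes the 12 recordings-per-hour conversion factor cited from the Ploeger testimony.

ATH_THRESHOLD = 159_140          # aggregate tuning hours per month
PERFORMANCES_PER_HOUR = 12       # assumed conversion factor
ANNUAL_MINIMUM_FEE = 500.00      # current $500 minimum fee per station per year
COMMERCIAL_RATE = 0.0018         # current commercial per-performance rate

monthly_performances = ATH_THRESHOLD * PERFORMANCES_PER_HOUR   # 1,909,680
monthly_fee = ANNUAL_MINIMUM_FEE / 12                           # about $41.67
effective_rate = monthly_fee / monthly_performances             # about $0.000022

print(f"Performances per month at threshold: {monthly_performances:,}")
print(f"Effective per-performance rate: ${effective_rate:.6f}")
print(f"Share of commercial rate: {effective_rate / COMMERCIAL_RATE:.2%}")   # about 1.2%
```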

Only 4.2 percent of all noncommercial webcasters pay any amount of excess royalties.284 As summarized in Appendix 17 to my Written Direct Testimony, the webcasters that account for the majority of excess royalty payments are relatively large non-profits with millions of dollars

in revenues.285 Noncommercial webcasters that exceed the 159,140 ATH threshold and owe

Sound Recordings (Web IV); Final Rule, 81 Fed. Reg. 26316, 26316 (May 2, 2016), https://www.govinfo.gov/content/pkg/FR-2016-05-02/pdf/2016-09707.pdf (“Web IV Determination”). 279 37 C.F.R. § 380.10(a)(2). 280 Web IV Determination, at 26392 n.208. 281 The only exception is webcasters with very low usage such that the $500 minimum fee per channel is spread over relatively few performances. 282 To estimate this, I assume a conversion factor of 12 recordings per hour. See Ploeger WRT, at ¶¶ 40-42. This implies that 159,140 ATH translates to roughly 1.9 million performances (calculated as 159,140 hours × 12 recordings/hour = 1,909,680). The annual minimum fee of $500 equates to $41.67 per month (the price of 159,140 ATH), implying an effective rate of $0.000022 per performance (calculated as $41.67 / 1,909,680 performances = $0.000022). 283 $0.000022 / $0.0018 = 1.21%. A noncommercial webcaster that transmits fewer than 159,140 ATH per month, and therefore only pays the $500 annual minimum fee, would have to use less than 1.21 percent of the available ATH for its effective per-performance rate to exceed the commercial rate of $0.0018. 284 Tucker WDT, at Appendix 18. 285 Tucker WDT, at Appendix 17.


per-performance royalties on their excess usage still receive an overall discount relative to the commercial rate because of the steep discount they receive on the first 159,140 ATH of usage per month. As the usage goes up, the average per-performance discount declines, but never reaches zero.
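The following sketch illustrates this structural point using the current $500 minimum fee, the $0.0018 excess rate, the 159,140 ATH monthly threshold, and the same 12-performances-per-hour conversion; it is an illustration of the fee structure described above, not a calculation drawn from the appendices.

```python
# Illustrative only: blended per-performance rate under the current noncommercial
# schedule ($500/year minimum plus $0.0018 per performance above 159,140 ATH/month),
# assuming 12 performances per hour and uniform monthly usage.

THRESHOLD_PERFS = 159_140 * 12        # monthly performances covered by the minimum fee
MONTHLY_MIN_FEE = 500 / 12            # minimum fee expressed per month
EXCESS_RATE = 0.0018                  # per-performance rate above the threshold

def avg_rate(monthly_performances: float) -> float:
    excess = max(0.0, monthly_performances - THRESHOLD_PERFS)
    total_fee = MONTHLY_MIN_FEE + EXCESS_RATE * excess
    return total_fee / monthly_performances

for multiple in (1, 2, 5, 10, 100):
    perfs = multiple * THRESHOLD_PERFS
    print(f"{multiple:>3}x threshold: average rate ${avg_rate(perfs):.6f}")
# The average rate rises toward $0.0018 as usage grows but always stays below it,
# because the first 159,140 ATH each month is covered by the minimum fee.
```

Under these assumptions, the blended rate works out to roughly $0.0009 per performance at twice the threshold and roughly $0.0016 at ten times the threshold, approaching but never reaching the $0.0018 commercial rate.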

I understand that SoundExchange’s rate proposal continues the existing payment structure, with increased minimum annual fees of $1,000 per station or channel and excess fees of

$0.0028 per performance. Because this proposal follows a similar payment structure where noncommercial webcasters receive a steep discount on the first 159,140 ATH of usage per month, noncommercial webcasters would continue to pay average per-performance rates lower than the commercial rate under SoundExchange’s proposed royalty rate increase.
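A minimal variation of the same sketch, substituting the proposed $1,000 minimum fee and $0.0028 excess rate, shows the same pattern; treating the $0.0028 excess rate as the relevant commercial benchmark under the proposal is an assumption of this illustration.

```python
# Illustrative only: blended rate under the proposed schedule
# ($1,000/year minimum plus $0.0028 per performance above 159,140 ATH/month),
# assuming 12 performances per hour, as in the sketch above.

THRESHOLD_PERFS = 159_140 * 12
MONTHLY_MIN_FEE = 1_000 / 12
PROPOSED_EXCESS_RATE = 0.0028

def avg_rate_proposed(monthly_performances: float) -> float:
    excess = max(0.0, monthly_performances - THRESHOLD_PERFS)
    return (MONTHLY_MIN_FEE + PROPOSED_EXCESS_RATE * excess) / monthly_performances

for multiple in (1, 2, 10):
    print(f"{multiple:>2}x threshold: ${avg_rate_proposed(multiple * THRESHOLD_PERFS):.6f}")
```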

E. The argument that noncommercial webcasters should not pay excess royalties because they are too unpredictable to finance with donations is not based on data or economics

Dr. Steinberg argues that the per-performance royalty payments for usage in excess of 159,140 ATH per month under the current system should be “replaced by tiered flat fees or tiered and capped flat fees” on the basis that “predictable payment obligations are important to [noncommercial webcasters] because they can finance them through regular on-air fundraising

drives with accurate campaign goals.”286 The costs associated with excess royalties under the statutory license are both predictable and controllable, however. Excess royalties are simply a function of the number of performances transmitted, which are easily tracked over time.

Noncommercial webcasters can monitor their current listenership and spending on excess royalties, presumably predict future usage based on annual trends and seasonal variations in

286 Steinberg Amended WDT, at 28.


previous years, and influence spending levels by choosing to play more or less music, playing

longer recordings,287 and managing access to streams.288
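To make the controllability point concrete, a webcaster’s monthly excess obligation is a simple function of its usage. The sketch below assumes the current $0.0018 excess rate, the 159,140 ATH threshold, and the 12-performances-per-hour conversion; the monthly listening forecast it uses is hypothetical.

```python
# Minimal sketch: projecting monthly excess royalties from forecast listening,
# assuming the current $0.0018 excess rate, the 159,140 ATH monthly threshold,
# and 12 performances per hour. The forecast values are hypothetical.

EXCESS_RATE = 0.0018
ATH_THRESHOLD = 159_140
PERFS_PER_HOUR = 12

def monthly_excess_royalty(forecast_ath: float) -> float:
    excess_ath = max(0.0, forecast_ath - ATH_THRESHOLD)
    return excess_ath * PERFS_PER_HOUR * EXCESS_RATE

# Hypothetical seasonal forecast of monthly ATH for one channel.
forecast = {"Jan": 140_000, "Feb": 150_000, "Mar": 175_000, "Apr": 200_000}
for month, ath in forecast.items():
    print(f"{month}: projected excess royalty ${monthly_excess_royalty(ath):,.2f}")
```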

F. The CBI and NPR settlements do not provide support for NRBNMLC’s rate proposal

Dr. Steinberg amended his written direct testimony to add a discussion of SoundExchange’s

settlement agreements with CBI and NPR.289 However, neither of those agreements provides

support for NRBNMLC’s rate proposal in this proceeding.

I understand that the NRBNMLC has proposed “that noncommercial broadcasters pay: (a) a flat, $500 annual fee for each channel or station streaming digital audio transmissions up to 1,909,680 ATH in a year (159,140 ATH multiplied by 12 months/year); and (b) an additional $500 annual fee for each channel or station streaming digital audio transmissions for each

additional 1,909,680 ATH in the same year.”290 That is, as I understand it, the NRBNMLC’s proposal would maintain the $500 annual fee for small noncommercial webcasters and charge

larger noncommercial webcasters in similar increments for additional usage. However, neither the CBI nor the NPR settlement agreement appears to support the NRBNMLC’s proposed fee structure or proposed royalty levels.
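To make the structure of that proposal concrete, the sketch below translates it into an annual fee calculation; the usage figures are hypothetical, and the reading that any partial additional increment of 1,909,680 ATH triggers an additional $500 is an assumption.

```python
import math

# Minimal sketch of the NRBNMLC proposal as described: $500 per channel per year
# for the first 1,909,680 ATH (159,140 ATH x 12 months) of annual streaming, plus
# $500 for each additional such increment. Assumes any partial additional increment
# incurs the additional $500. The example usage figures are hypothetical.

ANNUAL_ATH_INCREMENT = 159_140 * 12   # 1,909,680 ATH
FEE_PER_INCREMENT = 500

def nrbnmlc_annual_fee(annual_ath: float) -> int:
    increments = max(1, math.ceil(annual_ath / ANNUAL_ATH_INCREMENT))
    return increments * FEE_PER_INCREMENT

print(nrbnmlc_annual_fee(1_000_000))   # within the first increment -> $500
print(nrbnmlc_annual_fee(5_000_000))   # roughly 2.6 increments -> $1,500
```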

287 Testimony of Steven Cutler, Executive Vice President, Business Development and Corporate Strategy, iHeartMedia, Inc., Oct. 7, 2014, in the matter of Determination of Royalty Rates and Terms for Ephemeral Recording and Digital Performance of Sound Recordings (Web IV), at ¶ 13. 288 Steinberg Amended WDT, at 14 (discussing potential for limiting access); Wheeler WDT, ¶¶ 23-25 (describing station group’s process of deciding which stations to webcast), 27-28 (describing promotion strategy for webcasts); Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019, at 2 (“the current ATH threshold is causing noncommercial broadcasters approaching this threshold to limit their streaming activities to avoid the obligation to pay usage fees”). Dr. Steinberg describes such management of access as a “harmful” problem, because it constrains a webcaster’s ability to pursue its mission. Steinberg Amended WDT, at 13. However, that is not an economic conclusion, and does not imply that the price of those goods and services should be lower than their fair market value. 289 Steinberg Amended WDT, at 16-21. 290 Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019, at 3.


As I understand it, under the relevant regulations, a station subject to the CBI settlement qualifies as a “Noncommercial Educational Webcaster” because, among other things, it “[i]s directly operated by, or is affiliated with and officially sanctioned by, and the digital audio transmission operations of which are staffed substantially by students enrolled at, a domestically accredited primary or secondary school, college, university or other post-secondary degree-granting educational institution,” and it “[t]akes affirmative steps not to

make total transmissions in excess of 159,140 Aggregate Tuning Hours on any individual channel or station in any month, if in any previous calendar year it has made total transmissions in excess of 159,140 Aggregate Tuning Hours on any individual channel or station in any

month.”291

The statutory rate for each Noncommercial Educational Webcaster is governed by 37 C.F.R. Part 380 Subpart C. I understand that, under 37 C.F.R. Part 380 Subpart C, for 2016 to 2020,

Each Noncommercial Educational Webcaster that did not exceed 159,140 total ATH for any individual channel or station for more than one calendar month in the immediately preceding calendar year and does not expect to make total transmissions in excess of 159,140 Aggregate Tuning Hours on any individual channel or station in any calendar month during the applicable calendar year shall pay an annual, nonrefundable minimum fee of $500 … for each of its individual channels, … for each calendar year.292

On September 23, 2019, SoundExchange and CBI announced that they reached a partial settlement in this proceeding concerning royalty rates and terms for eligible nonsubscription

transmissions made by Noncommercial Educational Webcasters over the internet during the

period 2021-2025.293 According to the Joint Motion to Adopt Partial Settlement between

291 37 C.F.R. § 380.21. 292 37 C.F.R. § 380.22(a). 293 NRBNMLC Ex. 20, Joint Motion to Adopt Partial Settlement between SoundExchange and CBI, Sept. 23, 2019, at 1-2.


SoundExchange and CBI, the settlement agreement with CBI generally adopted the terms

relevant for Noncommercial Educational Webcasters in Web IV.294 One exception is that the settlement specified that eligible Noncommercial Educational Webcasters would owe annual

minimum fees of $550 in 2021, increasing by $50 each year to $750 in 2025.295

While Dr. Steinberg suggests that “the CBI settlement rates are above the upper bound of a

reasonable rate for webcast rights” because CBI was motivated to avoid litigation costs,296 it is

not clear why CBI would be more motivated to avoid litigation costs than SoundExchange.297 Either way, Dr. Steinberg has not explained how this factor would account for the difference between the terms of the CBI settlement and those in NRBNMLC’s rate proposal.

As with SoundExchange’s agreement with CBI, the terms of SoundExchange’s settlement agreement with NPR do not provide support for NRBNMLC’s rate proposal. On September 23, 2019, SoundExchange, NPR, and the Corporation for Public Broadcasting (“CPB”) announced that they reached a partial settlement in this proceeding concerning royalty rates

and terms during the period 2021-2025.298 As I understand it, SoundExchange’s NPR and CPB settlement provides for a number of NPR-affiliated public radio stations to collectively stream

up to 360 million “Music ATH” in 2021, growing to 400 million “Music ATH” in 2025.299 In

exchange, the agreement requires CPB300 to pay SoundExchange an annual lump sum payment

294 NRBNMLC Ex. 20, Joint Motion to Adopt Partial Settlement between SoundExchange and CBI, Sept. 23, 2019, at 2, 6. 295 NRBNMLC Ex. 20, Joint Motion to Adopt Partial Settlement between SoundExchange and CBI, Sept. 23, 2019, at 2, 7. 296 Steinberg Amended WDT, at 16. 297 I understand that SoundExchange incurs additional litigation costs for each party participating in the rate-setting proceedings. 298 NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019. 299 NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019, at 7-8. 300 CPB is a private, nonprofit entity that was founded by Congress and is funded by the federal government. Among other things, CPB provides funding for NPR. In particular, I understand that, historically, CPB has paid sound recording royalties for NPR. NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019, at 2.


of $800,000.301 Neither the fee structure nor the royalty levels in the NPR agreement appear to support NRBNMLC’s rate proposal in this proceeding. As I understand it, neither SoundExchange nor NRBNMLC has filed a rate proposal including an industry-wide lump sum payment as the statutory royalty rate for religious broadcasters or other noncommercial

webcasters.302

Further, Dr. Steinberg dismisses the idea that this lump-sum payment might reflect a discount for the real advantages of being paid in advance, such as protection from bad debt.303 Dr. Steinberg’s rationale is that:

[S]tations named by CPB as participants in the NPR agreement have unique access to relatively stable funding through tax dollars allocated as grants by CPB. Indeed, qualification to receive funding from the CPB is a requirement for originating public radio stations to participate in the NPR settlement agreement. CPB support is substantial, with $69.31 million budgeted for direct grants to local public radio stations in FY 2018.304

301 NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019, at 7. According to data from SoundExchange, [ ]. SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx. 302 Dr. Steinberg proposes various adjustments to the NPR rate “[i]f the Judges determine that a lump sum similar to the NPR agreement is a reasonable fee structure for certain NCE webcasters (such as religious broadcasters),” but those adjustments, including scaling the NPR agreement lump sum payment based on a differential ATH cap, appear to assume a linear relationship between the lump sum amount and the ATH cap. Steinberg Amended WDT, at 17. Such a relationship is not obvious from the agreement, and may even be contradicted by the terms of the agreement. 303 Steinberg Amended WDT, at 19. 304 Steinberg Amended WDT, at 19 (internal citations omitted).


However, Dr. Steinberg fails to note that this allegedly stable source of government income is

only a small proportion of NPR revenues and is subject to congressional review.305 Like other

noncommercial broadcasters, NPR relies heavily on donations for its funding.306

SoundExchange’s settlement agreement with NPR “continues the structure of previous

settlements between the parties, while increasing the payment to be made by CPB.”307 I understand that those previous settlement agreements date back to 2001, and informed the

Judges’ determination of rates for other noncommercial webcasters in Web II.308 However, even then, the written determination suggests that the Judges found that the original NPR agreement did “not provide clear evidence of a per station rate that could be viewed as a proxy for one

that a willing buyer and a willing seller would negotiate today.”309 Dr. Steinberg does not provide any reason to believe that the new NPR agreement is more informative.

305 Public Radio Finances, NPR (last visited Jan. 10, 2020), https://www.npr.org/about-npr/178660742/public-radio-finances; Matthew Ingram, Trump Budget Has Public Broadcasting in a Fight for its Life, Fortune (Mar. 16, 2017), https://fortune.com/2017/03/16/trump-budget-public-broadcasting/; Callum Borchers, Trump’s budget will probably slash public media, but the biggest losers won’t be PBS and NPR, Wash. Post (Mar. 15, 2017), https://www.washingtonpost.com/news/the-fix/wp/2017/03/15/trumps-budget-will-likely-slash-public-media-but-the-biggest-losers-wont-be-pbs-and-npr/; Joe Concha, Trump proposes eliminating federal funding for PBS, NPR, The Hill (Feb. 12, 2018), https://thehill.com/homenews/media/373434-trump-proposes-eliminating-federal-funding-for-pbs-npr. 306 Public Radio Finances, NPR (last visited Jan. 10, 2020), https://www.npr.org/about-npr/178660742/public-radio-finances. 307 NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019, at 2. 308 Web II Determination, 72 Fed. Reg., at 24097-100. 309 Web II Determination, 72 Fed. Reg., at 24098.


REBUTTAL APPENDIX 1

FRACTION OF USERS WITH LIMITED, REDUCED, OR NO EXPOSURE TO THE TREATMENT OR FOR WHOM THE EFFECT CANNOT BE FULLY ASCERTAINED

REBUTTAL APPENDIX 2

DISTRIBUTION OF LISTENING ACTIVITY PER REGISTERED USER JUNE 4 – AUGUST 31, 2019

REBUTTAL APPENDIX 3

NON-COMMERCIAL BROADCASTERS TOP FIVE HIGHEST EXCESS ROYALTIES OWED ROYALTIES AND FINANCIAL DATA 2018¹ (2 pages)

REBUTTAL APPENDIX 4

NON-COMMERCIAL BROADCASTERS TOP FIVE HIGHEST EXCESS ROYALTIES OWED ROYALTIES AND FINANCIAL DATA USING PROPOSED ROYALTY RATES OF $1,000 PLUS $0.0028 PER PLAY 2018¹ (2 pages)

REBUTTAL APPENDIX 5

PANDORA MEDIA, INC. INCOME STATEMENTS STANDARD & POOR'S CAPITAL IQ AND SEC FILINGS 2012 – Q3 2019

REBUTTAL APPENDIX 6

PANDORA MEDIA, INC. INCOME STATEMENTS, PER USER PER MONTH BASIS STANDARD & POOR'S CAPITAL IQ AND SEC FILINGS 2012 – Q3 2019

REBUTTAL APPENDIX 7

PANDORA MEDIA, INC. USER, ADVERTISING, AND SUBSCRIPTION METRICS STANDARD & POOR'S CAPITAL IQ AND SEC FILINGS 2012 – Q3 2019 (3 pages)

REBUTTAL APPENDIX 8

IHEARTMEDIA WORLDWIDE INCOME STATEMENT SEC FILINGS 2010 – Q3 2019

REBUTTAL APPENDIX 9

INCREMENTAL DOCUMENTS RELIED UPON

Bates Numbered Documents: [ ], NAB00002794. SoundExchange Ex. 206, [ ], PANWEBV_00005332. SoundExchange Ex. 210, [ ], PANWEBV_00005093. SoundExchange Ex. 254, [ ], NAB00002238. SoundExchange Ex. 288, [ ], SXMWEBV_00005444. SoundExchange Ex. 321, [ ], NAB00004542. SoundExchange Ex. 377, Deloitte Insights: Technology, Media, and Telecommunications Predictions (2019), NAB00001993. SoundExchange Ex. 378, The Infinite Dial: 2018, NAB00002166. SoundExchange Ex. 380, [ ], NAB00002609. SoundExchange Ex. 381, [ ], NAB00002613. SoundExchange Ex. 382, [ ], NAB00002659. SoundExchange Ex. 385, [ ], NAB00002858. SoundExchange Ex. 386, [ ], NAB00004036. SoundExchange Ex. 387, BIA Advisory Services: Market Assessment and Opportunities for Local Radio: 2018- 2022 (Apr. 2018), NAB00004126. SoundExchange Ex. 388, [ ] (May 16, 2018), NAB00004413. SoundExchange Ex. 389, [ ], NAB00004537. SoundExchange Ex. 390, NAB00005547. SoundExchange Ex. 391, NAB00006441. SoundExchange Ex. 394, [ ], NRBNMLC_WEBV_00000823. SoundExchange Ex. 395, [ ], PANWEBV_00004469. SoundExchange Ex. 396, Morgan Stanley, Revival: 5th Annual Music & Radio Survey (Jan. 10, 2019), PANWEBV_00006244. SoundExchange Ex. 397, Tech Survey 2019 Jacobs Media: Radio’s Survival Kit (2019), PANWEBV_00006670. SoundExchange Ex. 398, MusicWatch: How US Customers Listen to Music, audiocensus Q4 2018, PANWEBV_00007062. SoundExchange Ex. 399, [ ], PANWEBV_00008435. SoundExchange Ex. 400, [ ], PANWEBV_00009100. SoundExchange Ex. 401, [ ], PANWEBV_00009182. SoundExchange Ex. 402, [ ], PANWEBV_00004024. SoundExchange Ex. 403, [ ] (July 2019), SXMWEBV_00005224. SoundExchange Ex. 58, [ ], PANWEBV_00003357. SoundExchange Ex. 61, [ ], PANWEBV_00004139. SoundExchange Ex. 62, [ ], PANWEBV_00004249. SoundExchange Exhibit 205, [ ], PANWEBV_00004571. SoundExchange Exhibit 207, [ ], SXMWEBV_00004833. SoundExchange Exhibit 208, [ ], PANWEBV_00006711. SoundExchange Exhibit 209, [ ], PANWEBV_00006865. [ ], NRBNMLC_WEBV_00000270.


Articles, Books, and Publications: Anderson, Eric and Duncan Simester, “Long-Run Effects of Promotion Depth on New Versus Established Customers: Three Field Studies,” Marketing Science, Vol. 23, No. 1 (Winter 2004): 4-20. Anderson, Simon P., et al., “Push‐ Me Pull‐ You: Comparative Advertising in the OTC Analgesics Industry,” RAND Journal of Economics, Vol. 47, No. 4 (Nov. 2016): 1029-1056. Baker, Andrew and Naveen Naveen, “Chapter 15: Word-of-mouth processes in marketing new products: recent research and future opportunities” in Handbook of Research on New Product Development, Eds. Peter Golder and Debanjan Mitra, Edward Elgar Publishing, 2018. Banerjee, Abhijit and Esther Duflo, “An Introduction to the ‘Handbook of Field Experiments,’” Aug. 2016, available at https://www.povertyactionlab.org/sites/default/files/documents/handbook_intro.pdf. Bapna, Ravi and Akhmed Umyarov, “Do Your Online Friends Make You Pay? A Randomized Field Experiment on Peer Influence in Online Social Networks,” Management Science, Vol. 61, No. 8 (Aug. 2015): 1902-1920. Burtch, Gordon, Anindya Ghose, and Sunil Wattal, “The Hidden Cost of Accommodating Crowdfunder Privacy Preferences: A Randomized Field Experiment,” Management Science, Vol. 61, No. 5 (May 2015): 949-962. Castro, Luis and Carlos Scartascini, “Tax Compliance and Enforcement in the Pampas Evidence From a Field Experiment,” Journal of Economic Behavior & Organization, Vol. 116 (2015): 65-82. Catalini, Christian and Catherine Tucker, “When Early Adopters Don't Adopt,” Science, Vol. 357, No. 6347 (July 2017): 135-136. Chandrasekaran, Deepa and Gerard J. Tellis, “Chapter 14: A summary and review of new product diffusion models and key findings” in Handbook of Research on New Product Development, Eds. Peter Golder and Debanjan Mitra, Edward Elgar Publishing, 2018. Chassang, Sylvain, et al., “Accounting for Behavior in Treatment Effects: New Applications for Blind Trials,” PLoS ONE, Vol. 10, No. 6 (June 2015). Deaton, Angus and Nancy Cartwright, “Understanding and Misunderstanding Randomized Controlled Trials,” Social Science & Medicine, Vol. 210 (Aug. 2018): 2-21. Dewan, Sanjeev, Yi-Jen Ho, and Jui Ramaprasad, “Popularity or Proximity: Characterizing the Nature of Social Influence in an Online Music Community,” Information Systems Research, Vol. 28, No. 1 (Mar. 2017): 117-136. Harrison, Glenn and John List, “Field Experiments,” Journal of Economic Literature, Vol. 42, No. 4 (Dec. 2004). Harrison, Glenn, “Cautionary Notes on the Use of Field Experiments to Address Policy Issues,” Oxford Review of Economic Policy, Vol. 30, No. 4 (2014): 753-763. Hohnhold, Henning, Deirdre O’Brien, and Diane Tang, “Focusing on the Long-term: It’s Good for Users and Business,” 2015, https://static.googleusercontent.com/media/research.google.com/en//pubs/archive/43887.pdf. Kahneman, Daniel and Amos Tversky, “Prospect Theory: An Analysis of Decision Under Risk,” Econometrica, Vol. 47, No. 2 (Mar. 1979): 263-292. Lambrecht, Anja and Catherine Tucker, “Field Experiments” in Handbook of Marketing Analytics, Eds. Natalie Mizik and Dominique Hanssens, Edward Elgar Publishing, 2018. Lambrecht, Anja and Catherine Tucker, “Paying with Money or Effort: Pricing When Customers Anticipate Hassle,” Journal of Marketing Research, Vol. 49, No. 1 (2012): 66-82. Levitt, Steven and John List, “Was There Really a Hawthorne Effect at the Hawthorne Plant? An Analysis of the Original Illumination Experiments,” American Economic Journal: Applied Economics, Vol. 3, No. 1 (Jan. 2011): 224-238. 
Lewis, Randall and David Reiley, “Online Ads and Offline Sales: Measuring the Effects of Retail Advertising via a Controlled Experiment on Yahoo!,” Quantitative Marketing and Economics, Vol. 12, No. 3 (Sept. 2014): 235- 266. Lewis, Randall and Justin Rao, “The Unfavorable Economics of Measuring the Returns to Advertising,” The Quarterly Journal of Economics, Vol. 130, No. 4 (Nov. 2015): 1941-1973. Oestreicher-Singer, Gal and Lior Zalmanson, “Content or Community? A Digital Business Strategy for Content Providers in the Social Age,” Management Information Systems Quarterly, Vol. 37, No. 2 (June 2013): 591-616.


Articles, Books, and Publications, Continued: Rochet, Jean-Charles and Jean Tirole, “Platform Competition in Two-Sided Markets,” Journal of the European Economic Association, Vol. 1, No. 4 (June 2003): 990-1029. Senat, Marie-Victoire, et al., “Endoscopic Laser Surgery versus Serial Amnioreduction for Severe Twin-to-Twin Transfusion Syndrome,” New England Journal of Medicine, Vol. 351, No. 2 (July 2004): 136-144. Slemrod, Joel, Marsha Blumenthal, and Charles Christian, “Taxpayer Response to an Increased Probability of Audit: Evidence from a Controlled Experiment in Minnesota,” Journal of Public Economics, Vol. 79 (2001): 455-483. Tucker, Catherine and Anja Lambrecht, “When Does Retargeting Work? Information Specificity in Online Advertising,” Journal of Marketing Research, Vol. 50, No. 5 (Oct. 2013): 561-576. Tucker, Catherine and Juanjuan Zhang, “Growing Two-sided Networks by Advertising the User Base: A Field Experiment,” Marketing Science, Vol. 29, No. 5 (Sept.-Oct. 2010): 805-814. Tucker, Catherine and Juanjuan Zhang, “How Does Popularity Information Affect Choices? A Field Experiment,” Management Science, Vol. 57, No. 5 (May 2011): 828-842. Tucker, Catherine, “Social Networks, Personalized Advertising, and Privacy Controls,” Journal of Marketing Research, Vol. 51, No. 5 (Oct. 2014): 546-562. Tucker, Catherine, “The Reach and Persuasiveness of Viral Video Ads,” Marketing Science, Vol. 34, No. 2 (Mar. 2015): 281-296. Waldfogel, Joel, “Digitization and Quality of New Media Products: The Case of Music” in Economic Analysis of the Digital Economy, Eds. Avi Goldfarb, Shane Greenstein, and Catherine Tucker, University of Chicago Press, Apr. 2015. Waldfogel, Joel, Digital Renaissance: What Data and Economics Tell Us about the Future of Popular Culture, Princeton University Press, 2018. Yanagizawa-Drott, David and Jakob Svensson, “Estimating Impact in Partial vs. General Equilibrium: A Cautionary Tale from a Natural Experiment in Uganda,” Aug. 2012, https://epod.cid.harvard.edu/sites/default/files/2018- 02/estimating_impact_in_partial_vs._general_equilibrium- _a_cautionary_tale_from_a_natural_experiment_in_uganda.pdf. Zeiler, Kathryn, “Cautions on the Use of Economics Experiments in Law,” Journal of Institutional and Theoretical Economics, Vol. 166, No. 1 (Mar. 2010): 178-193.

Data from Listener Suppression Experiments: [ ], PANWEBV_00008309-310. [ ], PANWEBV_00008188-8307. [ ], PANWEBV_00008067-8082; PANWEBV_00008084-8187. [ ], PANWEBV_00004982- 991. [ ], PANWEBV_00008312. [ ], PANWEBV_00008311. File Production Cross Reference, Pandora-Sirius_WebV_013_X-Ref.xls. File Production Cross Reference, Pandora-Sirius_WebV_014_X-Ref.xls. File Production Cross Reference, Pandora-Sirius_WebV_11 Cross Ref.xls. [ ], PANWEBV_00008083. [ ], PANWEBV_00008065. [ ], PANWEBV_00008308. [ ], PANWEBV_00008064. [ ], PANWEBV_00008066. [ ], PANWEBV_00008314-8433. [ ], PANWEBV_00008061.


Data from Listener Suppression Experiments, Continued: [ ], PANWEBV_00008062. [ ], PANWEBV_00008063. [ ], PANWEBV_00008434. [ ], PANWEBV_00008313.

Depositions: Deposition of Leonard Wheeler, Dec. 4, 2019. SoundExchange Ex. 231, Deposition of David H. Reiley, Jr., Ph.D., Dec. 16, 2019.

Testimonies and Direct Statements: Amended Written Direct Testimony of Richard Steinberg, Dec. 11, 2019. Amended Written Rebuttal Testimony of Michael Herring, Feb. 20, 2015. Corrected Written Direct Testimony of Carl Shapiro, Nov. 20, 2019. Corrected Written Direct Testimony of David Reiley, Nov. 26, 2019. Corrected Written Direct Testimony of Joseph J. Cordes, Dec. 20, 2019. Introductory Memorandum to the Written Direct Statement of the National Religious Broadcasters Noncommercial Music License Committee, Including Educational Media Foundation, Sept. 23, 2019. Testimony of Steven Cutler in Web IV, Oct. 7, 2014. Written Direct Testimony of Aaron Harrison, Sept. 22, 2019. Written Direct Testimony of Andrew Gille, Sept. 17, 2019. Written Direct Testimony of Arpan Agrawal, Sept. 18, 2019. Written Direct Testimony of Carl Shapiro, Sept. 23, 2019. Written Direct Testimony of Catherine Tucker, Sept. 23, 2019. Written Direct Testimony of Christopher Phillips, Sept. 23, 2019. Written Direct Testimony of David Reiley, Sept. 23, 2019. Written Direct Testimony of Dominique M. Hanssens, Sept. 23, 2019. Written Direct Testimony of Dr. Gregory K. Leonard, Sept. 20, 2019. Written Direct Testimony of James Russell Williams III, Sept. 23, 2019. Written Direct Testimony of Jennifer D. Burkhiser, Sept. 23, 2019. Written Direct Testimony of Jennifer Witz, Sept. 23, 2019. Written Direct Testimony of Joel Waldfogel, Sept. 23, 2019. Written Direct Testimony of Jonathan Orszag, Sept. 23, 2019. Written Direct Testimony of Joseph C. Emert, Oct. 6, 2014. Written Direct Testimony of Joseph J. Cordes, Sept. 21, 2019. Written Direct Testimony of Leonard Wheeler, Sept. 19, 2019. Written Direct Testimony of Professor John R. Hauser, Sept. 20, 2019. Written Direct Testimony of Richard Steinberg, Sept. 23, 2019. Written Direct Testimony of Robert Pittman, Sept. 18, 2019. Written Direct Testimony of Robert Willig, Sept. 23, 2019. Written Direct Testimony of Stephan McBride in Web IV, Oct. 7, 2014. Written Direct Testimony of Steven Blatter, Nov. 28, 2011. Written Direct Testimony of Steven Blatter, Sept. 23, 2019. Written Direct Testimony of Steven R. Peterson, Sept. 23, 2019. Written Direct Testimony of Steven W. Newberry, Oct. 6, 2014. Written Direct Testimony of Steven W. Newberry, Sept. 20, 2019. Written Direct Testimony of T. Jay Fowler, Sept. 18, 2019. Written Direct Testimony of Timothy Westergren, Oct. 6, 2014. Written Direct Testimony of Tom Poleman, Sept. 19, 2019.


Testimonies and Direct Statements, Continued: Written Direct Testimony of Waleed Diab, Sept. 18, 2019. Written Rebuttal Testimony of Travis Ploeger, Jan. 10, 2020.

Earnings Calls Transcripts: iHeartMedia Inc. Q3 2019 Earnings Call Transcript, Nov. 7, 2019.

Financial Filings: Family Stations, Inc. Form 990 for the year ended December 31, 2018. iHeartMedia, Inc. Form 10-Q for the Fiscal Quarter ended September 30, 2019. Sirius XM Holdings, Inc. Form 10-Q for the Fiscal Quarter ended September 30, 2019.

Legal Documents: NRBNMLC Ex. 20, Joint Motion to Adopt Partial Settlement between SoundExchange and CBI, Sept. 23, 2019. NRBNMLC Ex. 21, Joint Motion to Adopt Partial Settlement between SoundExchange and NPR, Sept. 23, 2019. Pandora Media, LLC’s Response and Objections to Request No. 8 from SoundExchange Inc.’s Second Set of Requests for Production of Documents, Dec. 13, 2019. Sirius XM Radio Inc. and Pandora Media, LLC’s Responses and Objections to SoundExchange Inc.’s November 11, 2019 Interrogatories Directed to Sirius XM and Pandora, Nov. 18, 2019.

Statutes: 37 C.F.R. § 380.10(a)(2). 37 C.F.R. § 380.21. 37 C.F.R. § 380.22(a).

Websites: “2019 Conference on Digital Experimentation (CODE),” MIT Initiative on the Digital Economy, Nov. 1-2, 2019, http://ide.mit.edu/events/2019-conference-digital-experimentation-code. “An insider’s look at Family Radio and its leader Harold Camping,” The Mercury News, May 20, 2011, https://www.mercurynews.com/2011/05/20/an-insiders-look-at-family-radio-and-its-leader-harold-camping/. “Aplos Pricing: Try Aplos for Free,” Aplos Software, 2019, https://www.aplos.com/pricing. “Bezos calls Amazon experiment ‘a mistake’,” BizJournals, Sept. 28, 2000, https://www.bizjournals.com/seattle/ stories/2000/09/25/daily21.html. “Compare Office 365 Nonprofit plans: Large Nonprofits,” Microsoft, https://www.microsoft.com/en-us/microsoft- 365/nonprofit/office-365-nonprofit-plans-and-pricing?&activetab=tab:primaryr2. “Compare Office 365 Nonprofit plans: Small and Mid-sized Nonprofits,” Microsoft, https://www.microsoft.com/en- us/microsoft-365/nonprofit/office-365-nonprofit-plans-and-pricing?&activetab=tab:primaryr1. “Conference on Digital Experimentation (CODE),” Oct. 15, 2016, http://ide.mit.edu/events/conference-digital- experimentation-code-0. “End of the line for Christian radio network that predicted 2011 rapture,” The Denver Post, May 12, 2013, https://www.denverpost.com/2013/05/12/end-of-the-line-for-christian-radio-network-that-predicted-2011- rapture/. “Get 10 Donated Subscriptions of the World’s #1 Cloud Engagement Application,” Salesforce, 2019, https://www.salesforce.org/nonprofit_product/nonprofit-editions-pricing/. “Google for Nonprofits: Reach more donors online with Google Ad Grants,” Google, https://www.google.com/nonprofits/offerings/google-ad-grants/. “I’m already a subscriber. Do I get a discount on additional subscriptions?” SiriusXM, 2019, https://listenercare.siriusxm.com/app/answers/detail/a_id/3680/~/im-already-a-subscriber.-do-i-get-a-discount- on-additional-subscriptions%3F.


“iHeartMedia, Inc. Reports Results for 2019 Third Quarter,” BusinessWire, Nov. 7, 2019, https://www.businesswire.com/news/home/20191107005341/en/iHeartMedia-Reports-Results-2019-Quarter.
“Microsoft Nonprofits,” Microsoft, https://www.microsoft.com/en-us/nonprofits/azure.
“Pandora Premium for Families,” Pandora, 2019, https://www.pandora.com/upgrade/premium/family-plan?TID=PM:PSE:Google&gclid=Cj0KCQiAw4jvBRCJARIsAHYewPP_2y9rQDmMsvF09oDTAUp-fZkm7fU_ixzim2U6mjrK0ku4n2nx0eYaApywEALw_wcB.
“Pandora Premium Student,” Pandora, 2019, https://www.pandora.com/upgrade/premium/student.
“Pandora teams up with T-Mobile as an Un-carrier partner for unlimited ad-free music,” Pandora Blog, Aug. 15, 2018, http://blog.pandora.com/us/pandora-teams-up-with-t-mobile-as-an-un-carrier-partner-for-unlimited-ad-free-music/.
“Public Radio Finances,” NPR, 2020, https://www.npr.org/about-npr/178660742/public-radio-finances.
“Slack for Nonprofits,” Slack, 2019, https://slack.com/help/articles/204368833-Slack-for-Nonprofits.
“Soundiiz General Features,” Soundiiz, https://soundiiz.com/features.
“Spotify Premium,” Spotify, https://www.spotify.com/us/premium/.
Alexei Oreskovic, “Twitter shares crash as user growth stalls,” Business Insider, Oct. 27, 2015, https://www.businessinsider.com/twitter-earnings-q3-2015-2015-10.
Amber Neely, “How to transfer playlists from Spotify to Apple Music,” Apple Insider, Aug. 18, 2019, https://appleinsider.com/articles/19/08/18/how-to-transfer-playlists-from-spotify-to-apple-music.
Ben Sisario, “Adele Is Said to Reject Streaming for ‘25’,” The New York Times, Nov. 19, 2015, https://www.nytimes.com/2015/11/20/business/media/adele-music-album-25.html.
Ben Sisario, “Taylor Swift Announces World Tour and Pulls Her Music From Spotify,” The New York Times, Nov. 3, 2014, https://artsbeat.blogs.nytimes.com/2014/11/03/taylor-swift-announces-world-tour-and-pulls-her-music-from-spotify/.
Bill Shaikin, “For the sixth year in a row, most Dodgers fans can’t watch their team on television,” Los Angeles Times, Mar. 8, 2019, https://www.latimes.com/sports/dodgers/la-sp-dodgers-20190308-story.html.
Bob Smietana, “Christian radio group faces financial hard times,” U.S.A. Today, May 14, 2013, https://www.usatoday.com/story/news/nation/2013/05/14/family-radio-finances-world-did-not-end/2159621/.
Callum Borchers, “Trump’s budget will probably slash public media, but the biggest losers won’t be PBS and NPR,” The Washington Post, Mar. 15, 2017, https://www.washingtonpost.com/news/the-fix/wp/2017/03/15/trumps-budget-will-likely-slash-public-media-but-the-biggest-losers-wont-be-pbs-and-npr/.
Carmen Reinicke, “Spotify slips after not adding as many paid subscribers as hoped (SPOT),” Markets Insider, July 31, 2019, https://markets.businessinsider.com/news/stocks/spotify-earnings-2q-stock-price-reaction-disappointing-subscriber-growth-2019-7-1028403687.
Catherine Tucker, “Network Effects Matter Less Than They Used To,” Harvard Business Review, June 22, 2018, https://hbr.org/2018/06/why-network-effects-matter-less-than-they-used-to.
Cecilia Kang, “Taylor Swift has taken all her music off Spotify,” The Washington Post, Nov. 3, 2014, https://www.washingtonpost.com/news/business/wp/2014/11/03/taylor-swift-has-taken-all-her-music-off-spotify/.
Charlotte Alter, “Taylor Swift Just Removed Her Music From Spotify,” TIME, Nov. 3, 2014, https://time.com/3554438/taylor-swift-spotify/.
Claire Reilly, “Netflix quietly tests price hikes in Australia,” CNET, May 14, 2017, https://www.cnet.com/news/netflix-quietly-tests-weekend-price-increases-australia/.
David Porter, “To Everything There is a Season,” 8tracks Blog, Dec. 26, 2019, https://blog.8tracks.com/.
Dominic Rushe, “Myspace sold for $35m in spectacular fall from $12bn heyday,” The Guardian, June 30, 2011, https://www.theguardian.com/technology/2011/jun/30/myspace-sold-35-million-news.
Don Reisinger, “Here’s the Latest Taylor Swift Apple Music Ad to Go Viral,” Fortune, Apr. 18, 2016, https://fortune.com/2016/04/18/taylor-swift-apple-music/.
Frank Pallotta and Brian Stelter, “Adele won’t allow ‘25’ album to be streamed,” CNN Business, Nov. 19, 2015, https://money.cnn.com/2015/11/19/media/adele-streaming/.
Hannah Ellis-Petersen, “Taylor Swift takes a stand over Spotify music royalties,” The Guardian, Nov. 5, 2014, https://www.theguardian.com/music/2014/nov/04/taylor-swift-spotify-streaming-album-sales-snub.
Hulu (@hulu), Twitter, Sept. 16, 2019, https://twitter.com/hulu/status/1173724121726738433.
Hulu (@hulu), Twitter, Oct. 3, 2017, https://twitter.com/hulu/status/915283841098682368.
Hulu (@hulu), Twitter, Oct. 3, 2017, https://twitter.com/hulu/status/915294183363170306.
Hulu (@hulu), Twitter, Oct. 4, 2017, https://twitter.com/hulu/status/915648154854404097.
Joanna Robinson, “Now That You’re Hooked, Netflix Is Looking to Raise Its Prices Again,” Vanity Fair, May 16, 2017, https://www.vanityfair.com/hollywood/2017/05/netflix-raising-prices-weekend-surge-pricing.
Joe Concha, “Trump proposes eliminating federal funding for PBS, NPR,” The Hill, Feb. 12, 2018, https://thehill.com/homenews/media/373434-trump-proposes-eliminating-federal-funding-for-pbs-npr.
Josh Levenson and Quentyn Kennemer, “Apple Music vs. Spotify: Which service is the streaming king?,” Digital Trends, Nov. 11, 2019, https://www.digitaltrends.com/music/apple-music-vs-spotify/.
Kia Kokalitcheva, “Thanks to Adele, Pandora Says ‘Hello’ to a Stock Price Bump,” Fortune, Nov. 25, 2015, https://fortune.com/2015/11/25/adele-pandora/.
Kristen Scholer, “Adele Says Hello to Pandora,” Wall Street Journal, Nov. 25, 2015, https://blogs.wsj.com/moneybeat/2015/11/25/pandora-up-as-adele-says-hello-to-the-streaming-service-bye-to-others/.
Linda Rosencrance, “Testing reveals varied DVD prices on Amazon,” CNN, Sept. 7, 2000, https://www.cnn.com/2000/TECH/computing/09/07/amazon.dvd.discrepancy.idg/hs~index.html.
Lisa Eadicicco, “Microsoft hired a man named Mac Book to star in its latest ad slamming Apple's laptops,” Business Insider, Aug. 1, 2019, https://www.businessinsider.com/microsoft-slams-apple-macbook-laptops-ad-2019-8.
Marco Iansiti and Karim R. Lakhani, “Managing Our Hub Economy,” Harvard Business Review, Sept.-Oct. 2017, https://hbr.org/2017/09/managing-our-hub-economy.
Matthew Ingram, “Trump Budget Has Public Broadcasting in a Fight for its Life,” Fortune, Mar. 16, 2017, https://fortune.com/2017/03/16/trump-budget-public-broadcasting/.
Michael Potuck, “Samsung Galaxy ad uses missing iPhone 11 camera feature as bait to switch,” 9to5Mac, Sept. 13, 2019, https://9to5mac.com/2019/09/13/samsung-iphone-11-missing-camera-feature/.
Mike Ayers, “Adele’s ‘25’ Won’t Be Available on Spotify or Apple Music,” Wall Street Journal, Nov. 19, 2015, https://blogs.wsj.com/speakeasy/2015/11/19/adeles-25-wont-be-available-on-spotify-or-apple-music/.
Neha Malara and Supantha Mukherjee, “Spotify shares surge after surprise profit, rise in paid users,” Reuters, Oct. 28, 2019, https://www.reuters.com/article/us-spotify-tech-results/spotify-shares-surge-after-surprise-profit-rise-in-paid-users-idUSKBN1X70X9.
Nelson Aguilar, “Spotify vs. Apple vs. Pandora vs. Tidal vs. Deezer vs. Amazon,” Gadget Hacks, July 19, 2019, https://web.archive.org/web/20190721232028/https://smartphones.gadgethacks.com/news/best-music-streaming-services-spotify-vs-apple-vs-pandora-vs-tidal-vs-deezer-vs-amazon-0199737/.
Nicola Menzie, “Family Radio Founder Harold Camping Repents, Apologizes for False Teachings,” The Christian Post, Oct. 30, 2011, https://www.christianpost.com/news/family-radio-founder-harold-camping-repents-apologizes-for-false-teachings.html.
Nigel Smith, “Adele’s new album 25 will not be streamed on Spotify,” The Guardian, Nov. 19, 2015, https://www.theguardian.com/music/2015/nov/19/adele-new-album-25-not-stream-spotify-apple-music.
Pamela Engel, “Taylor Swift Pulls All Of Her Albums From Spotify,” Business Insider, Nov. 3, 2014, https://www.businessinsider.com/taylor-swift-pulled-all-of-her-albums-from-spotify-2014-11.
Remi Rosmarin, “Here are the main differences between Amazon's two music streaming services, Prime Music and Amazon Music Unlimited,” Business Insider, June 26, 2019, https://www.businessinsider.com/prime-music-vs-amazon-music-unlimited.
Rick Paulas, “What Happened to Doomsday Prophet Harold Camping After the World Didn’t End?,” Vice, Nov. 7, 2014, https://www.vice.com/en_us/article/yvqkwb/life-after-doomsday-456.
Ronny Kohavi, “Trustworthy Online Controlled Experiments and the Risks of Uncontrolled Observational Studies,” Microsoft, 2019, https://exp-platform.com/Documents/2019-08%20CausalDiscoveryExPRonnyK.pdf.
Ryan Waniata and Parker Hall, “Spotify vs. Pandora: Which music streaming service is better for you?” Digital Trends, July 25, 2019, https://www.digitaltrends.com/music/spotify-vs-pandora/.
Sally Kaplan, “Amazon is offering 4 months of access to its music-streaming service for $1 as a Cyber Monday deal — here’s how to take advantage,” Business Insider, Dec. 2, 2019, https://www.businessinsider.com/amazon-music-unlimited-deal.
Seth Stevenson, “Mac Attack: Apple’s mean-spirited new ad campaign,” Slate, https://slate.com/business/2006/06/apple-s-mean-spirited-ad-campaign.html.
Steve Knopper, “Taylor Swift Abruptly Pulls Entire Catalog From Spotify,” Rolling Stone, Nov. 3, 2014, https://www.rollingstone.com/music/music-news/taylor-swift-abruptly-pulls-entire-catalog-from-spotify-55523/.
Thomas Germain, “Best Music Streaming Services,” Consumer Reports, Sept. 18, 2019, https://www.consumerreports.org/streaming-media/best-music-streaming-service-for-you/.
Ty Pendlebury and Xiomara Blanco, “Best music streaming: Spotify, Apple Music and more, compared,” CNET, Nov. 24, 2019, https://www.cnet.com/how-to/best-music-streaming-service-of-2019-spotify-pandora-apple-music/.
Victor Luckerson, “11 Wildly Popular Albums You Can’t Get on Spotify,” TIME, Mar. 29, 2016, https://time.com/4274430/spotify-albums/.
Wendy Melillo, “Amazon Price Test Nets Privacy Outcry,” AdWeek, Oct. 2, 2000, https://www.adweek.com/brand-marketing/amazon-price-test-nets-privacy-outcry-30060/.

Other:
Factiva.
SoundExchange Ex. 375, Declaration of Collin R. Jones (Jan. 7, 2020).
SoundExchange Ex. 376, David Reiley, “Suggestions for Further Reading, W241: Experiments and Causality,” 2015, https://docs.google.com/document/d/1IMsGTHmklhvetfJJfEm9dhoFM7bvb-YOkN_6mAM8kFM/edit#.
SoundExchange Ex. 404, David Reiley, “Field Experiments,” 2015, https://docs.google.com/document/d/1BDUxgzEk1vWXMiV2nzpZzoa7Hyp8I9LBiYmqA9XtZpo/edit.
SoundExchange Ex. 430, 2020-1-6 NonComms Stated and Allocated 2018.xlsx.
SoundExchange Ex. 431, NAB Stipulation.


Exhibits Sponsored by Catherine Tucker

Exhibit No. | Sponsored By | Description | Designation
SoundExchange Ex. 017 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 018 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 054 | Catherine Tucker; Robert Willig | [          ] | Restricted
SoundExchange Ex. 056 | Catherine Tucker; Robert Willig | [          ] | Restricted
SoundExchange Ex. 058 | Catherine Tucker; Robert Willig; Jonathan Orszag | [          ] | Restricted
SoundExchange Ex. 060 | Catherine Tucker; Robert Willig | [          ] | Restricted
SoundExchange Ex. 061 | Catherine Tucker; Robert Willig | [          ] | Restricted
SoundExchange Ex. 062 | Catherine Tucker; Robert Willig; Jonathan Orszag | [          ] | Restricted
SoundExchange Ex. 065 | Catherine Tucker; Jonathan Orszag; Robert Willig | [          ] | Restricted
SoundExchange Ex. 069 | Catherine Tucker | Annual Music Study 2018 Report RIAA April 2019 | Restricted
SoundExchange Ex. 070 | Catherine Tucker | Top Statutory Stated Liability through June 2019 | Restricted
SoundExchange Ex. 071 | Catherine Tucker | Noncomms and CPD stated and allocated 2018 Final | Public
SoundExchange Ex. 191 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 192 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 193 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 194 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 195 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 196 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 197 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 198 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 199 | Catherine Tucker | [          s] | Restricted
SoundExchange Ex. 200 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 205 | Robert Willig; Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 206 | Robert Willig; Jonathan Orszag; Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 207 | Robert Willig; Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 208 | Robert Willig; Catherine Tucker; Jonathan Orszag | [          ] | Restricted
SoundExchange Ex. 209 | Robert Willig; Jonathan Orszag; Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 210 | Robert Willig; Jonathan Orszag; Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 231 | Catherine Tucker; Robert Willig; Gal Zauberman; Itamar Simonson | [          ] | Restricted
SoundExchange Ex. 254 | Jonathan Orszag; Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 288 | Jonathan Orszag; Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 321 | Jonathan Orszag; Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 374 | Catherine Tucker | Consent Motion of the National Religious Broadcasters Noncommercial Music License Committee to Submit Corrected Written Direct Testimony of Joseph J. Cordes | Public
SoundExchange Ex. 375 | Catherine Tucker | Declaration of Cullin R. Jones | Public
SoundExchange Ex. 376 | Catherine Tucker | Suggestions for Further Reading | Public
SoundExchange Ex. 377 | Catherine Tucker | Deloitte Insights: Technology, Media, and Telecommunications Predictions, 2019 | Restricted
SoundExchange Ex. 378 | Catherine Tucker | Edison: The Infinite Dial 2018 | Restricted
SoundExchange Ex. 379 | Catherine Tucker | National Association of Broadcasters Radio Board of Directors, Minutes, Oct. 25, 2016 | Restricted
SoundExchange Ex. 380 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 381 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 382 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 383 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 384 | Catherine Tucker | Coleman Insights Media Research: The Image Pyramid | Restricted
SoundExchange Ex. 385 | Catherine Tucker | Coleman Insights Media Research: National Marketplace | Restricted
SoundExchange Ex. 386 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 387 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 388 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 389 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 390 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 391 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 392 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 393 | Catherine Tucker | Family Stations, Inc Form 8879-EO | Public
SoundExchange Ex. 394 | Catherine Tucker | NRBNMLC Board Members | Restricted
SoundExchange Ex. 395 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 396 | Catherine Tucker | Morgan Stanley - Revival: 5th Annual Music & Radio Survey | Restricted
SoundExchange Ex. 397 | Catherine Tucker | Jacobs Media: Radio's Survival Kit | Restricted
SoundExchange Ex. 398 | Catherine Tucker | MusicWatch: How US Consumers Listen to Music (Audiocensus Q4 2018) | Restricted
SoundExchange Ex. 399 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 400 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 401 | Catherine Tucker; Itamar Simonson | [          ] | Restricted
SoundExchange Ex. 402 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 403 | Catherine Tucker | [          ] | Restricted
SoundExchange Ex. 404 | Catherine Tucker | Dr. Reiley Course Syllabus | Public
SoundExchange Ex. 430 | Catherine Tucker | [          ] | Restricted

Proof of Delivery

I hereby certify that on Friday, July 31, 2020, I provided a true and correct copy of SoundExchange's Unopposed Motion to Submit the Corrected Written Rebuttal Testimony of Catherine Tucker to the following:

Google Inc., represented by Kenneth L Steinthal, served via ESERVICE at [email protected]

Pandora Media, LLC, represented by Benjamin E. Marks, served via ESERVICE at [email protected]

Educational Media Foundation, represented by David Oxenford, served via ESERVICE at [email protected]

National Association of Broadcasters, represented by Sarang V Damle, served via ESERVICE at [email protected]

Sirius XM Radio Inc., represented by Benjamin E. Marks, served via ESERVICE at [email protected]

National Religious Broadcasters Noncommercial Music License Committee, represented by Karyn K Ablin, served via ESERVICE at [email protected]

Signed: /s/ Previn Warren