
Health Misinformation and the Power of Narrative Messaging in the Public Sphere

Timothy Caulfield, Alessandro R. Marcon, Blake Murdoch, Jasmine M. Brown, Sarah Tinker Perrault, Jonathan Jerry, Jeremy Snyder, Samantha J. Anthony, Stephanie Brooks, Zubin Master, Ubaka Ogbogu, Joshua Greenberg, Amy Zarzeczny, Robyn Hyde-Lay

TC – Health Law Institute, Faculty of Law, School of Public Health, University of Alberta
ARM – Health Law Institute, Faculty of Law, University of Alberta
BM – Health Law Institute, Faculty of Law, University of Alberta
JMB – Director of Communications, Institute of Health Economics; Graduate Student, Department of Psychiatry, University of Alberta
STP – University Writing Program, University of California, Davis
JJ – Office for Science and Society, McGill University
JS – Faculty of Health Sciences, Simon Fraser University
SJA – The Hospital for Sick Children; Co-Investigator, CNTRP
SB – Department of Pediatrics, Faculty of Medicine, University of Alberta
ZM – Biomedical Ethics Research Program, Mayo Clinic
UO – Faculty of Law, Faculty of Pharmacy and Pharmaceutical Sciences, University of Alberta
AZ – Johnson Shoyama Graduate School of Public Policy, University of Regina
JG – School of Journalism and Communication, Faculty of Communication and Media Studies, Carleton University
RHL – Health Law Institute, Faculty of Law, University of Alberta

I. Introduction

There is growing recognition that numerous social, economic and academic pressures can have a negative impact on representations of biomedical research. Empirical evidence indicates that hype or interpretive spin is injected throughout the knowledge production process, including at the stage of grant writing, in the execution of the research, and in the production of the relevant manuscripts and institutional press releases [Boutron and Ravaud, 2018; Caulfield and Condit, 2012; Yavchitz et al., 2012]. The popular press, market forces, and online media also play significant roles in misrepresenting biomedical research [Caulfield and Condit, 2012; Kamenova, Reshef, and Caulfield, 2014; Turner and Knoepfler, 2016; Van Atteveldt, Van Aalderen-Smeets, Jacobi, and Ruigrok, 2014]. Here, the wilful or unintended dissemination of misinformation through increasingly interactive media platforms also stands as a very real and growing concern [Anderson and Rainie, 2017; Edelman, 2018] and presents a significant challenge to generating rational, evidence-based conversations about biomedicine generally, and its benefits and risks in particular.

Exacerbating these forces, numerous trends have emerged which further problematize the science communication landscape, especially in the context of health. In this commentary, we review several of the forces playing an increasingly pernicious role in how information is interpreted, shared and used. In particular, we focus on the role that narrative plays in communicating science and misinformation to explore how aspects of narrative are used in different social contexts and communication environments. While most people have an inherent understanding of storytelling, narrative has been defined as a type of communication that “describes the cause-and-effect relationships between events that take place over a particular time period that impact particular characters” [Dahlstrom, 2014]. Narrative is a powerful tool that can enhance engagement and understanding of both truths and falsehoods [Glaser, Garsoffky, and Schwan, 2009]. Online, and specifically in the context of misinformation (e.g. conspiracy narratives, pseudoscience, etc.), research has shown that the construction and dissemination of narratives requires the efforts of numerous participants with differing levels of engagement [Introne, Yildirim, Iandoli, DeCook, and Elzeini, 2018]. The resulting narrative is dynamic, incorporating numerous discursive elements (multiple sources, hashtags, hyperlinks, memes, etc.), as well as flexible, adopting and discarding both story elements and participants as time passes [Introne, Yildirim, Iandoli, DeCook, and Elzeini, 2018].

This commentary is not meant to be a comprehensive survey of emerging health communication developments or an exhaustive account of the role that narrative plays in that context. Rather, we highlight elements that have increasingly troubled health communication in recent times, and present creative responses that may help counter the negative trends. Indeed, as we will see, traditional methods of communication have in many ways failed the public, and changes in approach are required.

II. Online Communities

Much has been written about the growing influence of social media on public discourse, including in the context of health [Smailhodzic, Hooijsma, Boonstra, and Langley, 2016]. While traditional news outlets – such as newspapers – remain the dominant source of science information [Funk, Gottfried, and Mitchell, 2017], social media platforms, including Facebook, Twitter, Instagram, YouTube and Reddit, have become important sources of health information and sites for public engagement and community-building [Centola, 2010; De Choudhury and De, 2014; De Choudhury, Morris, and White, 2014; Fernández-Luque and Bau, 2015; Guilbeault, Becker, and Centola, 2018]. Studies show younger generations are more willing to share health information about themselves online than older generations, and people of all generations increasingly go online to seek others with similar health concerns or conditions for information and support [Graham, Cobb, and Cobb, 2016]. All age groups in North America use social media heavily [Gruzd, Jacobson, Mai, and Dubois, 2018; Smith and Anderson, 2018], so the influence of social networking platforms is likely to grow.

We know, however, that the health and science information on these platforms is often problematic [Antheunis, Tates, and Nieboer, 2013; Madathil, Rivera-Rodriguez, Greenstein, and Gramopadhye, 2015]. Research has found, for example, that social media are used to spread harmful health messages, including, to cite just a few examples, antivaccine rhetoric [Dunn, Leask, Zhou, Mandl, and Coiera, 2015; Tomeny, Vargo, and El-Toukhy, 2017], misinformation about the Zika virus [Sharma, Yadav, Yadav, and Ferdinand, 2017] and Lyme disease [Basch et al., 2017], as well as Ebola-related prevention and treatment strategies [Oyeyemi, Gabarron, and Wynn, 2014]. Alarmingly, a 2018 study of millions of Twitter interactions over a ten-year period found that falsehoods diffused “farther, faster, deeper and more broadly” than the truth [Vosoughi, Roy, and Aral, 2018]. And while some recent research has indicated notions of the “echo chamber” might be overstated [Dubois and Blank, 2018; Flaxman, Goel, and Rao, 2013], online environments do exhibit polarization characteristics and are spaces where misinformation can spread virally [Del Vicario et al., 2016]. Information dissemination throughout the public sphere is incredibly complex. While offline interactions in community settings such as clubs, organizations and social groups still carry significant weight, social media also clearly impacts bioscience communication. This impact affects not only the general public but also research funding and policy [Chafe, Born, Slutsky, and Laupacis, 2011; Pullman, Zarzeczny, and Picard, 2013].

Social media consists of diverse communication ecosystems, shaped by the algorithmic logic of each particular platform, use by varied demographics, and the resulting creation of unique communication trends and patterns. A core component shared by these ecosystems is the dynamic of “social homophily”, which explains how people more commonly associate with, and are more influenced by, those similar to themselves [Centola, 2010; Guilbeault et al., 2018; McPherson, Smith-Lovin, and Cook, 2001; Sunstein, 2017]. The clustering of individuals online into various communities places importance on the role that public intellectuals, celebrities, or influencers can play in knowledge transmission [Caulfield, 2015; Freberg, Graham, McGaughey, and Freberg, 2011; Khamis, Ang, and Welling, 2017], while platform algorithms are also becoming increasingly influential in shaping how information circulates [Muchnik, Aral, and Taylor, 2013; Striphas, 2016].

In more contentious social contexts, groups or communities invested in shaping public perceptions around particular topics can form, and at times, sharp divisions can emerge between various groups with differing perspectives. For example, research has shown that invested parties can create “discourse coalitions” [Hajer, 2002; Metze and Dodge, 2016], making use of communal terms and arguments to promote their causes [Attwell, Smith, and Ward, 2018; Marcon and Caulfield, 2017; Smart, 2011; Yardi and Boyd, 2010]. Here, heuristics such as confirmation bias [Stanovich and West, 2000] and information avoidance [Golman, Hagmann, and Loewenstein, 2017] can reinforce established beliefs. Research has shown there is polarization between antagonistic discourse communities [Schmidt, Zollo, Scala, Betsch, and Quattrociocchi, 2018; Sunstein, 2017; Yardi and Boyd, 2010], raising questions about how to get differing groups to communicate effectively and also, importantly, how to accurately and truthfully disseminate information to all groups [Lazer et al., 2018; Sunstein, 2017; Yardi and Boyd, 2010]. This matter is becoming increasingly complicated with the rise of misinformation, which, in the modern media landscape, has taken on the popular label of “fake news” [Lazer et al., 2018; Silverman, 2016]. Research has also shown that fake news impacts everyone – even those who know the information to be false [Fazio, Brashier, Payne, and Marsh, 2015]. Indeed, mere exposure to information can influence belief [Jolley and Douglas, 2014] and repeated exposures can strengthen perceptions of authenticity [Henkel and Mattson, 2011; Pennycook, Cannon, and Rand, 2018].
Online bots (software robots) are also playing a role by taking advantage of platforms’ algorithms to promote particular stories, events or narratives, drown out others, and influence online social ecosystems in ways that will require ongoing monitoring and research [Ferrara, Varol, Davis, Menczer, and Flammini, 2016; Lazer et al., 2018; McKelvey and Dubois, 2017].

A growing body of literature suggests that narratives can have tremendous sway. Across disciplines, studies have shown how narratives facilitate recall [Dahlstrom, 2014; Neimand, 2018] and spur emotional responses [Aldama, 2015; Barraza and Zak, 2009; Morgan, Movius, and Cody, 2009; Yoo, Kreuter, Lai, and Fu, 2014; Zak, 2015], which in turn can increase empathy [Barraza and Zak, 2009; Dahlstrom, 2014; Yoo et al., 2014] and perceptions of a source's trustworthiness [Cialdini, 2007; Farmer, McKay, and Tsakiris, 2014]. Narratives therefore possess some power of persuasion [Braddock and Dillard, 2016; Cialdini, 2007; Jones and Crow, 2017], whether that be to solidify one’s membership in a particular identity group [Jones and Crow, 2017; Neimand, 2018] or merely to draw one towards a particular perspective [Braddock and Dillard, 2016]. Recent research has shown how misinformation, and even credible information interpreted and then skewed in a particular manner, can serve as a means of substantiating a particular narrative [Introne et al., 2018]. As a result, a narrative can gain strength from the supportive “evidence” it creates and draws upon [Introne et al., 2018].

Social media platforms have become powerful tools for sharing narratives about therapies [Du, Rachul, Guo, and Caulfield, 2016], experiences [Han and Wiley, 2013] and emerging science. Social media also allows individuals to form parasocial relationships or “digital buddies” [Yuksel and Labrecque, 2016], which may heighten the influence of messaging [Chung and Cho, 2017] and strengthen social homophily. Indeed, research has noted that “a person like you” is just as credible a source of information as an academic or technical expert [Edelman, 2017]. Not all participation in online communities is necessarily negative. For instance, a platform like patientslikeme® allows individuals with similar conditions to share experiences and receive support, which in turn can help them cope with depression and overcome social stigma [De Choudhury et al., 2014; Graham et al., 2016; Griffiths et al., 2015]. However, when narratives generate emotional and empathetic responses on social media, both the source and the information presented can gain authority and traction, regardless of how reliable or accurate they may be [Morgan et al., 2009].

III. Implicit Hype and “Scienceploitation”

The phenomenon of science hype – the exaggeration or excessive promotion of scientific developments and applications – is getting more attention from the scientific community [Marcon, Bieber, and Caulfield, 2018; Master and Resnik, 2013] and popular media [Wetsman, 2018]. The sources of this hype are complex and interrelated, and they exist throughout the knowledge production pipeline [Caulfield and Condit, 2012]. Science hype can cause a range of social issues, including, inter alia, eroding public trust [Master and Resnik, 2013], confusing policy debates [Pullman et al., 2013], and facilitating the premature implementation of technologies and the marketing of unproven therapies [Caulfield, Sipp, Murry, Daley, and Kimmelman, 2016; Petersen and Krisjansen, 2015]. While the problems with explicit hype are increasingly recognized, we are now seeing the growth of a more subtle form of hype.

The popular press, for example, sometimes presents emerging therapies in a manner that implies efficacy [Rachul, Rasko, and Caulfield, 2017]. This “implicit hype” occurs when unproven or even disproven interventions are presented as routine and/or uncontroversial in media reports. For example, recent research about the media portrayal of platelet rich plasma (PRP), an unproven therapy for various ailments including musculoskeletal injuries, found that it was most commonly covered in sports-related stories, and specifically in relation to elite athletes using the therapy as part of injury recovery or performance preparation [Rachul et al., 2017]. The therapy was portrayed as routine, and its use by elite athletes may imply that it is a cutting-edge treatment [Rachul et al., 2017]. But given the actual state of research surrounding PRP [Engebretsen et al., 2010; Moraes, Lenza, Tamaoki, Faloppa, and Belloti, 2014], these representations are implicit hype. These stories may have significant sway with the public as they combine high exposure (story about a professional athlete in popular media), an interesting narrative (athlete recovering from injury), and a suggestion that an emerging therapy is efficacious. Since narrative communication is persuasive, this implicit hype may be more resonant with most audiences than typical communications about the unproven nature of a therapy.

Another issue is that of pseudoscience, that is to say theories, assertions or interventions that claim or appear to be scientific but are not. Pseudoscientific phraseology is too often accepted in popular media without any critical reflection. A recent study of Spanish science journalists found that only 44.9% agreed that pseudoscientific information in the media is dangerous, with many respondents dismissing concern or expressing apathy as to the effects of false messaging in the media [Cortiñas-Rovira, Alonso-Marcos, Pont-Sorribes, and Escribà-Sales, 2015]. Journalistic apathy toward the distinction between science and pseudoscience can only further hinder public understanding of novel health or biomedical developments, especially in cases where the lay public only has basic knowledge about the topic at hand.

There are also explicit marketing strategies that leverage hype. Recent research has shown, for example, that some complementary and alternative medicine (CAM) providers combine hype and stem cell language in their marketing for both unproven stem cell therapies and other pseudoscientific products and therapies [Murdoch, Zarzeczny, and Caulfield, 2018]. Similarly, the language of quantum physics [Szeto, Tomlinson, and Smart, 2018], genetics [Caulfield et al., 2015], and microbiome research [Bowles, 2017] has been used to market therapies that have not been scientifically tested. By capturing the interest around the scientific domain of stem cells, marketers can increase the attractiveness of, and exposure to, their products – even if they have no actual relation to stem cells. This phenomenon, which we call “scienceploitation”, occurs in many contexts but is understudied [Murdoch et al., 2018]. Because this type of misrepresentation uses language that can confer scientific legitimacy, it can be particularly difficult to address, especially if it is accompanied by other tokens of legitimacy (e.g., reference to publications in predatory journals or registered clinical trials) [Sipp et al., 2017] and is part of a broader, memorable narrative.

IV. Patients in the Public Sphere

Patients are also harnessing the power of narrative to promote public awareness, build community, and raise money and the profile of certain therapies. For example, the use of online crowdfunding has recently grown at an explosive rate [Massolution, 2015; Young and Scheinberg, 2017]. Health-related crowdfunding has proven to be a highly competitive affair, and campaign leaders often attempt to construct “worthy bodies” that justify or morally compel donation [Paulus and Roberts, 2017]. In this way, the creation of powerful and compelling narratives is a key aspect of successful crowdfunding [Berliner and Kenworthy, 2017; Kim, Hong, and Karahalios, 2018; Kim, Kong, Karahalios, Fu, and Hong, 2016]. A similar effect can occur with public solicitation for organ donation, where patients can be judged not only on their personal appearance but also on the biographical narratives they create to engender sympathy [McGee, 2005; Neidich, Neidich, Cooper, and Bramstedt, 2012].

Narratives often include information about the interventions sought and their efficacy, creating problems when these interventions are unproven or pseudoscientific. Indeed, recent research has shown that the narratives of crowdfunding campaigns for unproven stem cell therapies “underemphasize risks”, “exaggerate the efficacy” and “convey potentially misleading messages about stem-cell based interventions” [Snyder, Turner, and Crooks, 2018].

These examples show another way in which persuasive narratives can mislead. Marketing can extend into the personal narratives of individuals seeking aid, as campaigns often propagate the marketing language of the clinics where treatment is sought [Snyder et al., 2018]. This can act as a legitimizing force for unproven interventions, and legitimacy is subsequently reinforced when popular media outlets publish uncritical human-interest stories about such campaigns [Murdoch, Marcon, Downie, and Caulfield, forthcoming].

V. Policy Options

As noted, science communication is happening in the context of a research pipeline full of hype [Caulfield and Condit, 2012], a media environment rife with ambiguity and false balance [Clarke, 2008; Dixon and Clarke, 2013; Friedman, Dunwoody, and Rogers, 2012], and an online environment marred by fake news [Lazer et al., 2018; Silverman, 2016]. Meanwhile, the potential sway of misinformation is often heightened by the use of engaging narratives. These forces add to the complexity of crafting effective, evidence-based policy responses. Complicating things further is the reality that not all audiences are impacted by narratives in the same manner or to the same degree. Some research has shown, for example, that audiences engaging with a topic peripherally are more likely to find narratives convincing and persuasive than those highly motivated to engage with the topic and analyze the information [Braverman, 2008]. With a wide range of audiences encountering numerous and diverse topics in popular media at any given time, the role of narrative is likely having some impact on how the public makes sense of biomedical issues – particularly in the contexts of nascent, developing science and health topics about which little is known.

Addressing the spread of misinformation through persuasive narratives seems essential, though it will not be easy. Many of the entities that twist information operate over the Internet. When online sources and communities come under fire, they can quickly and easily spring up in a new form elsewhere. The law can be an unwieldy, slow and overly blunt tool in the face of amorphous messaging and shifting actors. Still, existing legal and regulatory tools can have important roles to play in the right contexts. We must better enforce existing truth-in-advertising law, which can act to curb misrepresentations in marketing and the proliferation of unproven and disproven treatments [Murdoch et al., 2018; Ogbogu, 2015; Sipp et al., 2017]. Given that this is a complaint-driven regulatory framework, non-profit organizations and individuals can play an important role, as we have seen, for example, with recent claims of false advertising made against Goop by the non-profit group Truth in Advertising [Helmore, 2017]. The law of negligent and fraudulent misrepresentation is also useful for all manner of claims that are false and relied upon [Caulfield, Ogbogu, and Robertson, 2013]. And when health care professionals are involved, as is often the case [Murdoch, Carr, and Caulfield, 2016], governing regulatory bodies should take steps against members who breach practice norms through the provision of misleading information [Zarzeczny et al., 2014; Munsie and Hyun, 2014].

Despite these useful avenues, law and policy have limits. They can be slow, expensive, and, when government action is needed, constrained by political considerations. As such, more informal policy responses should also be considered. Individual public advocacy, at both the grassroots level and among prominent experts, can have a significant effect [Abbott, 2015; Phillips, 2017; Scientific American (Editors), 2018]. For example, David Stephan, whose son died of meningitis in 2016 after his parents treated him solely with “natural remedies”, was removed as a keynote speaker from a wellness exposition in Western Canada after backlash on Twitter caused many corporate event sponsors to threaten to pull out if he was left on the program [Mattern, 2018]. One important aspect of this success story was the timeliness of the critical response online. Real-time social media interventions that rapidly counter misinformation are needed to ensure that belief systems founded on misinformation do not take hold [Tomeny et al., 2017]. Codified standards, norms, and guidelines in the scientific community defining appropriate media engagement – as some scientific societies have begun to develop [International Society for Stem Cell Research, 2016] – are imperative to encourage a sense of responsibility to engage with misinformation in the public sphere and correct it.

Importantly, it should also be possible to use a narrative communication style to improve public understanding of evidence-based medicine, both through social media and more traditional avenues. The power of social media and the impact of narrative are prevalent and strong, so there is an imperative to strategically draw on their advantages to counter some of their more problematic applications. For example, research has shown that narratives presenting the ramifications of not vaccinating – specifically children’s suffering from preventable illness – can have a real impact on intention to vaccinate [Shelby and Ernst, 2013; Capurro, Greenberg, Dubé and Driedger, 2018]. Additionally, clear and definitive statements with a narrative component, made by respected and trusted voices, will prove highly useful and also provide dependable resources on which journalists can rely.

Opinion editorials offer another useful pathway for narrative communication – indeed, recent research has found them to have an influence on public perception [Coppock, Ekins, and Kirby, 2018]. That said, science writing could also benefit from narrative style, if applied in a manner that does not compromise the truthfulness and comprehensiveness of the content [Perrault, 2013]. We shouldn’t use narratives to fight anecdote with anecdote. Rather, narratives can serve as a vehicle to communicate science and relevant science-informed policy in a more engaging and digestible manner. The spread of misinformation causes real harm. Unfortunately, countering this noise is increasingly complex and challenging. It will require the use of a host of science communication tools and strategies, including the creative use of narratives.

Acknowledgements

The authors would like to thank Alberta Innovates - Health Solutions, Genome Canada, Genome Alberta, the Canadian Institutes of Health Research, the Kidney Foundation of Canada, and the Stem Cell Network for their generous support of the following projects: PACE-’Omics: Personalized, Accessible, Cost Effective Applications of 'Omics Technologies; The Canadian National Transplant Research Program: Increasing Donation and Improving Transplantation Outcomes; Stem Cells and Misleading Marketing Claims; and The Next Step: Specific Steps for Addressing the Marketing of Unproven Stem Cell Therapies, all of which supported the event and contents of this publication. Additionally, the authors would like to thank all those who participated in Mapping the Emerging Issues in the Public Representation of Bioscience and Health Issues: David Hartell, Lori West, Mike Spear, Morgan Barber, and Evan Horbay.

Abbott, A. (2015). ‘Italian scientists slam selection of stem-cell trial.’ Nature 523, pp. 15-16.

Aldama, F.L. (2015). ‘The Science of Storytelling: Perspectives from Cognitive Science, Neuroscience, and the Humanities’. Projections 9(1), pp. 80-95.

Anderson, J. and Rainie, L. (2017). ‘The Future of Truth and Misinformation Online’. Pew Research Center; http://assets.pewresearch.org/wp-content/uploads/sites/14/2017/10/19095643/PI_2017.10.19_Future-of-Truth-and-Misinformation_FINAL.pdf

Antheunis, M.L., Tates, K., and Nieboer, T.E. (2013). ‘Patients’ and health professionals’ use of social media in health care: motives, barriers and expectations’. Patient Education and Counselling 92(3), pp. 426-431.

Attwell, K., Smith, D.T., and Ward, P.R. (2018). ‘‘The Unhealthy Other’: How vaccine rejecting parents construct the vaccinating mainstream’. Vaccine 36(12), pp. 1621-1626.

Barraza, J.A. and Zak, P.J. (2009). ‘Empathy toward strangers triggers oxytocin release and subsequent generosity’. Annals of the New York Academy of Sciences 1167(1), pp. 182-189.

Basch, C.H., Mullican, L.A., Boone, K.D., Yin, J., Berdnik, A., Eremeeva, M.E., and Fung, I.C. (2017). ‘Lyme Disease and YouTube™: A Cross-Sectional Study of Video Contents’. Osong Public Health and Research Perspectives 8(4), pp. 289-292.

Berliner, L.S. and Kenworthy, N.J. (2017). ‘Producing a worthy illness: Personal crowdfunding amidst financial crisis’. Social Science & Medicine 187, pp. 233-242.

Boutron, I. and Ravaud, P. (2018). ‘Misrepresentation and distortion of research in biomedical literature’. Proceedings of the National Academy of Sciences 115(11), pp. 2613-2619.

Bowles, B. (2017, December 29). ‘Unfiltered Fervor: The Rush to Get Off the Water Grid’. The New York Times; https://www.nytimes.com/2017/12/29/dining/raw-water-unfiltered.html

Braddock, K. and Dillard, J.P. (2016). ‘Meta-analytic evidence for the persuasive effect of narratives on beliefs, attitudes, intentions, and behaviors’. Communication Monographs 83(4), pp. 446-467.

Braverman, J. (2008). ‘Testimonials Versus Informational Persuasive Messages: The Moderating Effect of Delivery Mode and Personal Involvement’. Communication Research 25(5), pp. 666-694.

Capurro, G., Greenberg, J., Dubé, E., and Driedger, M. (2018). ‘Measles, Moral Regulation and the Social Construction of Risk: Media Narratives of “Anti-Vaxxers” and the 2015 Disneyland Outbreak’. Canadian Journal of Sociology 43(1), pp. 25-48.

Caulfield, T. (2015). Is Gwyneth Paltrow Wrong about Everything?: when celebrity culture and science clash. Toronto, ON: Penguin Canada.

Caulfield, T., Borry, P., Toews, M., Elger, B.S., Greely, H.T., and McGuire, A. (2015). ‘Marginally scientific? Genetic testing of children and adolescents for lifestyle and health promotion’. Journal of Law and the Biosciences 2(3), pp. 627–644.

Caulfield, T. and Condit, C. (2012). ‘Science and the sources of hype’. Public Health Genomics 15(3-4), pp. 209-217.

Caulfield, T., Ogbogu, U., and Robertson, G. (2013). ‘Commentary: the law, unproven CAM and the referral challenge’. Focus on Alternative and Complementary Therapies 18(1), pp. 1-7.

Caulfield, T., Sipp, D., Murry, C.E., Daley, G.Q., and Kimmelman, J. (2016). ‘Confronting stem cell hype’. Science 352(6287), pp. 776-777.

Centola, D. (2010). ‘The spread of behavior in an online social network experiment’. Science 329(5996), pp. 1194-1197.

Chafe, R., Born, K.B., Slutsky, A.S., and Laupacis, A. (2011). ‘The rise of people power’. Nature 472(7344), pp. 410-411.

Chung, S. and Cho, H. (2017). ‘Fostering parasocial relationships with celebrities on social media: Implications for celebrity endorsement’. Psychology & Marketing 34(4), pp. 481-495.

Cialdini, R.B. (2007). Influence: The psychology of persuasion. New York, NY: Collins.

Clarke, C.E. (2008). ‘A question of balance: The autism-vaccine controversy in the British and American elite press’. Science Communication 30(1), pp. 77-107.

Coppock, A., Ekins, E., and Kirby, D. (2018). ‘The Long-lasting Effects of Newspaper Op-Eds on Public Opinion’. Quarterly Journal of Political Science 13(1), pp. 59-87.

Cortiñas-Rovira, S., Alonso-Marcos, F., Pont-Sorribes, C., and Escribà-Sales, E. (2015). ‘Science journalists’ perceptions and attitudes to pseudoscience in Spain’. Public Understanding of Science 24(4), pp. 450-465.

Dahlstrom, M.F. (2014). ‘Using narratives and storytelling to communicate science with nonexpert audiences’. Proceedings of the National Academy of Sciences 111(4), pp. 13614-13620.

De Choudhury, M. and De, S. (2014). Mental Health Discourse on reddit: Self-Disclosure, Social Support, and Anonymity. Proceedings of the Eighth International AAAI Conference on Weblogs and Social Media; https://pdfs.semanticscholar.org/2db7/15a479c8961d3020fe906f7bedfa0311b937.pdf?_ga=2.68033489.1532648433.1533745958-1435122337.1533745958.

De Choudhury, M., Morris, M., and White, R. (2014). Seeking and sharing health information online: Comparing search engines and social media. Proceedings of the Conference On Human Factors In Computing Systems; doi:10.1145/2556288.2557214.

Del Vicario, M., Bessi, A., Zollo, F., Petroni, F., Scala, A., Caldarelli, G.,...Quattrociocchi, W. (2016). ‘The spreading of misinformation online’. Proceedings of the National Academy of Sciences 113(3), pp. 554-559.

Dixon, G.N. and Clarke, C.E. (2013). ‘Heightening uncertainty around certain science: media coverage, false balance and the autism-vaccine controversy’. Science Communication 35(3), pp. 358-382.

Du, L., Rachul, C., Guo, Z., and Caulfield, T. (2016). ‘Gordie Howe’s “miraculous treatment”: Case study of Twitter users’ reactions to a sport celebrity’s stem cell treatment’. JMIR Public Health and Surveillance 2(1), p. e8.

Dubois, E. and Blank, G. (2018). ‘The echo chamber is overstated: the moderating effect of political interest and diverse media’. Information, Communication & Society 21(5), pp. 729-745.

Dunn, A.G., Leask, J., Zhou, X., Mandl, K.D., and Coiera, E. (2015). ‘Associations between exposure to and expression of negative opinions about human papillomavirus vaccines on social media: an observational study’. Journal of Medical Internet Research 17(6), p. e144.

Edelman. (2017). 2017 Edelman Trust Barometer; https://www.edelman.com/trust2017/

Edelman. (2018). 2018 Edelman Trust Barometer Global Report; https://cms.edelman.com/sites/default/files/2018-01/2018%20Edelman%20Trust%20Barometer%20Global%20Report.pdf.

Engebretsen, L., Steffen, K., Alsousou, J., Anitua, E., Bachl, N., Devilee, R.,...Kelberine, F. (2010). ‘IOC consensus paper on the use of platelet-rich plasma in sports medicine’. British Journal of Sports Medicine 44(15), pp. 1072-1081.

Farmer, H., McKay, R., and Tsakiris, M. (2014). ‘Trust in me: Trustworthy others are seen as more physically similar to the self’. Psychological Science 25(1), pp. 290-292.

Fazio, L.K., Brashier, N.M., Payne, B.K., and Marsh, E.J. (2015). ‘Knowledge does not protect against illusory truth’. Journal of Experimental Psychology: General 144(5), pp. 993-1002.

Fernández-Luque, L. and Bau, T. (2015). ‘Health and social media: perfect storm of information’. Healthcare Informatics Research 21(2), pp. 67-73.

Ferrara, E., Varol, O., Davis, C., Menczer, F., and Flammini, A. (2016). ‘The rise of social bots’. Communications of the ACM 59(7), pp. 96-104.

Flaxman, S., Goel, S., and Rao, J.M. (2013). ‘Ideological segregation and the effects of social media on news consumption’. Becker Friedman Institute, University of Chicago; https://bfi.uchicago.edu/sites/default/files/research/flaxman_goel_rao_onlinenews.pdf.

Freberg, K., Graham, K., McGaughey, K., and Freberg, L.A. (2011). ‘Who are the social media influencers? A study of public perceptions of personality’. Public Relations Review 37(1), pp. 90-92.

Friedman, S.M., Dunwoody, S., and Rogers, C.L. (2012). Communicating uncertainty: Media coverage of new and controversial science. Abingdon-on-Thames, UK: Routledge.

Funk, C., Gottfried, J., and Mitchell, A. (2017). ‘Science News and Information Today’. Pew Research Center; http://assets.pewresearch.org/wp-content/uploads/sites/13/2017/09/14122431/PJ_2017.09.20_Science-and-News_FINAL.pdf.

Glaser, M., Garsoffky, B., and Schwan, S. (2009). ‘Narrative-based learning: Possible benefits and problems’. Communications 34(4), pp. 429-447.

Golman, R., Hagmann, D., and Loewenstein, G. (2017). ‘Information avoidance’. Journal of Economic Literature 55(1), pp. 96-135.

Graham, A.L., Cobb, C.O., and Cobb, N.K. (2016). The Internet, Social Media, and Health Decision-Making. In Diefenbach M., Miller-Halegoua S., Bowen D. (eds), Handbook of Health Decision Science (pp. 335-355). New York, NY: Springer.

Griffiths, F., Dobermann, T., Cave, J.A., Thorogood, M., Johnson, S., Salamatian, K.,...Goudge, J. (2015). ‘The Impact of Online Social Networks on Health and Health Systems: a scoping review and case studies’. Policy & Internet 7(4), pp. 473-496.

Gruzd, A., Jacobson, J., Mai, P., and Dubois, E. (2018). ‘The State of Social Media in Canada 2017’ Version: 1.0. Ryerson University Social Media Lab; doi:10.5683/SP/AL8Z6R.

Guilbeault, D., Becker, J., and Centola, D. (2018). Complex Contagions: A Decade in Review. In Lehmann, S. and Ahn, Y. (eds.), Spreading Dynamics in Social Systems. Springer Nature (Forthcoming).

Hajer, M.A. (2002). Discourse coalitions and the institutionalization of practice: the case of acid rain in Great Britain. In Fischer, F. and Forester, J. (eds), The Argumentative Turn in Policy Analysis and Planning (pp. 51-84). Abingdon-on-Thames, UK: Routledge.

Han, J. and Wiley, J. (2013). Digital Illness Narratives: A New Form of Health Communication. Transactions of the International Conference on Health Information Technology Advancement 2(1); https://scholarworks.wmich.edu/ichita_transactions/18.

Helmore, E. (2017, August 24). Gwyneth Paltrow’s Goop faces new false advertising claims. The Guardian; https://www.theguardian.com/film/2017/aug/24/gwyneth-paltrows-goop-faces-new-false-advertising-claims.

Henkel, L.A. and Mattson, M.E. (2011). ‘Reading is believing: The truth effect and source credibility’. Consciousness and Cognition 20(4), pp. 1705-1721.

International Society for Stem Cell Research. (2016). Guidelines for stem cell research and clinical translation; http://www.isscr.org/docs/default-source/all-isscr-guidelines/guidelines-2016/isscr-guidelines-for-stem-cell-research-and-clinical-translation.pdf?sfvrsn=4.

Introne, J., Yildirim, I. G., Iandoli, L., Decook, J., and Elzeini, S. (2018). ‘How people weave online information into pseudo knowledge’. Social Media + Society 4(3); https://doi.org/10.1177/2056305118785639

Jolley, D. and Douglas, K.M. (2014). ‘The effects of anti-vaccine conspiracy theories on vaccination intentions’. PLoS One 9(2), p. e89177.

Jones, M. and Crow, D. (2017). ‘How can we use the ‘science of stories’ to produce persuasive scientific stories?’. Palgrave Communications 3(1); doi: 10.1057/s41599-017-0047-7.

Kamenova, K., Reshef, A., and Caulfield, T. (2014). ‘Angelina Jolie’s faulty gene: newspaper coverage of a celebrity’s preventive bilateral mastectomy in Canada, the United States, and the United Kingdom’. Genetics in Medicine 16(7), pp. 522-528.

Khamis, S., Ang, L., and Welling, R. (2017). ‘Self-branding, ‘micro-celebrity’ and the rise of Social Media Influencers’. Celebrity Studies 8(2), pp. 191-208.

Kim, J.G., Hong, H., and Karahalios, K. (2018). Understanding Identity Presentation in Medical Crowdfunding. Proceedings of the 2018 CHI Conference on Human Factors in Computing Systems; http://social.cs.uiuc.edu/papers/Kim-CHI18.pdf.

Kim J.G., Kong, H.K., Karahalios, K., Fu, W.T., and Hong, H. (2016). The power of collective endorsements: credibility factors in medical crowdfunding campaigns. Proceedings of the 2016 CHI Conference on Human Factors in Computing Systems; http://jgkim2.web.engr.illinois.edu/papers/medicalcrowdfunding.chi16.kim.pdf

Lazer, D.M., Baum, M.A., Benkler, Y., Berinsky, A.J., Greenhill, K.M., Menczer, F.,...Schudson M. (2018). ‘The science of fake news’. Science 359(6380), pp. 1094-1096.

Madathil, K.C., Rivera-Rodriguez, A.J., Greenstein, J.S., and Gramopadhye, A.K. (2015). ‘Healthcare information on YouTube: a systematic review’. Health Informatics Journal 21(3), pp. 173-194.

Marcon, A.R., Bieber, M., and Caulfield, T. (2018). ‘Representing a “revolution”: how the popular press has portrayed personalized medicine’. Genetics in Medicine; doi:10.1038/gim.2017.217

Marcon, A.R. and Caulfield, T. (2017). ‘Commenting on chiropractic: A YouTube analysis’. Cogent Medicine, 4(1); doi: 10.1080/2331205X.2016.1277450

Massolution. (2015). 2015CF Crowdfunding Industry Report; http://reports.crowdsourcing.org/index.php?route=product/product&product_id=54

Master, Z. and Resnik, D.B. (2013). ‘Hype and Public Trust in Science’. Science and Engineering Ethics, 19(2), pp. 321-335.

Mattern, A. (2018, February 11). Father convicted in son’s meningitis death will not speak at wellness expos after backlash. Canadian Broadcasting Corporation; http://www.cbc.ca/news/canada/saskatoon/david-stephan-son-meningitis-death-speaker-wellness-expo-1.4530355.

McGee, E.M. (2005). ‘Using Personal Narratives to Encourage Organ Donation’. The American Journal of BioEthics, 5(4), pp. 19-20.

McKelvey, F. and Dubois, E. (2017). ‘Computational Propaganda in Canada: The Use of Political Bots’. The Computational Propaganda Research Project, working paper No. 2017.6; http://blogs.oii.ox.ac.uk/politicalbots/wp-content/uploads/sites/89/2017/06/Comprop-Canada.pdf.

McPherson, M., Smith-Lovin, L., and Cook, J.M. (2001). ‘Birds of a Feather: Homophily in Social Networks’. Annual Review of Sociology 27(1), pp. 415-444.

Metze, T. and Dodge, J. (2016). ‘Dynamic discourse coalitions on hydro-fracking in Europe and the United States’. Environmental Communication 10(3), pp. 365-379.

Moraes, V.Y., Lenza, M., Tamaoki, M.J., Faloppa, F., and Belloti, J.C. (2014). ‘Platelet rich therapies for musculoskeletal soft tissue injuries’. Cochrane Database of Systematic Reviews 4; doi: 10.1002/14651858.CD010071.pub3.

Morgan, S.E., Movius, L., and Cody, M.J. (2009). ‘The Power of Narratives: The Effect of Entertainment Television Organ Donation Storylines on the Attitudes, Knowledge, and Behaviors of Donors and Nondonors’. Journal of Communication 59(1), pp. 135-151.

Muchnik, L., Aral, S., and Taylor, S.J. (2013). ‘Social Influence Bias: A Randomized Experiment’. Science 341(6146), pp. 647-651.

Munsie, M., and Hyun, I. (2014). ‘A question of ethics: Selling autologous stem cell therapies flaunts professional standards.’ Stem Cell Research 13(3b), pp. 647-653.

Murdoch, B., Carr, S., and Caulfield, T. (2016). ‘Selling falsehoods? A cross-sectional study of Canadian naturopathy, homeopathy, chiropractic and acupuncture clinic website claims relating to allergy and asthma’. BMJ Open 6(12), p. e014028.

Murdoch, B., Marcon, A.R., Downie, D., and Caulfield, T. (forthcoming). The Portrayal of Medical Crowdfunding in the Popular Press.

Murdoch, B., Zarzeczny, A., and Caulfield, T. (2018). ‘Exploiting science? A systematic analysis of complementary and alternative medicine clinic websites’ marketing of stem cell therapies’. BMJ Open 8(2), p. e019414.

Neidich, E.M., Neidich, A.B., Cooper, J.T., and Bramstedt, K.A. (2012). ‘The ethical complexities of online organ solicitation via donor–patient websites: avoiding the “beauty contest”’. American Journal of Transplantation 12(1), pp. 43-47.

Neimand, A. (2018, May 7). How to Tell Stories About Complex Issues. Stanford Social Innovation Review; https://ssir.org/articles/entry/how_to_tell_stories_about_complex_issues

Ogbogu, U. (2015). ‘Combatting Unlicensed Stem Cell Interventions through Truthful Advertising Law: A Survey of Regulatory Trends’. McGill Journal of Law and Health 9(2), pp. 311-335.

Oyeyemi, S.O., Gabarron, E., and Wynn, R. (2014). ‘Ebola, Twitter, and misinformation: a dangerous combination?’. BMJ 349, p. g6178.

Paulus, T.M. and Roberts, K.R. (2017). ‘Crowdfunding a “Real-life Superhero”: The construction of worthy bodies in medical campaign narratives’. Discourse, Context & Media 21, pp. 64-72.

Pennycook, G., Cannon, T., and Rand, D.G. (2018). ‘Prior exposure increases perceived accuracy of fake news’. Journal of Experimental Psychology: General (forthcoming).

Perrault, S. (2013). Communicating popular science: From deficit to democracy. Basingstoke, UK: Palgrave Macmillan.

Petersen, A. and Krisjansen, I. (2015). ‘Assembling ‘the bioeconomy’: Exploiting the power of the promissory life sciences’. Journal of Sociology 51(1), pp. 28-46.

Phillips, K. (2017, January 22). No, Gwyneth Paltrow, women should not put jade eggs in their vaginas, gynecologist says. The Washington Post; https://www.washingtonpost.com/news/to-your-health/wp/2017/01/22/no-gwyneth-paltrow-women-should-not-put-jade-eggs-in-their-vaginas-gynecologist-says/.

Pullman, D., Zarzeczny, A., and Picard, A. (2013). ‘Media, politics and science policy: MS and evidence from the CCSVI trenches’. BMC Medical Ethics 14(1), p. 6.

Rachul, C., Rasko, J.E., and Caulfield, T. (2017). ‘Implicit hype? Representations of platelet rich plasma in the news media’. PloS One 12(8), p. e0182496.

Schmidt, A.L., Zollo, F., Scala, A., Betsch, C., and Quattrociocchi, W. (2018). ‘Polarization of the Vaccination Debate on Facebook’. Vaccine 36(25), pp. 3606-3612.

Scientific American (Editors). (2018, February 1). Universities Should Encourage Scientists to Speak Out about Public Issues. Scientific American; https://www.scientificamerican.com/article/universities-should-encourage-scientists-to-speak-out-about-public-issues/.

Sharma, M., Yadav, K., Yadav, N., and Ferdinand, K.C. (2017). ‘Zika virus pandemic—analysis of Facebook as a social media health information platform’. American Journal of Infection Control 45(3), pp. 301-302.

Shelby, A. and Ernst, K. (2013). ‘Story and science: how providers and parents can utilize storytelling to combat anti-vaccine misinformation’. Human Vaccines & Immunotherapeutics 9(8), pp. 1795-1801.

Silverman, C. (2016, November 16). This Analysis Shows How Viral Fake Election News Stories Outperformed Real News on Facebook. BuzzFeed News; https://www.buzzfeed.com/craigsilverman/viral-fake-election-news-outperformed-real-news-on-facebook.

Sipp, D., Caulfield, T., Kaye, J., Barfoot, J., Blackburn, C., Chan, S.,...Sleeboom-Faulkner, M. (2017). ‘Marketing of unproven stem cell–based interventions: A call to action’. Science Translational Medicine 9(397), p. eaag0426.

Smailhodzic, E., Hooijsma, W., Boonstra, A., and Langley, D.J. (2016). ‘Social media use in healthcare: a systematic review of effects on patients and on their relationship with healthcare professionals’. BMC Health Services Research 16(1), p. 442.

Smart, G. (2011). Argumentation across Web-based organizational discourses: The case of climate change. In Candlin, C.N. and Sarangi, S. (eds), Handbook of Communication in Organisations and Professions (pp. 363-386). Berlin, DE: De Gruyter Mouton.

Smith, A. and Anderson, M. (2018). ‘Social Media Use in 2018’. Pew Research Center; http://www.pewinternet.org/2018/03/01/social-media-use-in-2018/

Snyder, J., Turner, L., and Crooks, V.A. (2018). ‘Crowdfunding for Unproven Stem Cell–Based Interventions.’ JAMA 319(18), pp. 1935-1936.

Stanovich, K.E. and West, R.F. (2000). ‘Individual differences in reasoning: implications for the rationality debate?’. Behavioral and Brain Sciences 23, pp. 645–665.

Striphas, T. (2015). ‘Algorithmic culture’. European Journal of Cultural Studies 18(4-5), pp. 395-412.

Sunstein, C.R. (2017). #Republic: Divided Democracy in the Age of Social Media. Princeton, NJ: Princeton University Press.

Szeto, E., Tomlinson, A., and Smart, V. (2018, February 2). 'This is snake oil': Scientists don't buy balance-boosting clips featured on Dragons' Den. Canadian Broadcasting Corporation; http://www.cbc.ca/news/business/quantum-wellness-clips-marketplace-1.4513382.

Tomeny, T.S., Vargo, C.J., and El-Toukhy, S. (2017). ‘Geographic and demographic correlates of autism-related anti-vaccine beliefs on Twitter, 2009-15’. Social Science & Medicine 191, pp. 168-175.

Turner, L. and Knoepfler, P. (2016). ‘Selling Stem Cells in the USA: Assessing the Direct-to-Consumer Industry’. Cell Stem Cell 19(2), pp. 154-157.

Van Atteveldt, N.M., Van Aalderen-Smeets, S.I., Jacobi, C., and Ruigrok, N. (2014). ‘Media reporting of neuroscience depends on timing, topic and newspaper type’. PLoS One 9(8), p. e104780.

Vosoughi, S., Roy, D., and Aral, S. (2018). ‘The spread of true and false news online’. Science 359(6380), pp. 1146-1151.

Wetsman, N. (2018, March 5). When splashy headlines become the goal of science, the process suffers. Popular Science; https://www.popsci.com/hype-impact-factor-bad-science.

Yardi, S. and Boyd, D. (2010). ‘Dynamic debates: An Analysis of Group Polarization Over Time on Twitter’. Bulletin of Science, Technology & Society 30(5), pp. 316-327.

Yavchitz, A., Boutron, I., Bafeta, A., Marroun, I., Charles, P., Mantz, J., and Ravaud, P. (2012). ‘Misrepresentation of Randomized Controlled Trials in Press Releases and News Coverage: Cohort Study’. PLoS Medicine 9(9), p. e1001308.

Yoo, J.H., Kreuter, M.W., Lai, C., and Fu, Q. (2014). ‘Understanding Narrative Effects: The Role of Discrete Negative Emotions on Message Processing and Attitudes Among Low-Income African American Women’. Health Communication 29, pp. 494-504.

Young, M.J. and Scheinberg, E. (2017). ‘The Rise of Crowdfunding for Medical Care: Promises and Perils’. JAMA 317(16), pp. 1623-1624.

Yuksel, M. and Labrecque, L.I. (2016). ‘“Digital buddies”: parasocial interactions in social media’. Journal of Research in Interactive Marketing 10(4), pp. 305-320.

Zak, P.J. (2015). ‘Why inspiring stories make us react: The neuroscience of narrative’. Cerebrum: The Dana Forum on Brain Science (January-February); https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4445577/pdf/cer-02-15.pdf.

Zarzeczny, A., Caulfield, T., Ogbogu, U., Bell, P., Crooks, V.A., Kamenova, K., Master, Z., Rachul, C., Snyder, J., Toews, M., and Zoeller, S. (2014). ‘Professional Regulation: A Potentially Valuable Tool in Responding to “Stem Cell Tourism”’. Stem Cell Reports 3(3), pp. 379-384.