
MODERATION POLICIES AND STRATEGIES OF FOUR NEWS SITES FOR USER GENERATED CONTENT

By

ANTIONETTE ROLLINS

A THESIS PRESENTED TO THE GRADUATE SCHOOL OF THE UNIVERSITY OF FLORIDA IN PARTIAL FULFILLMENT OF THE REQUIREMENTS FOR THE DEGREE OF MASTER OF ARTS IN MASS COMMUNICATION

UNIVERSITY OF FLORIDA

2013


© 2013 Antionette Rollins


To my parents, thank you for all of your love and support


ACKNOWLEDGMENTS

Although I am the author of this thesis, this certainly wasn’t a solo effort. There are many people who deserve to be acknowledged for their hard work, guidance and support. I would like to thank my Chair Dr. Julie Dodd, who worked tirelessly throughout this process. Dr. Dodd’s knowledge and advice helped me tremendously and I have learned so much about research, hard work and perseverance. Thank you for your vision and enthusiasm, Dr. Dodd.

My committee members Drs. Judy Robinson and Laurence Alexander also deserve to be acknowledged. I thank you both for sharing your knowledge and time throughout this process. Your suggestions forced me to think critically and ultimately strengthened this thesis. I would also like to thank Dr. Robinson for contributing to my interest in online journalism through the course Multimedia Blogging.

Thank you to all of the interviewees that kindly participated in this research. I learned so much from speaking with each one of you, and I appreciate you for taking time out of your busy schedules to contribute.

I would also like to thank every professor and classmate that I have had the pleasure of learning from in the College of Journalism and Communications. Thank you for providing spaces that allowed me to think, engage in healthy discourse and expand my ideas about media.

Lastly, I would like to thank my parents Yolanda and Charles Williams, my friend Dan Mathis and other family and friends who continuously took the time to listen to my ideas, edit drafts and provide encouragement throughout this process.

TABLE OF CONTENTS

page

ACKNOWLEDGMENTS ...... 4

LIST OF FIGURES ...... 7

ABSTRACT ...... 8

CHAPTER

1 INTRODUCTION ...... 10

2 LITERATURE REVIEW ...... 17

Online Comments ...... 20
Anonymity in Computer-Mediated Communication ...... 22
Comment Moderation ...... 26
Legal Aspect of User-Generated Content for News Organizations ...... 28
Gatekeeping Theory ...... 29

3 METHODOLOGY ...... 30

4 FINDINGS ...... 34

Cox Media Group ...... 34
The Palm Beach Post ...... 35
User Registration ...... 36
The Moderators and Moderation Strategies ...... 37
Moderation Advice and the Future of Online Commenting ...... 39
The Atlanta Journal-Constitution ...... 41
User Registration ...... 42
Moderation Advice and the Future of Online Commenting ...... 43
Dayton Daily News ...... 43
User Registration ...... 44
The Moderators and Moderation Strategies ...... 44
Moderation Advice and the Future of Online Commenting ...... 45
Tribune Company ...... 47
Orlando Sentinel ...... 47
Comment Policy ...... 49
User Registration ...... 49
The Moderators and Moderation Strategies ...... 49
Moderation Advice and the Future of Online Commenting ...... 53

5 DISCUSSION ...... 54

The New Gatekeeper ...... 55


Building an Online Community ...... 57
Crowd-Sourced Moderation ...... 59
Anonymity in Online Commenting ...... 60
The Role of Social Media in Online Commenting ...... 61
Suggestions for Future Research ...... 62

APPENDIX

A VISITOR AGREEMENT ...... 65

B ORLANDO SENTINEL TERMS OF SERVICE ...... 68

C THE PALM BEACH POST LETTERS TO THE EDITOR POLICY ...... 71

D THE ATLANTA JOURNAL-CONSTITUTION LETTERS TO THE EDITOR POLICY ...... 72

E DAYTON DAILY NEWS LETTERS TO THE EDITOR POLICY ...... 73

F ORLANDO SENTINEL LETTERS TO THE EDITOR POLICY ...... 74

G INTERVIEW QUESTIONS ...... 75

H SAMPLE EMAIL ...... 76

REFERENCES ...... 77

BIOGRAPHICAL SKETCH ...... 83


LIST OF FIGURES

Figure page

1-1 Online reader comment ...... 15

1-2 Online reader comments ...... 16


Abstract of Thesis Presented to the Graduate School of the University of Florida in Partial Fulfillment of the Requirements for the Degree of Master of Arts in Mass Communication

MODERATION POLICIES AND STRATEGIES OF FOUR NEWS SITES FOR USER GENERATED CONTENT

By

Antionette Rollins

May 2013

Chair: Julie Dodd
Major: Mass Communication

While traditional forms of interactivity such as letters to the editor were still significant in the 21st Century, the Internet gave the public new ways to communicate with journalists and other readers. One popular interactivity tool was online commenting, which allowed individuals to leave feedback on news stories. However, commenting had drawbacks, such as incivility, so news sites moderated comments using various techniques.

Research was conducted to understand the moderation strategies of online news sites and to determine the issues that moderators considered when reviewing comments. A case study involving interviews with six professionals associated with four news sites from two media corporations found that the goal of moderators was both to promote civil commenting and to build online communities. News sites simultaneously fostered community and controlled unwanted comments by using crowd-sourced moderation, with all four newspapers letting readers serve as the first line of defense against inappropriate comments. Moderators also removed comments that contained personal attacks, hate speech, profanity or libel. These practices served to empower users, promote safe commenting environments and make comment review a collaborative effort between news organizations and readers.

Although moderators used practices similar to those of letters editors in determining which comments remained posted, gatekeeping has transformed on the Web. Not only do readers have more control over what comments are presented, but news site moderators said that comments provided reporters with ideas for stories and identified potential sources. These findings can be helpful to educators as they determine ways to incorporate moderation skills into journalism curricula.

CHAPTER 1
INTRODUCTION

The Internet significantly impacted the daily lives of Americans in the 21st Century by allowing people to complete activities such as shopping, conducting business, communicating and consuming media on the Web. The Internet also made it possible for everyday people to produce, publish and disseminate media with greater ease than ever before. This user-generated content could be seen in the form of Facebook posts, weblogs and videos. User-generated content can be defined as the content that an everyday individual publishes, such as comments, videos and photos (Hermida, 2008).

The Web also impacted journalism, as an increasing number of newspapers began to have an online presence where journalists could publish content specifically for the Web, make print stories available on websites and interact with readers in new ways. Interaction between journalists and the public transitioned from established means such as letters to the editor and radio phone-ins to electronic communication through emails and message boards (Wahl-Jorgensen, 2007). One of the most popular ways to promote interactivity on news websites during this time was through the comment section, which allowed users the opportunity to give feedback to journalists and create discourse with other users. This exchange of ideas and public discourse sometimes developed into an online community where users and journalists felt free to share, communicate and learn from each other.

Commenting allowed users to contribute their voices to local, national and global news. Stories that garnered a great amount of media attention also received many comments. For example, a May 2012 article posted on CNN’s website about the high-profile Trayvon Martin case, which involved a Black teen who was killed in Florida by a man who claimed the killing was in self-defense, received 8,709 comments (Figure 1-1; Figure 1-2) by February 2013 (Botelho, 2012). Another article about the case posted on USA Today’s website in March 2012 had received 336 comments by February 2013 (Alcindor, Bello & Copeland, 2012).

While commenting increased the interaction between users and journalists and thought-provoking and healthy discourse existed, some members of the online community left hateful, and sometimes irrelevant, remarks. These comments may have also contained libelous language that could put the publication in legal trouble.

Many publications adopted moderation policies in an effort to control the content of user comments. Moderation, a set of strategies aimed at combating such comments, took many different forms, including pre-moderation and post-moderation (Diakopoulos & Naaman, 2011). This is similar to newspapers developing submission policies for letter writers; however, these submission guidelines tend to focus more on procedural rules, like word count, rather than incivility and profanity (Wahl-Jorgensen, 2007). For example, The New York Times’ letter submission policy states the following:

Letters to the editor should only be sent to The Times, and not to other publications. We do not publish open letters or third-party letters. Letters for publication should be no longer than 150 words, must refer to an article that has appeared within the last seven days, and must include the writer's address and phone numbers. No attachments, please. We regret we cannot return or acknowledge unpublished letters. Writers of those letters selected for publication will be notified within a week. Letters may be shortened for space requirements (“How to Submit a Letter to the Editor,” n.d., para. 1-3).

While letter submission policies typically addressed these types of rules, letter editors still employed “principles for determining between publishable and unpublishable letters” beyond these rules (Wahl-Jorgensen, 2007, p. 68). Despite the difference in medium, these principles were similar to the principles that moderators at news websites used in the 21st Century.

In November 2012, The Robesonian, a North Carolina newspaper, announced that it would be making changes to its website’s commenting policy (Douglas, 2012).

According to editor Donnie Douglas (2012), the website would no longer allow individuals to post racial comments on stories that did not have a racial element.

Douglas wrote that those at the paper knew it was time to address the issue of racial comments when they began to receive an influx of complaints from readers.

While The Robesonian still allowed comments about race on articles that had a directly racial component, those comments were required to be related to the story and in good taste.

We understand our obligation to report the news, but we are not obligated to provide a forum for bigotry and hatred. We are a member of this community, and allowing this kind of divisiveness does not benefit anyone. It certainly doesn’t elevate the conversation (Douglas, 2012, para. 9).

The Robesonian was not the only newspaper to take action regarding comments on its website. The Gazette (Iowa City, Iowa) turned off commenting on all of its site’s articles, editorials and columns beginning in September 2012. Instead, users were invited to submit pre-screened comments to the “Daily Conversations” section of the website. Opinion Page Editor Jeff Tecklenburg (2012) wrote on the Gazette’s website:

Yes, thegazette.com has changed the rules for online commenting again, as announced Sept. 2. For good reason. Over the past few years, Gazette managers have revised the commenting process several times in hopes of encouraging a more civil, informed and broad-based online discussion. Bottom line: Nothing we’ve tried has worked very well. But we’ll keep working on it (para. 1 – 3).


National newspapers such as The Washington Post and The New York Times also revised their comment policies throughout the 2000s to provide a more civil space for public discussion (Perez-Pena, 2010).

In addition to moderating comments, publications could also disallow or limit commenting on certain topics, as The Robesonian and the Gazette did, or use tactics such as requiring would-be commenters to register on the news site or installing social media plugins that required commenters to first sign in to an existing social media account, such as Facebook. As the 21st Century progressed, news sites experimented with different approaches that would simultaneously decrease the occurrence of inappropriate comments and promote community engagement.

A case study examining the moderation practices of four widely circulated publications was completed to gain a better understanding of how online news sites moderate user-generated comments. This case study included interviews with people who moderated comments and managed moderators.

While there has been an increase in research on user-generated content and online comments, at the time of this study, literature was lacking that detailed the process that professionals at news sites in the United States undertook in an attempt to handle comments. In fact, as of February 2013, neither the Society of Professional Journalists nor the American Society of Newspaper Editors had established guidelines or best practices for addressing online moderation of comments on news sites (“Ethics,” n.d.; “Resources Page,” n.d.). However, the American Society of Newspaper Editors did survey newsrooms in 2009 and found that 87.6% of the 267 surveyed newsrooms invited readers to post online comments on their news sites (Keyes, 2009).


While past research has explored the substance of user-generated content, one purpose of this study is to present journalism educators with methods that their students may use in their future jobs as editors, moderators and community managers.

This case study sought to answer the following questions:

R1: How do news websites develop comment moderation policies?

R2: What are the components of online comments policies developed by news organizations?

R3: What strategies do news organizations use to moderate user-generated comments?

R4: Who is involved in moderating comments?

R5: What issues do journalists consider while moderating comments?


Figure 1-1. Online reader comment


Figure 1-2. Online reader comments


CHAPTER 2
LITERATURE REVIEW

The Internet’s impact on journalism at the start of the 21st Century cannot be overstated. As a testament to this impact, most news publications by 2009 had an online presence (Gunter, Campbell & Touri, 2009). Along with new ways of storytelling and distributing news1, the Web gave the public more opportunities to interact with journalists and news organizations. “With the growth in sociality and interaction around online news media, news sites are increasingly becoming places for communities to discuss and address common issues spurred by news articles” (Diakopoulos & Naaman, 2011, p. 1). While the Web ushered in new opportunities for interactivity, the press has always had a connection to community discourse.

Journalists have a complex relationship with the public. “In many ways, the public can be viewed as journalism’s ‘Other.’ Journalists exist in an anxious relationship of dependency with the public, who are their audiences, their sources, their raison d’etre, their allies, and their adversaries” (Wahl-Jorgensen, 2007, p. 4).

Wahl-Jorgensen studied this relationship and examined letters to the editor in her book Journalists and the Public (2007). In many ways, letters to the editor can be seen as a precursor to online comments, as both letters to the editor and online comments allow members of the public to discuss articles with and voice opinions to journalists and other citizens. Wahl-Jorgensen stated that the letters section is “one of a few arenas for public discussion to have survived throughout a large period of the history of American news media” (p. 3) and is among the most widely read portions of newspapers (2007).

1 New storytelling approaches and ways to distribute news emerged in the 21st Century, particularly due to the rise in social media. For example, journalists used social media sites such as Twitter to disseminate stories they had written (Holcomb, Gross & Mitchell, 2011).

Although letters to the editor give readers the opportunity to comment on news, it is important to recognize that these letters typically do not represent the total readership of a publication or population of its community. Letters to the editor are moderated, as the letters that appear in the printed news publication are selected and published by media professionals. The selection process is central to the discussion of readers’ letters (Raeymaeckers, 2005). “The newsworkers who edit and moderate the letters section shape public debate, and their decisions and work practices can therefore tell us much about contemporary conditions for citizen participation in politics” (Wahl-Jorgensen, 2007, p. 55).

Wahl-Jorgensen (2007) interviewed 23 San Francisco Bay editors and found that these professionals viewed the letters section as something that should be an open forum for public debate that was best served with “minimal editorial intervention” (p. 154). However, these editors also said that they sometimes needed to reject certain letters that were “disrespectful,” “racist,” “sexist,” “intolerant” or had other aspects that violated principles of civility. Uncivil letters that contained libelous personal attacks were most likely to be rejected by these San Francisco Bay editors (Wahl-Jorgensen, 2007, p. 154).

Despite editors having to moderate some uncivil letters, Wahl-Jorgensen found that while letters to the editor are subjective, it was not because of these moderation practices. Instead, letters to the editor may be biased because they are not representative of the population (Wahl-Jorgensen, 2007). Hart (2001) found that letter-writers in the United States were “overwhelming White” and were more likely to be married and “fairly well educated” (p. 416). Reader, Stempel and Daniel (2004) also found that letter-writers were “better educated,” “older” and “wealthier” than the average American (p. 55).

Wahl-Jorgensen’s research on decisions regarding letters to the editor provides a useful framework for evaluating how decisions are made regarding moderation of comments made to online news stories.

While journalism has a long-standing relationship with public participation, the Internet has allowed people to interact with media and each other in new ways. Rafaeli and Sudweeks (1997) described interactivity as a “process-related construct about communication,” rather than a characteristic of the medium (para. 10).

Interactivity places shared interpretive contexts in the primary role. Interactivity describes and prescribes the manner in which conversational interaction as an iterative process leads to jointly produced meaning. Interactivity merges speaking with listening. And it is a general enough concept to encompass both intimate, person-to-person, face-to-face communication and other forums and forms (Rafaeli & Sudweeks, 1997, para. 10).

Many forms of interactivity could be found throughout the Web in the early part of the 21st Century. One form was the bulletin board, or message board, which was an online gathering place that allowed people to have a conversation even if they were not in the same place at the same time (Bishop, 2009, p. 6). Interactivity also occurred through weblogs2, social media3 and Wikis4 (Chen, Xu & Whinston, 2011).

2 A weblog, or blog, is “a frequently updated Web site, with posts arranged in reverse chronological order, so new entries are always on top” (Blood, 2003).

3 Social media, or social networking sites, are “web-based services that allow individuals to (1) construct a public or semi-public profile within a bounded system, (2) articulate a list of other users with whom they share a connection, and (3) view and traverse their list of connections and those made by others within the system. The nature and nomenclature of these connections may vary from site to site” (Boyd & Ellison, 2007).

Online Comments

One of the most popular interactivity tools that publications were using in the 21st Century was the comment section (Domingo et al., 2008). One of the first American newspapers to allow user-generated comments on the same page as articles was The Rocky Mountain News (Denver, Colorado) in 1998 (Santana, 2011). Following The Rocky Mountain News’ decision to allow online comments, the number of news sites with comment sections grew tremendously. In 2010, 92% of the top 150 newspapers in the U.S. allowed users to post online comments (Santana, 2011). Faridani, Bitton, Ryokai and Goldberg (2010) refer to this phenomenon as “participatory culture,” in which users of websites increasingly provided feedback, usually in the form of comments (p. 1175).

Through comments, the public voiced their opinions on an article, created discussions with other users and even interacted with journalists. Many sites allowed the public to leave comments using an online form at the bottom of an article, which may or may not have required users to register with the site (Hermida & Thurman, 2008).

Comment sections were also extremely popular among users. In 2011 The Huffington Post, a popular political blog, received approximately three million comments per month (Diakopoulos & Naaman, 2011). Overall, 25% of Internet users reported that they had commented on either a news story or blog post (Purcell, Rainie, Mitchell, Rosenstiel & Olmstead, 2010).

4 “A Wiki, which is so named through taking the first letters from the axiom, ‘what I know is;’ is a collaborative page-editing tool in which users may add or edit content directly through their web browser” (Bishop, 2009, p. 7 citing Feller, 2005).


Although these comment sections promoted public discourse, commenting had flaws. Faridani et al. (2010) explained some of the common problems:

First, thoughtful moderates are often shouted down by extremists. Online discussions, conducted through threaded lists of comments, often end in “flame wars” predicated on binary characterizations. Second, the amount of data can be overwhelming. News stories and blog posts often generate hundreds or thousands of comments. As the number of comments grows, presenting them in a chronological list is simply not a scalable interface for browsing and skimming. Third, many websites tend to attract people with like-minded viewpoints, which can reinforce biases and produce “cyberpolarization” (p. 1175).

Discussing the moderation of comments requires determining the quality of comments. Quality “refers to a degree of excellence in communicating knowledge or intelligence and normatively includes notions of accuracy, reliability, validity, currency, relevancy, comprehensiveness, and clarity” (Diakopoulos & Naaman, 2011, p. 1). Low-quality comments can be in the form of attacks on the publication and other users, hate speech, and libelous statements.

Singer (2011) found that while journalists have an unenthusiastic view of user comments because of their undesirable content, many journalists find comments useful, as online comments can alter a reporter’s outlook on what is newsworthy (Santana, 2011, p. 76). However, because of the vast number of comments that a particular article or topic may receive, journalists may have a difficult time separating quality comments from low-quality ones (Braun & Gillespie, 2010).

Singer and Ashman (2009) stated that while interactivity can be viewed as positive, the increase of user-generated content has changed the process of news, giving journalists less control. The researchers also pointed out that user-generated content brings in a host of practical and ethical questions that practicing journalists must address. Such questions include the following:


What might an optimal relationship between journalists and users/contributors— the people Bruns (2007) calls “produsers”—look like, and what are the challenges to achieving it? If the content space is shared, is responsibility for the content itself also shared? Who decides what is credible, true, or even newsworthy in the first place? What happens to the prized journalistic norm of autonomy in this environment? (p. 4).

Anonymity in Computer-Mediated Communication

Another question that journalists had to answer in this digital age was how to deal with low-quality user comments, with “low-quality” defined by Diakopoulos and Naaman (2011). One reason that people could feel inclined to post low-quality or uncivil comments is the belief that other users will not be able to identify them because their comments were made anonymously.

For the purposes of this study, anonymity will be defined as “the inability of others to identify an individual or for others to identify one’s self. This may be within a large social context, such as a crowd, or in smaller context, such as two-person communications over the Internet” (Christopherson, 2007, pp. 3039-3040).

Previous research on computer-mediated communication (CMC), communication over the Internet, has shown that it differs from face-to-face (FtF) communication because of the lack of verbal and nonverbal social cues (Coffey & Woolworth, 2004; Walther, 1992).

One obvious quality of CMC is the ability to completely hide one’s physical appearance from others. Unlike FtF interactions where one’s physical appearance is obvious, it is assumed that one could potentially be completely free of these physical cues in CMC. It is a well-established finding that physical appearance is an important cue in social interactions (Christopherson, 2007, p. 3045).

Physical cues include body language that signifies discomfort or verbal responses (Coffey & Woolworth, 2004). Coffey and Woolworth (2004) found that the lack of social cues coupled with virtual distance and anonymity may result in computer-mediated communication that is more angry, extreme and even racist than face-to-face communication. Although computer-mediated communication has some drawbacks, it allows people to communicate with others who are separated by space and time, helps them create and maintain social and professional contacts and encourages a greater sense of commonality between people than face-to-face communication (Hardaker, 2010, p. 223).

Even with these positives, anonymity on the Internet makes ensuring quality comments a challenge (Chen et al., 2011, p. 238). Santana (2012) conducted a study that tested the effects of anonymity on civility in user comments made on online newspapers. Through a content analysis of both anonymous and non-anonymous comments made in response to a racially charged topic (immigration) and to a non-racial, yet controversial topic (the U.S. Tea Party movement), Santana found that a story with a racial aspect “is apt to draw more uncivilized anonymous comments than a non-racialized one and that removing anonymity elevates the level of dialogue” (Santana, 2012, p. v). Commenters posting about immigration were significantly more likely to write an uncivil comment than commenters posting about the Tea Party (Santana, 2012, p. 72).

Of the 450 analyzed anonymous comments regarding immigration, 53.3% were uncivil. Additionally, “the most common vehicles for the expression of incivility in this area were comments that invoked disparaging sentiments on the basis of race/ethnicity (16.1%), xenophobia (15.4%), name-calling (14.8%), racist or bigoted sentiments (14.5%), and the use of stereotypes (13.9%)” (Santana, 2012, p. 60). Furthermore, the majority of these uncivil commenters made emotional appeals rather than posting facts and figures. Of the anonymous comments about the Tea Party movement, Santana found that 35.6% of the 450 comments were uncivil.

In terms of non-anonymous comments, findings showed that of the 450 analyzed comments on immigration, 44% of the comments were civil while 28.7% were uncivil and 27.3% were unclear. Santana (2012) found that anonymous commenters were significantly more likely to post uncivil comments than non-anonymous commenters.

Civility in online discussion is significant because it may affect the way information is interpreted. Hwang (2008) conducted an experiment using a simulated online discussion to examine incivility in online chatting. Participants in this experiment believed that they were in an online discussion session. Results showed that discursive incivility affected communication, as those who were uncivilly attacked felt more “moral indignation,” less open-mindedness and had an overall unfavorable attitude of others who did not share their views (Hwang, 2008, p. 90).

Uncivil comments may also have an effect on the way a journalist gathers information. While interviewing journalists about civility in online comments, Diakopoulos and Naaman (2011) found that some reporters were having a difficult time obtaining sources for their stories, as some individuals feared that online commenters might criticize them if they were identified as a source in a story.

While commenters may be uncivil or post low-quality comments because they truly believe in the validity of their opinion, others may post simply to start an argument or make others uncomfortable, an Internet phenomenon known as trolling. “Trolling entails luring others into pointless and time-consuming discussions” (Herring, Job-Sluder, Scheckler & Barab, 2002). Hardaker (2010) found that trolling is primarily made up of aggression, deception, disruption and success.

It seems clear that part of the human condition is to find a degree of entertainment in conflict, whether in the form of high-risk sports, action films, violent computer games, or linguistic aggression in television programs (Culpeper 1996, 2005; Culpeper et al. 2003; Bousfield 2008). However, unlike these situations where the individual typically only watches or simulates conflict, online, with the protection of anonymity and distance, CMC users can exercise aggression against other real humans, with little risk of being identified or held accountable for their actions (Hardaker, 2010, p. 238).

Another issue that journalists had to confront on news websites was spam. Spamming refers “to any deliberate human action that is meant to trigger an unjustifiably favorable relevance or importance for some web page, considering the page’s true value” (Gyongyi & Garcia-Molina, 2005, p. 1). Spam could be posted with the intention to “unethically advertise products, distribute different malware, spread viruses and steal personal information thus posing a strong threat to the community” (Rajendran & Pandey, 2012, para. 2). With the increase of blogs and other websites that encouraged user commenting, a new type of spam known as comment-spam had emerged (Bhattarai, Rus & Dasgupta, 2009).

It is even easier to exploit blogs’ comment section as they are open by nature to facilitate commentators to write their opinions about the piece of writing. Along with comments, people would also like to link the article with other posts in the blogosphere to express their opinions in relation to these posts. This results in links between pages of a blog or different blogs. While this kind of link can be a legitimate hyperlink to relate two different posts, spammers exploit this concept to increase their link weights by posting random comments and links in blogs (Bhattarai, Rus & Dasgupta, 2009, p. 37).

In attempts to address negatives such as incivility, trolling and spam, many journalists began to call for the end of anonymous online commenting toward the end of the first decade of the 21st Century (Santana, 2011). The policy of requiring those posting comments to in some way register or identify themselves follows print journalistic practices, as anonymous letters to the editor and the use of anonymous sources are generally avoided (Santana, 2011). One increasingly popular process used to limit anonymity was a plugin created by the social network Facebook; other sites could install the plugin, which required users to log into their Facebook accounts before posting a comment. Other news websites would not allow a user to comment unless the user registered for an account with the online news site using his or her real name (Santana, 2012).
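The registration requirement described above can be illustrated with a short sketch. This is a hypothetical example, not code from any of the news sites discussed; the account names and the function are invented for illustration.

```python
# Hypothetical sketch of registration-gated commenting: the site accepts a
# comment only from a signed-in, registered user, which is one way the
# anonymity discussed above was limited. All names here are invented.

registered_users = {"jane_doe", "jsmith"}  # stand-in for a site's account store

def can_post_comment(username):
    """Allow posting only for registered, signed-in users."""
    return username is not None and username in registered_users

print(can_post_comment("jane_doe"))  # registered user may comment
print(can_post_comment(None))        # anonymous visitor may not
```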

Comment Moderation

Whether allowing anonymous comments or not, many news organizations by 2013 had started to review user content in an effort to prevent and discourage low-quality comments. In a study examining how various United Kingdom news sites handled user-generated content, Hermida and Thurman (2008) found that some editors were concerned about the potential that user-generated content had to damage the brand of their organizations; therefore, these editors moderated comments.

Comment moderation can take many forms. Diakopoulos and Naaman (2011) identified two strategies that news organizations employed when addressing comment quality. Publications can moderate by asking users to report (also referred to as flag) comments that violate website guidelines for posting or that are abusive. If a particular comment receives many flags, then that comment can be deleted by the publication. A moderator can then leave a note for other users that the comment has been removed due to the flags it received (Santana, 2012). This type of crowd-sourced moderation5 empowers users by giving them a sense of ownership in the online community and makes moderation a collective activity (Briggs, 2010, p. 49).

News websites can also moderate comments, either before or after they are posted, in-house or by outsourcing (Diakopoulos & Naaman, 2011). While outsourcing can be beneficial, especially for controversial articles that may generate a high volume of user-generated comments, Diakopoulos and Naaman (2011) found that some editors might not trust an outside company to implement journalistic standards.

While the moderation of comments varies by newspaper, comments often post immediately, though auto filters generally disallow vulgar language. In this way, unmoderated comment boards might tend to have more acerbic language than those cleaned up and closed by moderators. Still, even after the boards are moderated and closed, entire discussion sections can and do disappear; at many newspapers, online stories — and the accompanying comment forums — are free for the public’s perusal for only 30 days (Santana, 2012, p. 3).

News organizations may be encouraged to moderate comments because moderation has been found to have an effect on the quality of user-generated comments. Chen, Xu and Whinston (2011) found that moderation systems directly affected an individual’s incentive to post useful information and could improve the quality of comments (p. 237). Furthermore, those who post comments that are being moderated may contribute quality comments in order to build up a positive reputation. The researchers found that moderation can have an impact on posting, but that “the frequency of moderation is critical and should be properly chosen for optimal performance of the online community” (Chen et al., 2011, p. 261).

5 “At its simplest, community self-moderation (crowd-sourced moderation) is seen on sites that dynamically shift comments based on user feedback” (Briggs, 2010, p. 50).

Legal Aspect of User-Generated Content for News Organizations

While this research does not have a legal focus, it is important to mention the law as it pertains to online comment moderation. The United States Congress passed the Communications Decency Act (CDA) in 1996 in an attempt to regulate obscenity on the Internet (Ehrlich, 2002). Though most of the CDA was ruled unconstitutional because it violated the First Amendment, “one of the surviving elements is a congressional grant of immunity from suit to ISPs [Internet Service Providers] and other interactive computer services for content originating with third parties” (Ehrlich, 2002, p. 401). Although Section 230 protects websites from liability if a user posts a comment that is defamatory or illegal, a moderator may be considered a content provider if a comment is edited in any way, in which case the moderator may not be protected from suit (Rich, 2012).

Many online publications post the legal issues that are related to user-generated content on their site, usually accessible through a link at the bottom of each page (Ruiz, Domingo, Micó, Díaz-Noci, Meso & Masip, 2011).

The text is a contract with the corporate entity publishing the news portal that users implicitly accept when they access the website. The fact that the legal rules are public, as the participation guidelines are, means that users cannot argue they were not aware of them (Ruiz et al., 2011, p. 474).

Wahl-Jorgensen (2007) found that letters editors also considered certain legal issues when reviewing reader letters. “The editors draw on legal language, such as references to ‘censoring’ and the First Amendment, when they discuss editing and selection of letters” (Wahl-Jorgensen, 2007, p. 70). In addition, Wahl-Jorgensen (2007) found that letter editors rejected letters that may have resulted in libel suits (p. 87).

Gatekeeping Theory

When media professionals at online publications moderate user comments, they are acting as gatekeepers. Gatekeeping theory, as it relates to mass communication, refers to the process that information goes through before it is presented to the public. “It is often defined as a series of decision points at which news items are either continued or halted as they pass along news channel from source to reporter to a series of editors” (Shoemaker, Eichholz, Kim & Wrigley, 2001, p. 233).

The analogy of the gatekeeper in mass communication was originally introduced by social scientist Kurt Lewin, who suggested that news items first travel through certain channels or “gates” (White, 1950; Shoemaker et al., 2001). The theory of gatekeeping and the role of journalists as gatekeepers were then popularized by David Manning White, who conducted a case study of the news selection of a newspaper editor in 1950 (Shoemaker et al., 2001).

This study will use gatekeeping theory to explain how moderators at news websites play an important role in the public discourse in comment sections. Specifically, those involved in implementing comment moderation policies select which voices are heard and determine which voices are silenced. User-generated content may go through a series of filters before it is posted online for the public, while content that is deemed uncivil or low quality may be deleted before or after it is posted.

CHAPTER 3 METHODOLOGY

In order to determine the answers to the research questions, the researcher decided to conduct interviews with individuals at news organizations who were involved in the development and implementation of comment-moderation policies.

An interview approach was selected for collecting information about comment moderation, as opposed to a survey, because the interview process would allow for more explanation of how and why certain procedures are used by news sites.

The interview questions (Appendix C) were informed by prior research on interactivity, user-generated content and moderation and developed after an examination of the community, or user, guidelines for all newspapers. This questionnaire consisted of 11 primary questions and additional follow-up questions based on responses. Questions were tailored for each interviewee based on their job description and role in moderation.

Each participant was originally contacted through an email letter (Appendix D), which included information about the researcher, the study and details about how the study would be conducted. Some participants responded indicating that they were willing to be interviewed, while others responded by referring someone else who would be better suited to participate. The referral system was also used after some of the initial interviews were conducted, as some interviewees passed along information about this study to others who were eventually interviewed. This resulted in the study being expanded to include newspapers in Georgia and Ohio that were part of Cox Media Group.

This study initially focused on three newspapers in Florida that were part of regional or national news organizations. The goal was to select newspapers that had posted online user policies and that would be identified with a certain geographical community. One of the purposes of this research was to learn about moderation policies that could be an aid to news organizations that are developing moderation policies. Therefore, it was determined that the size, community and resources of these newspapers would be more relatable than a larger news site that received more daily traffic and user comments, like USA Today or The New York Times.

This study originally focused on newspapers in Florida, as the research was taking place in Florida and the researcher had contact information for journalists at several Florida newspapers. An application to the Institutional Review Board was completed to contact the aforementioned journalists; however, the board determined that approval was not needed to conduct this research.

The interview process initially started with The Palm Beach Post, The Orlando Sentinel, and one other major newspaper in Florida. The interview with the online community manager at The Palm Beach Post led to an interview with Cox Media Group, which led to interviews with The Atlanta Journal-Constitution and Dayton Daily News. Multiple attempts, through email and phone calls, to schedule interviews with potential sources at the third major newspaper were unsuccessful, so that newspaper was eliminated from the study.

All interviews were conducted by telephone between January 2, 2013, and February 5, 2013. Each interview was audio recorded and later transcribed. Additional follow-up questions were asked through telephone and email communication. In total, six individuals were interviewed.

Limitations. One factor that limited this study was the lack of institutional memory. Because some of those interviewed were new to their positions or had only been with the news organization for a short period of time, they were unable to answer certain questions about comment moderation. Notably, there was an interviewee who had come into the position two weeks before the interview was conducted and another interviewee who joined the company five months prior to this study’s interview. Some important questions that were affected by the lack of institutional memory were “When did the newspaper launch its website?” and “When did the newspaper start to allow commenting?”

Another limitation was anonymity, as some participants did not want to have their names published because they could potentially receive negative responses from their readership. Four interviewees did not wish to have their names mentioned in this research, so every participant is listed by job title for the sake of consistency. One participant in this study did not want to inform readers about specific moderation procedures; therefore, responses about certain processes were not detailed.

Additionally, the interviews conducted for this study were not uniform: not all of the pre-planned questions could be asked, and some interviews were less in depth because of the time limitations of the individuals interviewed. Factors included scheduling conflicts and illnesses.

This study originally sought to understand how comment policies were developed. However, requests for interviews with the individuals who developed the policies received no response. Therefore, the researcher relied on information from those who were interviewed. Because of this, some important questions, such as “When was the policy developed?”, could not be answered in this study.

CHAPTER 4 FINDINGS

In order to determine the policies used to moderate readers’ online comments and the process used in moderating comments, six individuals who had responsibility for online moderating were interviewed. Those individuals were employed at The Palm Beach Post, The Atlanta Journal-Constitution, The Dayton Daily News and The Orlando Sentinel at the time of the interviews, which took place in January and February 2013.

Cox Media Group

Cox Media Group, Inc., a media company, owned television broadcast stations, a cable news channel, radio stations, digital services and newspapers in 2013. In 2011 the company owned eight daily newspapers and 16 non-daily papers. Cox-owned newspapers included The Palm Beach Post, The Atlanta Journal-Constitution and The Dayton Daily News (“About,” n.d.).

Former Ohio governor and United States presidential nominee James M. Cox founded Cox Enterprises, Inc. in 1898 in Dayton, Ohio (“History,” n.d.). After the acquisition of several newspapers, radio stations and cable systems, Cox Enterprises became one of the largest broadband communications companies in the United States during the 21st century (“History,” n.d.). Cox developed the visitor agreement, which it called its commenting policy, and the agreement was used on all of its properties’ websites in 2013.

The Social Media Manager/Strategist, who had been in this position for over three years, provided information about the Cox comment policy. The Social Media Manager/Strategist helped build the content management system that Cox websites used for commenting in 2013 and advised Cox properties about ways to utilize social media and engage the online community. The Social Media Manager/Strategist previously worked as an Internet producer at The Palm Beach Post and had worked in news for approximately nine years.

Comment Policy

According to the Social Media Manager/Strategist, Cox developed the visitor agreement that was used by all the Cox properties and posted on Cox property websites in 2013. Cox’s agreement stated that each Cox affiliate had adopted the terms of the user agreement (Appendix A). Cox developed the agreement, but moderation of the sites was left to the discretion of the individual newspapers. Individual Cox newspapers also had their own letters to the editor policies (Appendix C, Appendix D, Appendix E and Appendix F). All Cox properties ran on the same commenting system in 2013. The Social Media Manager/Strategist said that Cox was centralized in the sense that all of the news sites used the same commenting system and the corporation encouraged its news properties to allow commenting and moderating, but community management, which involves overseeing user participation (Braun & Gillespie, 2010), was performed by individuals at the newspapers themselves.

The Palm Beach Post

In 2013, The Palm Beach Post was a Cox-owned newspaper based in West Palm Beach, Florida. The paper was founded in 1908 as a weekly originally named The Palm Beach County. In 1916 The Palm Beach County became a daily known as The Palm Beach Post.

Cox purchased a number of Palm Beach and West Palm Beach, Florida, newspapers in 1969. A decade later, the newspaper The Palm Beach Times was renamed The Evening Times, and in 1987 it merged with The Post, forming The Palm Beach Post (“About Us,” n.d.).

The information about The Palm Beach Post’s moderation of online comments was obtained through a telephone interview with the Digital Manager, who had held this position for three years at the time of the interview. The Digital Manager’s primary job responsibility in 2013 was overseeing The Palm Beach Post’s online content. The Digital Manager’s previously held positions at The Palm Beach Post included Online Editorial Director and Assistant Online Director. The Digital Manager had over 15 years of news experience.

As of 2012, The Palm Beach Post reached an average of 142,679 readers with its Sunday edition, 88,231 readers Monday through Friday and a readership of 89,335 with its Saturday edition (Alliance for Audited Media, 2012).

According to the Digital Manager, The Palm Beach Post started to allow commenting shortly after the website was launched in the early 2000s. The Digital Manager was unable to recall the exact date.

User Registration

In order for someone to comment on The Palm Beach Post’s website during the time of this study, he or she must have registered an account on the website, using a valid email address, and agreed to the visitor agreement. One also had to register an account in order to “report a comment,” which meant that a reader found a comment posted by another reader to be inappropriate. Those reported comments were the ones that the Digital Manager would review to determine if the comment should be removed. By requiring an individual to register in order to report a comment, the same individual could not report the same comment multiple times.

However, the Social Media Manager/Strategist acknowledged that a drawback of the system was that someone could create multiple accounts with different email addresses.

Site registration rules had changed six months before this research was conducted. Prior to June 2012, users did not have to register an account in order to comment on a news story or report a comment that violated the visitor agreement. The Social Media Manager/Strategist said that prior to June 2012 there was no way to trace which user reported which comment. Because there was no way to verify an email address, The Palm Beach Post decided to disable commenting on crime stories.

Digital Manager: You wouldn’t have to be registered; you could just pop in there anonymously and leave comments. So on a lot of crime stories we would not offer commenting at all by default because those would tend to turn into ugly discussions. Once we put the registration requirement in place, now we have the ability that if somebody is commenting consistently in a way we don’t like we can just ban them and they would have to reregister with a different email address, and it becomes harder for them to do that on a consistent basis.

The Digital Manager also said that commenting had become “more civil” since the implementation of the registration requirement.

The Moderators and Moderation Strategies

The Palm Beach Post did not have full-time moderators at the time of this study. The Digital Manager explained that moderating comments was a shared task that any staff member performed during their shift. Moderation was “shared by all,” and staff members who worked on the website would review flagged comments when there was free time during their shift.

According to the Digital Manager, The Palm Beach Post did not pre-screen comments that appeared on the news organization’s website. In other words, every user comment automatically appeared on the site when posted by the reader. The paper used a type of crowd-sourced moderation in which users reported, or flagged, comments that they believed violated the visitor agreement. Every comment on the website had a “report abuse” button next to it that allowed people to report a comment to the paper. For example, a user could flag a comment that he or she thought was offensive. Once a comment was deleted, a message appeared in place of the original comment indicating that the comment had been removed, but the poster’s user name would still be present on the site. Community moderation helped those at the newspaper better handle the vast amount of comments the site received. The staff of The Palm Beach Post who reviewed the online comments referred to the visitor agreement to determine whether a flagged comment should be removed.

Those at the paper found that some people would report a comment simply because they did not agree with it, not because it was objectionable. This is the reason that once a comment was reported, it stayed visible on the site. It was also sent to a queue on the administrative end of the site, where moderators could review it and remove it from the site if it was judged to have violated the visitor agreement. If the comment was deemed appropriate, then it stayed on the site. While a moderator reviewed queued comments periodically throughout the day, there could be times when comments could not be reviewed immediately. In 2013, The Palm Beach Post had a system in place in which comments that received five individual flags were automatically removed from the site. Once a comment was removed from the website, moderators were still able to review it to determine if it violated the visitor agreement. If the moderators determined that the comment did not violate the visitor agreement, the comment could be republished.
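The flag-and-queue workflow described above can be sketched in code. This is an illustrative model only, not The Palm Beach Post’s actual system; the class names are invented, and only the five-flag threshold and the post-first, review-later behavior are taken from the interview description.

```python
AUTO_HIDE_THRESHOLD = 5  # flags needed before a comment is hidden automatically

class Comment:
    def __init__(self, user, text):
        self.user = user
        self.text = text
        self.visible = True      # comments post immediately, without pre-screening
        self.flagged_by = set()  # registration makes each flag traceable to a user

class ModerationQueue:
    def __init__(self):
        self.pending = []        # flagged comments awaiting human review

    def flag(self, comment, reporting_user):
        # A registered user can flag a given comment only once.
        if reporting_user in comment.flagged_by:
            return
        comment.flagged_by.add(reporting_user)
        if comment not in self.pending:
            self.pending.append(comment)
        # Enough independent flags hide the comment without waiting for staff.
        if len(comment.flagged_by) >= AUTO_HIDE_THRESHOLD:
            comment.visible = False

    def review(self, comment, violates_agreement):
        # A moderator's judgment overrides the crowd either way: remove a
        # violating comment, or republish one the five-flag rule hid.
        comment.visible = not violates_agreement
        if comment in self.pending:
            self.pending.remove(comment)
```

Tracking flags per registered account is what prevents one reader from reporting the same comment five times, and the `review` step models how a comment auto-hidden by flags could be republished after a moderator checked it against the visitor agreement.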

The visitor agreement used terms such as “obscene” and “harmful” when describing unwanted comments (Appendix A). Although moderators did not receive any official moderation training, the Digital Manager said that determining whether a comment violated the visitor agreement was often simple. These types of comments included those that contained profanity or personal attacks or were overtly racist. In incidents where a moderator was unable to make a determination about whether or not a comment violated the visitor agreement, the moderator would go to the Digital Manager or another editor for advice. For example, The Palm Beach Post received a large number of comments on various stories that expressed disapproval over immigration laws, and a moderator may have had difficulty in determining whether the language in these comments violated the commenting policy. The Digital Manager said having a debate about immigration on the site was appropriate, but racist language was not allowed.

Digital Manager: If it’s more of a borderline comment, they’ll tend to come to me or to an editor and sort of bounce it against us and ask for our opinion. The guidelines have kind of been shaped in that way over time; they’ve kind of gotten the feel for what can pass and what cannot. In general they will come to an editor for a judgment call if it’s something they’re not sure about.

Moderation Advice and the Future of Online Commenting

The Digital Manager said that any news organization that uses crowd-sourced moderation should never go half of the day without reviewing flagged comments, as the conversation could turn from the news article to the lack of moderation. The Digital Manager said that newsroom budgets might not allow funding to allocate someone to constantly moderate comments, but the Digital Manager emphasized that user comments are something that must be addressed regularly throughout the day.

Both the Digital Manager and the Social Media Manager/Strategist had a favorable view of readers commenting and the community environment that reader comments brought to the site.

Digital Manager: You can even get sources out of the comments or news tips out of the comments. We’ve got a lot of reporters who write a story and read something in the comments that leads them to another ending to that story or follow-up, or they’ll discover a source that they’ll wind up keeping for a long time through the comments.

The Digital Manager also said, “We’ve always had this vast readership out there that doesn’t know about each other and now they do.”

The Social Media Manager/Strategist and Digital Manager both supported the use of online anonymity. Both interviewees said that a user may have legitimate or reasonable views, but may not want to express those views in a public forum using a real name. Examples given were business owners not wanting customers to know their political views or an employee not feeling comfortable with his or her employer reading certain opinions. The Social Media Manager/Strategist also said that certain members of society, particularly women, young people and those with safety issues, may feel vulnerable if others on the Web knew their real identities.

The advantages associated with anonymity were among the reasons cited for why The Palm Beach Post had not adopted the Facebook plugin for its site as of 2013. Another reason was the functionality of the plugin, which interviewees said did not complement the newspaper’s registration or moderation systems.

Digital Manager: We want people to be able to sign into our site and sign up for newsletters, comment on stories and all these different things native to our system, and then if you thrust in the plugin from Facebook it kind of pulls them out of that environment.

The Social Media Manager/Strategist said, “You just kind of give [moderation] up to Facebook because you can report comments for abuse, but you don’t actually have control to publish or unpublish comments through Facebook commenting.”

The Atlanta Journal-Constitution

The Atlanta Journal-Constitution (Atlanta, Georgia) was a Cox-owned newspaper with a total average circulation of 402,606 for its Sunday issue in March 2012 (Lulofs, 2012). The paper was the result of a merger between The Atlanta Constitution, founded in 1868, and The Atlanta Journal, founded in 1883. In 1939 James M. Cox purchased The Atlanta Journal, and he later purchased The Atlanta Constitution in 1950. Cox combined the papers’ Sunday editions the same year. Finally, the newsrooms of The Atlanta Journal and The Atlanta Constitution merged in 1982, and the combined paper became The Atlanta Journal-Constitution in 2001 (“Our History,” n.d.).

The paper’s website, AJC.com, had been online since 1998. The site did not allow commenting on news articles until the summer of 2012.

The Editor of AJC.com was interviewed to determine how another Cox Media Group newspaper handled online comments. The Editor managed the staff that was responsible for The Atlanta Journal-Constitution’s website in 2013. This included monitoring the website to ensure that the “best possible content” was there and collaborating with marketing and advertising to create “new ways to increase engagement” on the site. The Editor had been in this position for two weeks at the time of this interview. The Editor was previously the Digital Manager for Strategic and Subscriber Initiatives at The Atlanta Journal-Constitution and had over a decade of journalism experience.

As of 2013, a commenting option was automatically attached to every story. However, the Editor said that the paper did not allow commenting on any type of death story out of respect for the deceased’s family.

User Registration

At the time of this study, an individual had to register an account with the site using an email address. Users also had to sign in with that account to comment or to report a comment.

The Moderators and Moderation Strategies

As of 2013, The Atlanta Journal-Constitution did not have full-time moderators. Instead, the website was divided into content panels, such as sports, entertainment and news. If a comment was reported in a certain content panel, then a moderator who worked within that panel would be responsible for reviewing it, according to the Editor.

The Atlanta Journal-Constitution primarily relied on a crowd-sourced moderation system to handle comments that users found inappropriate. Every comment on the website had a “report abuse” button next to it that allowed people to report a comment to the paper. A comment would be automatically removed from the site if it received five reports. The Atlanta Journal-Constitution used the same process as The Palm Beach Post. That is, when a comment was deleted, a message appeared in place of the original comment indicating that the comment had been removed. However, the commenter’s user name would still be present on the site.

The Editor said that The Atlanta Journal-Constitution did not monitor every comment on the website. In some cases the paper would receive an email from a user informing the paper that there was a comment on the site that might be unsuitable. In this instance a moderator would review that comment to determine whether it needed to be removed. Moderators removed comments that contained racial slurs or profanity or that threatened other users. The Editor of the site said that while it was hard to gauge whether a comment was inappropriate, those at the paper would leave comments on the site that were not profane, racist or threatening. The Editor explained that disagreements in comments were not always removed if the comments contributed to the discourse. “That’s part of the conversation, so we won’t necessarily police for that, but we will police it for the slurs and the bad language,” the Editor said.

Moderation Advice and the Future of Online Commenting

The Editor said that the “community feel” of the news site was strengthened by the user comments. She praised the site’s sports section, which had a strong community-like environment where users were comfortable with policing each other and moderating for inappropriate comments. The Editor recommended that newspapers moderating comments encourage community engagement.

Editor: Try to be as open as possible to provide that engagement for the users and to allow them to kind of create that community, because if you try to stifle that community you’re not going to see that engagement level that you might be looking for.

Dayton Daily News

The third Cox paper in this study, the Dayton Daily News, was founded when James M. Cox purchased, and eventually renamed, the Dayton Evening News in 1898 (“History & Direction,” n.d.). At the time of this research, the newspaper served 12 counties in Ohio and reached approximately 555,000 adult readers each week (“History & Direction,” n.d.).

The Digital Product Manager was interviewed to find out the moderation strategies used by the Dayton Daily News. The Digital Product Manager, who had been in the position for three years, was responsible for the various social media platforms for the Dayton Daily News and other Cox properties in Ohio. Previously, the Digital Product Manager worked as a photojournalist for Cox Media Group Ohio.

The Dayton Daily News did not allow commenting on its website for a period of time that was not known by the Digital Product Manager; however, commenting was enabled approximately six months prior to this research.

The Digital Product Manager said that commenting was not allowed on every story posted on the newspaper’s website. Specifically, the paper disabled commenting on crime stories.

Digital Product Manager: Usually the topics turn into things that are not productive. We’re giving our community a voice to express their freedom of speech, but most of the time on those types of stories there’s very negative comments and we don’t feel that it’s actually helping the conversation... Because we don’t have time to keep up with all of the different comments, and those conversations tend to be very negative-focused, we don’t have commenting available to them.

User Registration

In 2013 users had to register an account using a valid email address in order to comment or report a comment on the newspaper’s website, according to the Digital Product Manager.

The Moderators and Moderation Strategies

The Digital Product Manager said that the Dayton Daily News did not have full-time moderators in 2013. Instead, the newspaper had a digital department whose members had various responsibilities, such as reviewing comments.

The Dayton Daily News followed the visitor agreement that was developed by Cox. According to the Digital Product Manager, in the cases when commenting was allowed on stories, users were able to flag comments that possibly violated the visitor agreement. Once a comment was flagged, moderators would often review the entire conversation to understand the context in which the flagged comment was used and determine whether it should be removed from the site. However, moderators did not review entire conversations when removing spam from the website in 2013.

Digital Product Manager: When we get complaints we definitely dig into the story to see what’s going on. If it turns out to be a controversial issue and the commenting was on, then we go in and look at everything and evaluate what comments are ok and which ones need to be taken down.

The Digital Product Manager said that there was not a great deal of moderation on the newspaper’s website at the time of this study because the site did not receive a large amount of comments.

Moderation Advice and the Future of Online Commenting

The Digital Product Manager said that the Dayton Daily News received more comments on its Facebook page than on its website in 2013. The Dayton Daily News launched a Facebook page on April 24, 2009, and had 20,032 “likes”6 at the time of this study. The paper posted stories from its site onto the Facebook page and allowed Facebook users to leave comments.

The Dayton Daily News used the same comment policy for its website and its Facebook page. The policy was not posted on the Facebook page at the time of this research. The Digital Product Manager said that the user-posting rules would be extracted from the visitor agreement and posted on the Facebook page shortly after the interview.

6 In 2013, an individual could “like” a Facebook page to follow it for updates and receive messages (“Like,” n.d.).

The Dayton Daily News moderated comments on its Facebook page in 2013, but the Digital Product Manager said that the moderators tended to “lean on the side of not removing them.” Moderation was reactive: a moderator removed inappropriate comments only after they had been posted to the Dayton Daily News Facebook page. Moderators removed spam and potentially libelous comments from both the Facebook page and the newspaper’s website.

Moderators also removed comments that were considered uncivil, which included comments that incited arguments, contained hate speech or personally attacked individuals. The Digital Product Manager said that these types of comments “derailed” the conversation and that those at the Dayton Daily News expected Facebook commenters to “keep on the topic and be respectful to each other.” The Digital Product Manager also said that they “try to balance freedom of speech versus whether that opinion is valuable to the conversation.”

The Digital Product Manager said that the Dayton Daily News may have received more comments on its Facebook page than on its website because users were trained to post on Facebook when the newspaper site did not allow comments. Additionally, the Digital Product Manager said that the Facebook page may have been receiving a large number of comments because the Facebook users were very engaged and had created an online community. The Digital Product Manager also interacted with the community by explaining the commenting policy on the Facebook page and getting to know several users.


Although staff at the Dayton Daily News interacted with Facebook users, the Digital Product Manager said that newspapers must ensure that they are engaging communities “correctly” by not starting conversations that will lead to arguments or that do not promote the topic at hand. “It’s a fine line because we want to engage with people, but we also don’t want to sensationalize it or just get comments to get a lot of comments; we want to get valuable comments too,” the Digital Product Manager said.

The Digital Product Manager said that the next step in online commenting after 2013 would depend on the digital tools available. While the Dayton Daily News received more comments on its Facebook page than on its website in 2013, the Digital Product Manager said that the Facebook commenting plugin would be beneficial to the paper’s website, as users would have more accountability when posting through their Facebook accounts and their Facebook networks would be able to view their comments.

Tribune Company

The second news organization included in this research study was the Tribune Company. In 2013 the Tribune Company was a media business based in Chicago, Illinois. The Tribune Company was founded in 1847, the same year that the company’s newspaper, the Chicago Tribune, was first published (“Tribune Company History,” n.d.). As of 2013 the Tribune Company owned radio and television stations, websites and newspapers, including the Los Angeles Times, The Baltimore Sun and the Orlando Sentinel (“About Tribune Company,” n.d.).

Orlando Sentinel

In 2013 the Orlando Sentinel was a newspaper based in Orlando, Florida, that was the result of the 1973 merger of The Orlando Morning Sentinel and the Reporter Star (“The History Of The Orlando Sentinel,” 2004). Tribune Company had acquired the newspapers in 1965 (“Tribune Company Timeline,” n.d.). The paper changed its name from the Sentinel Star to the Orlando Sentinel in 1982.

The newspaper became available on the Internet in 1995 through the company America Online and launched its own website in 1996 (“The History Of The Orlando Sentinel,” 2004).

In 2012 the Orlando Sentinel had a circulation of 271,824 for its Sunday edition, 185,262 for its Saturday edition and 162,636 for its Monday through Friday issues (Alliance for Audited Media, 2012).

The Online Content Manager and the Primary Moderator were interviewed for this research. The Online Content Manager had been at The Orlando Sentinel for five months at the time that this research was conducted. The Online Content Manager supervised the team of staff members who managed the Orlando Sentinel’s homepage. The Online Content Manager was also responsible for reviewing comments when the Primary Moderator was not available.

The Primary Moderator was responsible for reviewing a flagged comment to determine whether the comment violated the terms of service, which is what Tribune Company called its commenting policy. The Primary Moderator, who had held this position for three years and had been employed by The Orlando Sentinel for over a decade, also worked on search optimization and online audience development. The Primary Moderator had over 20 years of journalism experience and had received training in chat hosting from another company.


Comment Policy

The Orlando Sentinel implemented the terms of service that were developed by the Tribune Company for all of its properties. The policy and commenting system were created with the help of the company’s lawyer. The Primary Moderator at The Orlando Sentinel described the policy as a document that “a lot of work went into” and that provided very specific guidelines regarding commenting.

Online Content Manager: The idea is that we did want to encourage freedom of speech; that we did want to encourage an active participation and engagement for our users. We want them to respond to our stories, to our online content. But yet we want to make sure that people who are commenting are doing so in a fairly constructive manner, that they aren’t attacking anyone or spreading any malicious or falsehoods that could be defamatory or could be considered to be essentially attacking other users.

User Registration

The Primary Moderator and Online Content Manager said that in order for an individual to comment or report another comment he or she was required to sign into the site with an account that was registered using a valid email address.

The Moderators and Moderation Strategies

According to the Orlando Sentinel’s Primary Moderator, at the time of this study the news site had a team of individuals responsible for online activities, which included selecting stories that appeared on the site, managing social media and moderating user commenting.

Within this team was the Primary Moderator, who determined whether a flagged comment violated the terms of service. Another member of the team would take on these responsibilities when the Primary Moderator was out of the office. The Primary Moderator would train anyone covering the shift; this training consisted of an overview of the procedure used to review comments. However, those covering the shift would not ban users. They could remove a comment but generally left that task to the Primary Moderator.

As of 2013, The Orlando Sentinel had filters in place on its website to reduce the number of inappropriate comments that were posted. These filters mainly blocked comments containing profanity and racial slurs, but the team refined the filter list by adding offensive words that they encountered while moderating.

Primary Moderator: Sometimes people will use words like “retard” and things that are inflammatory to people with disabilities. We don’t allow that either. It’s not appropriate and we want our message boards to be a safe and comfortable place for people to express their opinions no matter what side they’re on and be heard without being attacked or having somebody make fun of them for other reasons.

The team also added to the filter list profane words that users had spelled incorrectly to get through the site’s filters.

At the time of this study the Orlando Sentinel did not monitor every comment on every story posted on orlandosentinel.com, instead relying on users to report comments that violated the paper’s terms of service.

If a reported or flagged comment was reviewed by the moderator and did not violate the terms, the comment stayed active on the site. If the moderator determined that a flagged comment violated the terms of service, then the comment was unpublished from the site with no message alerting other users. In addition, the moderator would send an email to the user stating that he or she had violated the terms of service. In instances where it was the user’s first violation, the user would receive a warning explaining that he or she would be banned from commenting on the site if another inappropriate comment was posted. If the user had already received a warning, then the moderator’s email would explain that the user had been banned from the site for repeated violation of the terms of service. In 2013, the Orlando Sentinel had certain functions incorporated into its system that helped the team identify who had been banned from commenting, which decreased the likelihood of a banned user reregistering with a new account.

A user who posted spam on the website would automatically be banned without warning, as the team had found that those posts often were not from authentic users.

When a potentially inappropriate comment made it through the filter and was subsequently flagged, the moderator was responsible for determining whether that comment violated the terms of service. If a moderator could not make the decision to remove the comment then he or she would ask coworkers or a supervisor.

Online Content Manager: If there’s a comment that’s made, there have been many times we will have an internal discussion about whether that comment is appropriate. If someone flags something that doesn’t necessarily mean that we’re going to remove it from the site—that means that we’re going to review it to see if it violates those terms, and if our primary moderator or someone who is moderating that post is unsure, then they can send it up the chain of command and get more eyes on it and then at that point it would be a consensus of “yes this is a violation” or “no, it’s not a violation.”

The Primary Moderator explained the process of reviewing a comment.

Primary Moderator: I also will keep in mind how the user communicates on the board, because there may be one comment like that that’s a little ambiguous, but we can see their other comments and we can tell what types of things they’re saying on the board. Chances are if we think in that particular case it might be inappropriate; there’s probably five more examples of inappropriate comments.


The Primary Moderator and Online Content Manager said that many flagged comments were fairly easy to review because they blatantly violated the terms of service. For example, the moderators at the Orlando Sentinel had zero tolerance for comments that attacked other users on the basis of race, gender, ethnicity, or sexual preference.

The Primary Moderator said that the team tried to be “fair” when members reviewed flagged comments. Each flagged comment was reviewed on a case-by-case basis, and the moderators would take a user’s personal situation into account. For example, if a user informed those at the Orlando Sentinel that he or she should not be banned because he or she had never received a warning, the moderator would give the user the benefit of the doubt and reinstate the account. The moderator also recognized that the paper covered topics that received national readership, so a new user to the site might not be familiar with the way that the paper operated its website. The Primary Moderator also said that the team was sensitive to cultural differences.

Primary Moderator: You have examples of people who are from different parts of the nation or from different parts of the world, and they may put something that, it’s not acceptable for us, but it may be more acceptable in their culture. So sometimes you can kind of get that feeling from somebody and you just tell them, “Hey, this is the way you have to do it if you want to participate on the boards” and give them a chance.

At the time of this study the Primary Moderator checked flagged comments at least every two hours throughout the work shift, which was from 9 a.m. until 5 p.m. on weekdays. The Primary Moderator said comments were “always looked at every day,” and if the Primary Moderator was not at work then someone else from the team reviewed flagged comments.


Moderation Advice and the Future of Online Commenting

The Online Content Manager at The Orlando Sentinel recommended that any media organization implementing comment moderation policies use the commenting option as a way of engaging its audience and ask users to actively participate. However, while the manager advised organizations to give users wide latitude with commenting, moderators should not allow bullying or malicious attacks on the site. The Orlando Sentinel’s Primary Moderator advised media organizations to set up clear, specific terms of service, place the policy somewhere visible on the website and ensure that everyone on the moderating team applies the policy consistently.

The Online Content Manager explained that social media plugins would be the future of online commenting in the years following 2013 and said that users might be more even-tempered when commenting from their social media profiles. The Primary Moderator said that websites using social media plugins might see a decrease in the number of comments, as users might not want to post certain opinions, such as political stances, using their real names out of fear that someone, such as an employer, might read them.

Primary Moderator: I think the determination would be that you might try something like that out and if it doesn’t work we would go back to the way that we did it before. And that’s what the Internet is all about. In my opinion it would be great as a moderator, but I do think that the comments would drop off and we would get fewer comments on stories if people had to sign in with their real names.


CHAPTER 5
DISCUSSION

This research set out to determine how online news sites moderate user comments and what issues journalists consider when moderating comments. Through a series of interviews with six professionals associated with four newspapers, this study found that news sites relied on crowd-sourced moderation to both monitor comments and encourage community. Users would flag comments that potentially violated the commenting policies that were developed by the newspapers’ parent companies.

Moderators would then review flagged comments and typically remove ones that contained personal attacks, profanity and hate speech, as these comments did not contribute to the constructive community environment. These processes are consistent with how journalists reviewed letters to the editor and suggest that certain journalistic practices for promoting public dialogue and interactivity have simply shifted to an online medium.

As the Web’s significance grew in the 21st Century, the medium also continued to impact journalism. Not only did an increasing number of newspapers have an online presence, the Internet gave members of the public new ways to interact with journalists and each other. In fact, in 2012 the Pew Research Center for the People and the Press found that an increasing number of Americans received news online, specifically from social networks, as “the percentage of Americans saying they saw news or news headlines on a social networking site yesterday had doubled” (“In Changing News Landscape, Even Television is Vulnerable,” 2012, para. 2).

One interactivity feature, the online comment section, was popular among news organizations and readers alike. Commenting increased the interaction between the public and journalists, gave commenters the opportunity to leave feedback on news, and let readers respond to each other’s comments. However, allowing online commenting forced news organizations to deal with comments that contained hate speech, libelous language, spam, off-topic dialogue and incivility.

These aforementioned features of online commenting led professionals at news organizations to moderate comments, which is an extension of the journalist’s traditional role as a gatekeeper.

The New Gatekeeper

The gatekeeping theory in mass communication refers to the series of filters that content, such as news stories, letters to the editor and photographs, passes through before it is presented to the public. This process includes an item of information passing from the source, to the reporter, to various editors and finally to the public (Shoemaker, Eichholz, Kim & Wrigley, 2001).

Although gatekeeping traditionally meant that journalists had a role in what news was presented to the public and how it was presented, journalists still hold on to the gatekeeping role when they moderate comments on the Internet. Corporations and news organizations start the gatekeeping process when they develop a commenting policy. Individual moderators at news sites continue the process when they determine what comments stay on the website and what comments are removed. In addition, comments may also go through literal filters before they are even posted on a site. In this instance, a reader who may have a valid comment may be silenced if he or she uses profanity or a racial slur. If a comment is not presented in the manner that the news organization has established as acceptable, that comment may be blocked from being posted or deleted after it is posted.


Half of the moderators in this study indicated that it was not always easy to determine whether a comment should be unpublished from the news site. However, most comments that they said were removed had blatantly violated the newspapers’ posting policies. Comments that contained profanity, racial slurs and spam were removed. Personal attacks and libelous statements were also removed from the websites almost automatically. If a particular moderator could not decide whether to remove a comment, then he or she would discuss it with coworkers or supervisors to reach a consensus.

This gatekeeping process is similar to the way that editors determined which letters to the editor would be published in newspapers. Wahl-Jorgensen (2007) found that editors only rejected “personal attack letters that might result in libel suits” and letters that were “openly racist, sexist, or homophobic and do not in any way contribute to the public debate” (p. 87). These findings suggest that while the technology for interaction may change, the gatekeeping practices that journalists use stay intact.

While the gatekeeping practices of a moderator may be similar to those of a letters editor, the findings of this study also suggest the emergence of a new trend that can be viewed as reverse gatekeeping, in which user comments affect how journalists perform their jobs and the stories that are reported. For example, the Digital Manager at The Palm Beach Post stated that reporters at the newspaper would often read comments and discover tips, new story angles and future sources. This sort of reverse gatekeeping is not unique to The Palm Beach Post, as other reporters have also found online comments useful because they could change a journalist’s perspective about what is newsworthy and what is not (Santana, 2011). In reverse gatekeeping, the online community of readers and commenters has more control over Web content when compared to its level of control in print publications.

Building an Online Community

The findings of this research suggest that creating an online community where users feel comfortable interacting is important to online news sites. Many decisions that moderators made were intended to foster a safe environment for commenters and promote communication. This included removing comments that contained personal attacks and hate speech. In addition, moderators also communicated with users and provided explanations of the newspapers’ commenting policies. In the case of The Orlando Sentinel, a moderator emailed a user who violated the terms of service. In the email, the moderator explained the policy and warned the individual that a second unacceptable posting would result in a ban or, in the case of a user who had a second violation, explained why the user was being banned.

This communication between moderators and commenters serves to strengthen the online community. Not only do repeat commenters become familiar with each other as an online community grows, moderators also learn the user names and commenting habits of the community members.

Fostering an online community can result in several positive outcomes for a publication. In this digital age, where members of the public can get news from a variety of outlets, including national news websites, blogs, cable news networks, magazines and social media, a local newspaper has to find a way to attract and maintain readership. A newspaper with an established Web presence and an engaged online community may be inviting to readers who want to interact. An online community can also foster a sense of loyalty that makes commenters want to return to a particular news site.

Another positive related to repeat commenters is familiarity. Members of online communities are often familiar with the site’s moderation policy; therefore they may be less likely to violate it. Furthermore, if a member of the community does violate the policy, then he or she should not be surprised when the inappropriate comment is removed. Because users expect these consequences, they may respond in a more even-tempered way when comments are unpublished or when they are warned or banned.

Promoting community engagement also involved being conservative when removing comments. Several of the individuals interviewed as part of this study said that it was important not to suppress engagement by over-moderating. This is also similar to the approach letters editors used. Like those interviewed in this study, Wahl-Jorgensen (2007) found that while letters editors may be “uncomfortable with the tenor of the debate,” they viewed a “policy of limited editorial intervention as the only way to ensure an open and honest debate” (p. 155). Moderators in this study took the user’s posting history, the context of the conversation and cultural differences into account when reviewing flagged comments to ensure that the comment was truly unconstructive.

Future journalists and others implementing online moderation policies can foster community by encouraging comments, as online comments give users an opportunity for dialogue. Those wanting to build an online community should also promote freedom of expression and diverse opinions in commenting sections. However, comments should not discourage other users from participating; language that is blatantly sexist, racist or homophobic, or that personally attacks someone, does not contribute to a productive discourse.

Crowd-Sourced Moderation

All four of the newspapers in this study used crowd-sourced moderation, in which users could report comments that they felt violated the commenting policy. This crowd-sourced moderation strategy required users to register an account with a valid, working email address before they could comment or report comments. User registration allowed moderators to keep track of users who violated the commenting policy and users who reported comments. The Orlando Sentinel used this information to email a warning to users who violated the terms of service and to inform repeat offenders that they would be banned from the website.

This crowd-sourcing strategy also increased user engagement and gave readers more power to determine what content appeared on news sites. As Briggs (2010) stated, the collective experience of crowd-sourced moderation gives users a sense of ownership in their online communities. This was evident in the sports section of The Atlanta Journal-Constitution’s website, in which users formed a community and were not afraid to self-police within that community.

This type of moderation makes it possible for news organizations to moderate comments without the news staff being responsible for reviewing every single comment that is posted on the site. Crowd-sourced moderation can be particularly useful for news sites that may receive too many comments for moderators to review. At a time when newsrooms may lack funding for adequate staffing, crowd-sourced moderation can also be beneficial to sites that do not employ full-time moderators. All of the newspapers in this study employed moderators who were also responsible for other duties.

While the newspapers that used crowd-sourced moderation did not review every comment posted on their websites, the moderators at The Palm Beach Post and the Orlando Sentinel said that it was essential to check the flagged comments queue often.

This reinforces Chen, Xu and Whinston’s (2011) finding that the frequency of moderation has a significant effect on the performance of an online community (p. 261).

Users may begin to feel uncomfortable if a flagged comment has not been reviewed or removed in a timely manner. In addition, the conversation may shift from the article posted on a paper’s website to the lack of moderation.

Anonymity in Online Commenting

While users had to create accounts with valid email addresses on all of the newspaper sites in this study, a user did not have to register with his or her legal name. Instead, a commenter’s user name could be a pseudonym. Although previous research has indicated that anonymous comments tend to be more uncivil, three interviewees in this study mentioned that anonymous commenting could be positive. Interviewees from Cox Media Group, The Palm Beach Post and the Orlando Sentinel said that pseudonyms gave users the freedom to express valid views and opinions that they might not be comfortable having other people, such as employers, associate with them. Anonymity can be particularly beneficial when users are contributing to political discussions.

Although anonymity was mentioned as a positive feature, some of those interviewed said that the future of online commenting would involve news organizations using social media plugins, which would make online commenting less anonymous, as these plugins may allow a commenter’s social media followers to identify him or her. The Facebook plugin was gaining popularity among newspaper sites (Santana, 2012), and those who mentioned Facebook pointed to that particular plugin because Facebook was the largest social network at the time of this research. Not only would commenting through Facebook add another layer of user accountability, it would also give newspapers the opportunity to engage audiences on multiple social platforms. However, some interviewees said that anonymity was important enough not to adopt the Facebook plugin, which might even result in a decrease in comments.

The Role of Social Media in Online Commenting

While some moderators said that a social media plugin might be the future of online commenting, one newspaper had already made the transition to Facebook. The Dayton Daily News posted stories from its website onto its Facebook page for its more than 20,000 followers to read, like and comment on. Instead of devoting a large amount of time to moderating the few comments on the newspaper’s website, those at the newspaper focused more on moderating comments and engaging the audience on its Facebook page.

This type of interactivity could indicate a future in which newspaper commenting and social media have merged in a way that goes beyond a plugin. Meeting users where they are already participating in public discourse may increase interactivity between newspapers and their readers. In essence, journalists could engage with an audience that is already engaged. This social media interactivity also has the potential for community building. Additionally, newspapers may observe more civil discussions when users comment using social media accounts. A newspaper may even gain new readers with an active social media presence.


While social media has great potential for community engagement, certain platforms may also harm an online community. Social media plugins installed on news sites may take users out of the news site’s environment. For example, users who have to comment through a Facebook plugin may feel as though they are on Facebook’s website rather than a newspaper site. This was the reason given by the Digital Manager at The Palm Beach Post, who said that a Facebook plugin would pull readers out of the newspaper’s online experience. Social media networks can be online communities, but it is important for news sites to develop their own, individual online communities. Thus, technology similar to the Facebook plugin may be counterproductive to that goal.

From a technical standpoint, a social media plugin may not be compatible with a news site’s moderation system. In addition, moderators may not be able to filter comments, contact commenters or ban policy violators if they are using a plugin. This loss of control could have adverse effects on a newspaper’s online community, as users could repeatedly leave uncivil comments that could potentially discourage other users from interacting.

Suggestions for Future Research

This study found that moderators at newspapers try to foster a community environment on the newspapers’ websites. Future research should explore whether the level of community engagement correlates with civility in online comments. For example, such research would test whether engagement practices, such as the moderator commenting on stories or emailing users directly, resulted in more civil comments. The study could also examine whether a strong online community resulted in a wider online audience. Findings of such a study would aid journalists in the quest to create commenting sections where users feel comfortable enough to contribute to public discourse without the fear of being personally attacked. These findings would also be significant to journalism educators who teach students how to manage online communities.

While past research has addressed who wrote letters to the editor, future research should study who typically posts comments to news sites. Findings of such research would be valuable, as they would inform newsrooms about what portion of the population engages in this form of interactivity. These findings could also be compared to the demographics of typical letter writers.

All four of the newspapers included in this study followed moderation policies that were created by parent companies. Future research should examine how moderation policies are developed to determine whether this is a common practice. Findings of such a study would be significant to newsrooms and websites that are developing moderation policies, as well as to educators who are teaching journalism or law students how user terms are developed and implemented.

Research studying the inclusion of comment moderation in journalism programs would also be important. A study examining how curricula have changed and what techniques educators are using to teach comment moderation would be valuable to journalism schools that wish to incorporate the teaching of these important skills into their programs.

A future study might address the high job turnover within the journalism industry and whether the resulting loss of institutional knowledge has an effect on how comments are moderated. Such research would be useful for newsrooms in determining whether there should be a set training process for new hires in order to ensure consistent moderation practices.

The findings of this research are useful in that they may assist educators in preparing journalism students for careers as moderators, online editors and community managers, some of the new roles in the evolving media landscape. Online commenting has become an ever-present feature on news websites in the 21st century; therefore, it is becoming more likely that future journalists will need to possess moderation skills. This study has presented the process by which newspapers simultaneously moderate user-generated content and foster online communities.


APPENDIX A
COX MEDIA GROUP VISITOR AGREEMENT

REGISTRATION

To obtain access to certain services on our Service, you may be required to register with us. Children under the age of 13 may not register for the Service. You agree that the information you supply during that registration process will be accurate and complete and that you will not register under the name of, nor attempt to use this Service under the name of, another person. We reserve the right to reject or terminate any user name that, in our judgment, we deem offensive. You will be responsible for preserving the confidentiality of your password and will notify us of any known or suspected unauthorized use of your account.

USER-PROVIDED CONTENT

Your License to Us. By submitting material (including, but not limited to, any text, photos, video or other content) to us, you are representing that you are the owner of the material, or are making your submission with the express consent of the owner. By submitting any materials via this Service, you grant us, and anyone authorized by us, including, without limitation, our Affiliates, a perpetual, irrevocable, royalty-free, unlimited, worldwide, transferable, non-exclusive and unrestricted license to use, reproduce, modify, archive, publish, sell, exploit, display, create derivative works from, publicly perform, and otherwise distribute such material in any medium (whether now known or hereafter developed), in any manner we see fit, and for any purpose that we choose. The foregoing grant includes the right to exploit any proprietary rights in materials you submit to this Service, including, but not limited to, rights under copyright, trademark or patent laws that exist throughout the world. Without limiting the generality of the previous sentence, you agree that we may use, distribute, share or otherwise provide such material under any terms we see fit to any third party without the requirement of providing you any form of compensation. You also agree that we, and anyone authorized by us, may identify you as the author of any of your postings by name, email address or screen name, as we or they deem appropriate. We also reserve the right (but assume no obligation) to delete, move, or edit any postings that come to our attention that we consider unacceptable or inappropriate, whether for legal or other reasons. You understand that the technical processing and transmission of the Service, including content submitted by you, may involve transmissions over various networks, and may involve changes to the content to conform and adapt it to technical requirements of connecting networks or devices.

USE OF COMMUNICATIONS SERVICES

Specific Prohibited Uses. Without limiting the foregoing, we may immediately terminate your use of any Communications Service if you engage in any of the following prohibited activities:

 Uploading, posting, emailing, transmitting or otherwise making available any content that is unlawful, harmful, threatening, abusive, libelous, or obscene;

 Impersonating any person or entity, or falsely stating or otherwise misrepresenting your affiliation with a person or entity;

 Forging headers or otherwise manipulating identifiers in a manner that disguises the origin of any content you transmit through any Communications Service;

 Uploading, posting, emailing, transmitting or otherwise making available any content that you do not have a right to make available under any law or under any contractual or fiduciary relationship (such as inside information, proprietary and confidential information learned or disclosed as part of employment relationships or under nondisclosure agreements);

 Uploading, posting, emailing, transmitting or otherwise making available any content that infringes any patent, trademark, trade secret, copyright or other proprietary right of any party;

 Uploading, posting, emailing, transmitting or otherwise making available any unsolicited or unauthorized advertising, promotional materials, or any other form of solicitation, without our express written approval;

 Gathering for the purpose of "spamming" any email addresses that users post in our chat rooms, forums and other public posting areas;

 Uploading, posting, emailing, transmitting or otherwise making available any content or material that contains software viruses, worms or any other computer code, files or programs designed to interrupt, destroy or limit the functionality of any computer software or hardware or telecommunications or other equipment, or to cause a security breach of such software, hardware or telecommunications or other equipment;

 Posting fraudulent classified listings;

 Uploading or posting any off-topic or irrelevant material to any chat room or forum;

 Interfering with or disrupting any servers or networks used to provide the Communications Services, or disobeying any requirements, procedures, policies or regulations of the networks we use to provide the Communications Services;

 Violating any applicable local, state, national or international law, including, but not limited to (1) all applicable laws regarding the transmission of technical data exported from the United States or the country in which you reside, (2) regulations promulgated by the U.S. Securities and Exchange Commission, and (3) any rules of any national or other securities exchange, including, without limitation, the New York Stock Exchange, the American Stock Exchange or the NASDAQ;

 "Stalking" or otherwise harassing another;

 Instigating or encouraging others to commit illegal activities or cause injury or property damage to any person;

 Collecting or storing personal data about other users;

 Gaining unauthorized access to our Service, or any account, computer system, or network connected to this Service, by means such as hacking, password mining or other illicit means; or

 Obtaining or attempting to obtain any materials or information through any means not intentionally made available through this Service.


APPENDIX B
ORLANDO SENTINEL TERMS OF SERVICE

Registration. Registration is not required to view certain Content. However, you are required to register if you wish to post a comment or upload a video, or view certain other Content. If you become a Registered Member of OrlandoSentinel.com, you accept responsibility for all activities that occur under your Registration Account. You agree to provide true, accurate, complete, and correct information at the time of registration, and to promptly update this information as needed so that it remains true, accurate, complete, and correct. We reserve the right to terminate your access and use of OrlandoSentinel.com if individuals from more than one household access OrlandoSentinel.com using any single Registration Account. You are responsible for maintaining the confidentiality of your password and for restricting access to your computer so others outside your household may not access OrlandoSentinel.com using your name in whole or in part without our permission. If you believe someone has accessed OrlandoSentinel.com using your Registration Account and password without your authorization, e-mail us immediately at [email protected].

User Content Representations and Warranties. By placing material on OrlandoSentinel.com, including but not limited to posting content or communications to any OrlandoSentinel.com bulletin board, forum, blogspace, message or chat area, or posting text, images, audio files or other audio-visual content to the site ("User Content"), you represent and warrant: (1) you own or otherwise have all necessary rights to the User Content you provide and the rights to provide it under these Terms of Service; and, (2) the User Content will not cause injury to any person or entity. Using a name other than your own legal name in association with the submission of User Content is prohibited (except in those specific areas of OrlandoSentinel.com that specifically ask for unique, fictitious names).

User Content License. For all User Content you post, upload, or otherwise make available ("Provide") to OrlandoSentinel.com, you grant Tribune Interactive, Inc. ("TI"), its affiliates and related entities, including OrlandoSentinel.com and its affiliated newspapers, Web sites, and broadcast stations, a worldwide, royalty-free, perpetual, irrevocable, non-exclusive right and fully sub-licensable license to use, copy, reproduce, distribute, publish, publicly perform, publicly display, modify, adapt, translate, archive, store, and create derivative works from such User Content, in any form, format, or medium, of any kind now known or later developed. Without limiting the generality of the previous sentence, you authorize TI to share the User Content across all Web sites, newspapers, and broadcast stations affiliated with Tribune Company, to include the User Content in a searchable format accessible by users of OrlandoSentinel.com and other TI Web sites, to place advertisements in close proximity to such User Content, and to use your name, likeness and any other information in connection with its use of the material you provide. You waive all moral rights with respect to any User Content you provide to OrlandoSentinel.com. You also grant TI the right to use any material, information, ideas, concepts, know-how or techniques contained in any communication you provide or otherwise submit to us for any purpose whatsoever, including but not limited to, commercial purposes, and developing, manufacturing and marketing commercial products using such information. All rights in this paragraph are granted without the need for additional compensation of any sort to you.

User Content Screening and Removal. You acknowledge that OrlandoSentinel.com and/or its designees may or may not pre-screen User Content, and shall have the right (but not the obligation), in their sole discretion, to move, remove, block, edit, or refuse any User Content for any reason, including without limitation that such User Content violates these Terms of Service or is otherwise objectionable.

User Content Assumption of Risk. OrlandoSentinel.com cannot and does not monitor or manage all User Content, and does not guarantee the accuracy, integrity, or quality of User Content. All User Content provided to OrlandoSentinel.com is the sole responsibility of the person who provided it. This means that you are entirely responsible for all User Content that you provide. To protect your safety, please use your best judgment when using OrlandoSentinel.com forums. We discourage divulging personal phone numbers and addresses or other information that can be used to identify or locate you. You acknowledge and agree that if you make such disclosures either through posting on any bulletin board, forum, blogspace, message or chat area, or uploading text, images, audio files or other audio-visual content, in classified advertising you place or in other interactive areas, or to third parties in any communication, you do so fully understanding that such information could be used to identify you.

User Content Posting Rules. Any decisions as to whether User Content violates any Posting Rule will be made by OrlandoSentinel.com in its sole discretion and after we have actual notice of such posting. When you provide User Content, you agree to the following Posting Rules:

If the photo or video depicts any children under the age of 13, you affirm that you have written permission from the child's parent or guardian to provide the photo or video.

Do not provide User Content that:

 contains copyrighted or other proprietary material of any kind without the express permission of the owner of that material.

 contains vulgar, profane, abusive, racist or hateful language or expressions, epithets or slurs, text, photographs or illustrations in poor taste, inflammatory attacks of a personal, racial or religious nature.

 is defamatory, threatening, disparaging, grossly inflammatory, false, misleading, fraudulent, inaccurate, unfair, contains gross exaggeration or unsubstantiated claims, violates the privacy rights of any third party, is unreasonably harmful or offensive to any individual or community.


 violates any right of OrlandoSentinel.com or any third party.

 discriminates on the grounds of race, religion, national origin, gender, age, marital status, sexual orientation or disability, or refers to such matters in any manner prohibited by law.

 violates or encourages the violation of any municipal, state, federal or international law, rule, regulation or ordinance.

 interferes with any third party's uninterrupted use of OrlandoSentinel.com.

 advertises, promotes or offers to trade any goods or services, except in areas specifically designated for such purpose.

 uses or attempt to use another's Registration Account, password, service or system except as expressly permitted by the Terms of Service.

 uploads or transmits viruses or other harmful, disruptive or destructive files, material or code.

 disrupts, interferes with, or otherwise harms or violates the security of OrlandoSentinel.com, or any services, system resources, accounts, passwords, servers or networks connected to or accessible through OrlandoSentinel.com or affiliated or linked sites.

 "flames" any individual or entity (e.g., sends repeated messages related to another user and/or makes derogatory or offensive comments about another individual), or repeats prior posting of the same message under multiple threads or subjects.

WARNING: A VIOLATION OF THESE POSTING RULES MAY BE REFERRED TO LAW ENFORCEMENT AUTHORITIES.


APPENDIX C
THE PALM BEACH POST LETTERS TO THE EDITOR POLICY

Send a letter to The Post

We welcome original letters about issues of interest and material that has appeared in The Post. Letters are subject to editing and must include the writer’s name, address, e-mail address and daytime phone number.


APPENDIX D
THE ATLANTA JOURNAL-CONSTITUTION LETTERS TO THE EDITOR POLICY

Letters to the editor section

Your letter will be submitted to the Atlanta Journal-Constitution for publication in the newspaper. Please include your first and last names (no initials, please) and, for verification purposes only, your home address and both your daytime and nighttime telephone numbers. Use the link below or send an email to [email protected]

Also, we'd appreciate your including a dab of bio information, namely, what you do for a living. This bio-info is optional. We hope to hear from you soon.


APPENDIX E
DAYTON DAILY NEWS LETTERS TO THE EDITOR POLICY

Have your say

The Dayton Daily News welcomes letters to the editor. Letters should be 250 words or fewer. We need your full name, address and phone number for verification purposes. Due to volume, not all letters can be published. To publish as many letters as possible, they may be edited. No attachments, please. You can send your letter the following ways:

> Use the form below
> E-mail a letter to the editor
> Fax your letter to (937) 225-7302
> Mail: Letters to the editor, 1611 S. Main St., Dayton, OH 45409
> E-mail a Speak Up! item to [email protected]

We respect your privacy: Your street address, phone numbers, and email address will not be published. We require that information to verify that you are a real person and to contact you if necessary.


APPENDIX F
ORLANDO SENTINEL LETTERS TO THE EDITOR POLICY

Letters to the Editor

Letters must be exclusive to the Orlando Sentinel and include your full name, address and phone number. We publish only your name and home city. Letters should be no more than 250 words.

E-mail column submissions to [email protected].


APPENDIX G
INTERVIEW QUESTIONS

1. Please describe your involvement in implementing the comment moderation policy.

2. When did (publication) launch its website?

3. When did (publication) start allowing comments on its website?

4. Who moderates the comments on your website?

 Is moderation a full-time job?

5. How are moderators trained?

6. Please explain the process used to determine if a comment violates the comment policy.

7. What advice would you offer to other news organizations developing comment moderation policies?

8. What do you think is the next step in online comment policies?

 How will online publications address anonymity?

9. Is there anything else that you would like to share about the process of developing or implementing the comment moderation policy?


APPENDIX H
SAMPLE EMAIL

Hello (Potential Interviewee),

My name is Antionette Rollins and I am a graduate student in the University of Florida’s College of Journalism and Communications. As a journalist who is particularly interested in online publishing and user-generated content, I feel it is important to study the Web and the way it has transformed how we report and interact with the public.

I am currently working on my thesis, which seeks to understand the development and implementation of online comment moderation policies. In order to better understand these policies, I’m hoping to interview journalists and other professionals who took part in developing comment moderation policies, and also those who are responsible for moderation.

Because comment moderation is becoming increasingly important, educators may need to adjust curriculum to teach this important editing skill to future journalists. One of the goals of this thesis is to help journalism educators in that endeavor. (Referrer) suggested that I speak with you because of the knowledge you could provide on this subject. Although this is research, I feel that this thesis could have positive, practical implications.

We could conduct the interview, which will be recorded and transcribed, over Skype or phone at your convenience. The interview should take about 30 minutes. I would be extremely appreciative if you were able to participate.

Thank you and I look forward to speaking with you,

Antionette Rollins


REFERENCES

About. (n.d.). Retrieved from http://www.coxmediagroup.com/about/.

About Tribune Company. (n.d.). Retrieved from http://corporate.tribune.com/pressroom/?page_id=4200.

Alcindor, Y., Bello, M., & Copeland, L. (2012, March 21). In wake of black teen Trayvon Martin's death, America is soul-searching. USA Today. Retrieved from http://usatoday30.usatoday.com/news/nation/story/2012-03-20/trayvon-martin-teen-shot-stereotypes/53677634/1.

Allison, J. E. (2002). Technology, development, and democracy: International conflict and cooperation in the information age. Albany: State University of New York Press.

Santana, A. D. (2011). Online readers' comments represent new opinion pipeline. Newspaper Research Journal, 32(3), 66.

Banks, J. (2010). Regulating hate speech online. International Review of Law, Computers & Technology, 24(3), 233-239. doi: 10.1080/13600869.2010.522323.

Berkowitz, D. A. (1997). Social meanings of news: A text-reader. Thousand Oaks, Calif: Sage Publications.

Bhattarai, A., Rus, V., & Dasgupta, D. (2009). Characterizing comment spam in the blogosphere through content analysis. Paper presented at the Computational Intelligence in Cyber Security, 2009. CICS'09. IEEE Symposium on, 37-44.

Bishop, J. (2009). Enhancing the understanding of genres of web-based communities: The role of the ecological cognition framework. International Journal of Web Based Communities, 5(1), 4-17.

Blood, R. (2003). Weblogs and journalism: Do they connect? Nieman Reports, 57(3), 61-63. Retrieved from https://search.ebscohost.com/login.aspx?direct=true&db=ufh&AN=10976577&site=ehost-live.

Botelho, G. (2012). What happened the night Trayvon Martin died. Retrieved from http://www.cnn.com/2012/05/18/justice/florida-teen-shooting-details.

Boyd, D. M., & Ellison, N. B. (2007). Social network sites: Definition, history, and scholarship. Journal of Computer-Mediated Communication, 13(1), 210-230. doi: 10.1111/j.1083-6101.2007.00393.x.

Braun, J., & Gillespie, T. (2010). Hosting the public discourse: News organizations, digital intermediaries, and the politics of making newsmedia social. Paper presented at the 11th International Symposium on Online Journalism, Austin, TX.


Briggs, T. (2010). Social media's second act: Toward sustainable brand engagement. Design Management Review, 21(1), 46-53. doi: 10.1111/j.1948-7169.2010.00050.x.

Brown, J., Broderick, A. J., & Lee, N. (2007). Word of mouth communication within online communities: Conceptualizing the online social network. Journal of Interactive Marketing, 21(3), 2–20. doi: 10.1002/dir.20082.

Chen, J., Xu, H., & Whinston, A. B. (2011). Moderated online communities and quality of user-generated content. Journal of Management Information Systems, 28(2), 237-268. doi: 10.2753/MIS0742-1222280209.

Christopherson, K. M. (2007). The positive and negative implications of anonymity in Internet social interactions: “On the internet, nobody knows you’re a dog”. Computers in Human Behavior, 23(6), 3038-3056. doi: 10.1016/j.chb.2006.09.

Citron, D. K., & Norton, H. L. (2011). Intermediaries and hate speech: Fostering digital citizenship for our information age. Boston University Law Review, 91(1435).

Coffey, B., & Woolworth, S. (2004). Destroy the scum, and then neuter their families: The web forum as a vehicle for community discourse? The Social Science Journal, 41(1), 1-14. doi:10.1016/j.soscij.2003.10.001.

Cohen, K. R. (2006). A welcome for blogs. Continuum: Journal of Media & Cultural Studies, 20(2), 161-173. doi: 10.1080/10304310600641620.

Johnson, D. (2000). Anonymity and the Internet. The Futurist, 34(4), 12.

Daniels, J. (2009). Cloaked websites: Propaganda, cyber-racism and epistemology in the digital era. New Media & Society, 11(5), 659-683. doi: 10.1177/1461444809105345.

Daniels, J. (2010). Cyber racism: White supremacy online and the new attack on civil rights. Journal of Popular Culture, 43(5), 1137. doi: 10.1111/j.1540-5931.2010.00790_5.x.

Daniels, J. (2009). Cyber racism: White supremacy online and the new attack on civil rights. Lanham: Rowman & Littlefield Publishing Group, Inc.

Davis, R. (2009). Typing politics: The role of blogs in American politics. Oxford: Oxford University Press.

Diakopoulos, N., & Naaman, M. (2011). Towards quality discourse in online news comments. New York, NY, 133-142. doi: 10.1145/1958824.1958844.


van Dijk, T. A. (1992). Racism, elites, and conversation. Atlantis: Revista De La Asociación Española De Estudios Anglo-Norteamericanos, 14(1-2), 201-257.

DiMaggio, P., Hargittai, E., Neuman, W. R., & Robinson, J. P. (2001). Social implications of the internet. Annual Review of Sociology, 27, 307-336. doi: 10.1146/annurev.soc.27.1.307.

Domingo, D., Quandt, T., Heinonen, A., Paulussen, S., Singer, J. B., & Vujnovic, M. (2008). Participatory journalism practices in the media and beyond. Journalism Practice, 2(3), 326-342. doi: 10.1080/17512780802281065.

Ehrlich, P. (2002). Communications decency act section 230. (Harmful speech regulation). Berkeley Technology Law Journal, 17(1), 401.

Ellis, D., Oldridge, R., & Vasconcelos, A. (2004). Community and virtual community. Annual Review of Information Science and Technology, 38, 145-186.

Entman, R. M., & Bennett, W. L. (2001). Mediated politics: Communication in the future of democracy. Cambridge, UK: Cambridge University Press.

Ethics. (n.d.). Retrieved from http://www.spj.org/ethics.asp.

Faridani, S., Bitton, E., Ryokai, K., & Goldberg, K. (2010). Opinion space: A scalable tool for browsing online comments. Paper presented at the Proceedings of the 28th International Conference on Human Factors in Computing Systems, 1175-1184.

Gunter, B., Campbell, V., Touri, M., & Gibson, R. (2009). Blogs, news and credibility. Aslib Proceedings, 61(2), 185-204. doi: 10.1108/00012530910946929.

Gyongyi, Z., & Garcia-Molina, H. (2005). Web spam taxonomy. Paper presented at the First International Workshop on Adversarial Information Retrieval on the Web (AIRWeb 2005).

Hardaker, C. (2010). Trolling in asynchronous computer-mediated communication: From user discussions to academic definitions. Journal of Politeness Research: Language, Behavior, Culture, 6(2), 215-242. doi: 10.1515/jplr.2010.011.

Hermida, A., & Thurman, N. J. (2009). Comments please: How the British news media are struggling with user-generated content. 8th International Symposium on Online Journalism, April 3, 2009, 1-28.

Hermida, A., & Thurman, N. J. (2008). A clash of cultures: The integration of user-generated content within professional journalistic frameworks at British newspaper websites. Journalism Practice, 2(3), 343-356.

Herring, S. C., Scheidt, L. A., Wright, E., & Bonus, S. (2005). Weblogs as a bridging genre. Information Technology & People, 18(2), 142.


Herring, S., Job-Sluder, K., Scheckler, R., & Barab, S. (2002). Searching for safety online: Managing "trolling" in a feminist forum. Information Society, 18(5), 371-384. doi: 10.1080/01972240290108186.

Hill, J. H. (2008). The everyday language of white racism. Wiley-Blackwell.

History. (n.d.). Retrieved from http://www.coxenterprises.com/about-cox/history.aspx#.URsoctCYYb6.

Holcomb, J., Gross, K., & Mitchell, A. (2011). How mainstream media outlets use Twitter. Retrieved from http://www.journalism.org/analysis_report/how_mainstream_media_outlets_use_twitter.

How to submit a letter to the editor. (n.d.). Retrieved from http://www.nytimes.com/content/help/site/editorial/letters/letters.html.

Hurwitz, J., & Peffley, M. (2005). Playing the race card in the post-Willie Horton era: The impact of racialized code words on support for punitive crime policy. The Public Opinion Quarterly, 69(1), 99-112. doi: 10.1093/poq/nfi004.

Hwang, H. (2008). Why does incivility matter when communicating disagreement?: Examining the psychological process of antagonism in political discussion. ProQuest.

In changing news landscape, even television is vulnerable. (2012). Retrieved from http://www.people-press.org/2012/09/27/in-changing-news-landscape-even-television-is-vulnerable/.

Josey, C. S. (2010). Hate speech and identity: An analysis of neo racism and the indexing of identity. Discourse & Society, 21(1), 27-39. doi: 10.1177/0957926509345071.

Johnson, K. A., & Wiedenbeck, S. (2009). Enhancing perceived credibility of citizen journalism web sites. Journalism & Mass Communication Quarterly, 86(2), 332-348. doi: 10.1177/107769900908600205.

Keyes, S. (2009). Fiery forums. Retrieved from http://tae.asne.org/StoryContent/tabid/65/id/458/Default.aspx.

Like. (n.d.). Retrieved from https://www.facebook.com/help/like.

Lulofs, N. (2012). The top U.S. newspapers for March 2012. Retrieved from http://accessabc.wordpress.com/2012/05/01/the-top-u-s-newspapers-for-march-2012/.

Leccese, M. (2009). Online information sources of political blogs. Journalism and Mass Communication Quarterly, 86(3), 578.


Munksgaard, D. C. (2010). Warblog without end: Online anti-Islamic discourses as persuadables (Unpublished doctoral dissertation). University of Iowa. Retrieved from http://ir.uiowa.edu/etd/715.

Nardi, B. A., Schiano, D. J., & Gumbrecht, M. (2004). Blogging as social activity, or, would you let 900 million people read your diary? Proceedings of the 2004 ACM Conference on Computer Supported Cooperative Work, 222-231. doi: 10.1145/1031607.1031643.

Our history. (n.d.). Retrieved from http://projects.ajc.com/services/info/history/.

Perez-Pena, R. (2010). News sites rethink anonymous comments. Retrieved January, 2013, from http://www.nytimes.com/2010/04/12/technology/12comments.html?_r=1&.

Purcell, K., Rainie, L., Mitchell, A., Rosenstiel, T., & Olmstead, K. (2010). Understanding the participatory news consumer: How internet and cell phone users have turned news into a social experience. Pew Research Center, March.

Li, Q., Zhang, Y., Shi, M., & Luo, J. (2010). Can anonymity network increase the utilitarian in personal moral decision? Paper presented at the 2010 IEEE 2nd Symposium on Web Society (SWS), 181-184.

Raeymaeckers, K. (2005). Letters to the editor-A feedback opportunity turned into a marketing tool: An account of selection and editing practices in the Flemish daily press. European Journal of Communication, 20(2), 199-221. doi: 10.1177/0267323105052298.

Rafaeli, S., & Sudweeks, F. (1997). Networked interactivity. Journal of Computer-Mediated Communication, 2(4).

Rajendran, B., & Pandey, A. K. (2012). Contextual strategies for detecting spam in academic portals. (pp. 250-256). Berlin, Heidelberg: Springer Berlin Heidelberg. doi: 10.1007/978-3-642-27308-7_26.

Reader, B., Stempel, G. H. I., & Daniel, D. K. (2004). Age, wealth, education predict letters to editor. Newspaper Research Journal, 25(4), 55.

Resources page. (n.d.). Retrieved from http://asne.org/content.asp?contentid=19.

Rich, C. (2004). Writing and reporting news: A coaching method. Belmont: Thomson Wadsworth.

Ruiz, C., Domingo, D., Micó, J. L., Díaz-Noci, J., Meso, K., & Masip, P. (2011). Public sphere 2.0? The democratic qualities of citizen debates in online newspapers. The International Journal of Press/Politics, 16(4), 463-487. doi: 10.1177/1940161211415849.


Santana, A. D. (2012). Civility, anonymity and the breakdown of a new public sphere. University of Oregon. ProQuest Dissertations and Theses, 163. Retrieved from http://search.proquest.com/docview/1038153023?accountid=10920.

Shoemaker, P. J. (2001). Individual and routine forces in gatekeeping. Journalism and Mass Communication Quarterly, 78(2), 233-246. doi: 10.1177/107769900107800202.

Siapera, E. (2008). The political subject of blogs. Information Polity, 13(1/2), 51.

Singer, J. B., & Ashman, I. (2009). “Comment is free, but facts are sacred”: User-generated content and ethical constructs at The Guardian. Journal of Mass Media Ethics, 24(1), 3-21. doi: 10.1080/08900520802644345.

Slocum, F. (2001). White racial attitudes and implicit racial appeals: An experimental study of ‘Race coding’ in political discourse. Politics & Policy, 29(4), 650-669. doi: 10.1111/j.1747-1346.2001.tb00609.x.

Tecklenburg, J. (2012). Online comments: Let's try this again. Retrieved from http://thegazette.com/2012/09/12/online-comments-lets-try-again/.

The history of the Orlando Sentinel. (2004). Retrieved from http://articles.orlandosentinel.com/2004-01-01/features/0401020323_1_sentinel-communications-morning-sentinel-sentinel-star.

Bonfiglio, T. P., & Hill, J. H. (2011). The everyday language of white racism. Cambridge: Cambridge University Press. doi: 10.1017/S0047404511000753.

Tribune Company history. (n.d.). Retrieved from http://corporate.tribune.com/pressroom/?page_id=2313.

Tribune Company timeline. (n.d.). Retrieved from http://corporate.tribune.com/pressroom/?page_id=2315.

Wahl-Jorgensen, K. (2001). Letters to the editor as a forum for public deliberation: Modes of publicity and democratic debate. Critical Studies in Media Communication, 18(3), 303-320. doi: 10.1080/07393180128085.

Wahl-Jorgensen, K. (2007). Journalists and the public: Newsroom culture, letters to the editor, and democracy. Cresskill, N.J: Hampton Press.

Webb, T. J. (2010). Verbal poison—Criminalizing hate speech: A comparative analysis and a proposal for the American system. Washburn Law Journal, 50, 445-482.


BIOGRAPHICAL SKETCH

Antionette Rollins received a Master of Arts in Mass Communication from the University of Florida, with a specialization in journalism, in the spring of 2013. She received a Bachelor of Arts in journalism from the University of Georgia in 2010, where she majored in magazines. Her primary research interests include online journalism and user-generated content.
