A Keyword Entry on “Commercial Content Moderators”

Alyssa Miranda

Abstract

In today’s digital economy, our enthusiasm for social media platforms often causes us to overlook a hidden network of labourers who work to sustain friendly online spaces where people can express themselves creatively and feel safe from hostile behaviour. These individuals, referred to as “commercial content moderators,” play a crucial role in balancing the diverse interests of site visitors, business corporations, and content creators. Commercial content moderators regularly endure undesirable working conditions and psychological stress, which can take a heavy toll on the quality of their personal lives and raises ethical concerns about labour regulations in the countries where they are employed. This paper examines the role of commercial content moderators in scholarly, policy, and public contexts. It also identifies experts, policy events, and key resources pertaining to the invisible pool of knowledge workers in this industry.

Keywords: commercial content moderation, precarity in digital labour, online harassment, human rights

Introduction

Keyword studies were pioneered by University of Cambridge academic Raymond Williams, who, after serving abroad in World War II, became fascinated by terms whose meanings shifted with the culture and time period in which they were used (Williams, 1985, pp. 11-12). Today, keywords are increasingly becoming an important staple of the information society (Peters, 2016, p. xiii). Not only do they seek to define and explain the relevance of prevalent digital phenomena, but they also gather a collection of resources that serve as a point of reference for further reading by scholars and other interested readers (Peters, 2016, pp. xiv-xv). This keyword entry will examine the usage of the term “commercial content moderators” in scholarly, policy, and public contexts. In particular, it will investigate how commercial content moderators are instructed to use their discretion in a discriminatory manner for profit, the exploitation and precarity that characterize this form of digital labour, and the tensions that arise when balancing users’ rights to freedom of expression with their equal rights to protection from online harassment.

The Employment of Discriminatory Practices for Profit

In scholarly discourse, commercial content moderators are viewed as “digital gatekeepers” for social media platforms and websites (Roberts, 2016b, p. 148). They play an important role in protecting the reputation of company brands by deleting inappropriate material from user-generated content posted by site visitors or members (Roberts, 2016b, p. 147). The two categories of content moderation used by most social media companies are active moderation and reactive moderation (Chen, 2014). “Active moderation” involves the real-time screening of every post before it is uploaded to the platform, a time-consuming and laborious task (Chen, 2014). During this process, commercial content moderators frequently refer to a list of guidelines containing keywords such as gore, pornography, and racism to remind themselves of what they are looking for in moments of uncertainty (Chen, 2014).
By contrast, “reactive moderation” only requires commercial content moderators to filter posts that users have flagged as obscene, judged against site regulations, local norms, and values (Chen, 2014; Roberts, 2016b, p. 147). The rigorous methods and proactive reasoning behind active moderation have made it more common than reactive moderation among social media platforms like Facebook and YouTube (Chen, 2014).

When deciding whether to delete ambiguously offensive user-generated content, moderators are required to consider the potential virality of these posts (Roberts, 2016b, p. 150). The term “virality” refers to content that has accumulated millions of views, comments, likes, shares, and fan parodies as a result of its online popularity (Roberts, 2016b, p. 150). If a questionable post is likely to go viral, commercial content moderators are instructed to allow it onto the company’s website or platform, because this will increase the brand’s recognition in mainstream media and thus boost its revenue (Roberts, 2016b, p. 150). However, these business-motivated decisions are often made at the expense of vulnerable minority groups, as many posts that go viral contain humour that is racist, sexist, classist, or homophobic (Roberts, 2016b, p. 150). For example, a news report featuring Antoine Dodson, the brother of a rape victim, went viral on YouTube because of the humour viewers found in his accent, flamboyant mannerisms, and use of slang, even though the video offended many African-Americans, members of the LGBTQ community, and members of the working class (Roberts, 2016b, p. 153). This raises many ethical concerns about profit-driven practices whose discriminatory effects against minority groups remain hidden, since very few people are aware of the work that moderators do (Roberts, 2016b, p. 156).

Exploitation and Precarity in Digital Labour

In a policy-related context, active moderation services for social media websites and platforms are usually outsourced to countries like the Philippines because of soft labour laws and a lack of enforcement mechanisms for legal regulations (Chen, 2014). Commercial content moderators in developing countries are often paid poorly and work long hours despite the important and specialized work they are responsible for (Chen, 2014). Moderators must familiarize themselves with the culture that the majority of their audience identifies with in order to correctly interpret the intentions behind each post within a matter of seconds (Chen, 2014). Furthermore, the disturbing text and imagery containing pornographic, violent, and/or morally objectionable content that commercial content moderators encounter on a daily basis can have a severe psychological effect on their wellbeing (Chen, 2014). Moderators reportedly experience symptoms such as an overactive sex drive or an irrational paranoia towards other people, which heavily interferes with their personal and professional lives by damaging their relationships with partners and colleagues (Chen, 2014).
In an interview with Wired magazine, Jane Stevenson, the founder of an occupational health and wellbeing service called Workplace Wellbeing, equated these adverse effects with the post-traumatic stress disorder suffered by child pornography and anti-terrorism investigators in law enforcement (Chen, 2014). Stevenson explains that although the police department provides its agents with extensive services to mitigate the emotional damage caused in their line of work, companies offer the moderators they employ either inadequate psychiatric support or none at all (Chen, 2014). Moderators are also unable to discuss the psychological stress of their work with family or friends because of the strict non-disclosure agreements they sign at the start of their contracts, and they often cannot afford the services of external psychologists (Roberts, 2016a). Low wages, long hours, and an inability to confide in trusted individuals often lead commercial content moderators to leave their positions for less onerous work (Chen, 2014). Nonetheless, the industry itself remains highly esteemed, given its affiliation with prestigious brands (Chen, 2014). Policies and regulations need to be implemented to address the abuse and negligence towards moderators that are prevalent within this industry.

Freedom of Expression and Online Harassment

Public dialogue about commercial content moderators focuses on their capacity to balance users’ rights to freedom of expression with other users’ rights to protection from hate speech and virtual harassment (Roberts, 2016b, p. 152). The Internet is increasingly becoming a site for activism and for raising awareness of ongoing social issues (Roberts, 2016a). Consequently, moderators are under considerable pressure to make the right decisions in such matters, which is why it is imperative that they be well-versed in the norms and values of the societies that have access to the content they are evaluating (Chen, 2014). If they incorrectly categorize a social activist’s post as offensive, or fail to remove an obscene comment directed at another user, their client company will receive significant backlash from consumers and media outlets, damaging its reputation and societal influence. For example, feminists have repeatedly expressed outrage over Facebook’s removal of photographs of mothers breastfeeding their children on the grounds that such images violate the site’s community guidelines governing nudity, a practice that has attracted a large amount of negative publicity (Gibbs, 2016). Although the decisions they make are subject to much critique, moderators play an important public role by mediating conflicting user interests in online communities.

Knowledge
