Rotten Tomatoes and Chill? MRAs and Their Impact on Decision-Making

Sharon Allman and Jenny Lee-De Medeiros

Abstract: The purpose of this research was to examine whether young adults (aged 18-32) look at user- and/or critic-generated movie review aggregates (MRAs) to decide which film to watch, or whether other factors impact their decision-making. The literature on this topic most notably shows a correlation between highly rated movies and better box office results, a preference for MRAs, and potential market benefits of MRAs. This research, which focused on the North American context, combined quantitative and qualitative methods in the form of an online survey, focus groups, and key informant interviews. The results indicate that MRAs are not the preferred method of deciding what movie to watch; instead, factors such as recommendations from family or friends and a film's marketing most affect young adults' decisions about which films to watch.

Keywords: movie review aggregate, MRA, movies, ratings, rating metric, scoring system

DOI 10.33137/ijournal.v6i1.35269

© 2020 Allman, S., Medeiros, J. Rotten Tomatoes and Chill? MRAs and Their Impact on Decision-Making. This is an Open Access article distributed under CC-BY.

iJournal, Vol 6, No. 1

Introductory Statement of Significance

Rotten Tomatoes is a movie review aggregate (MRA) available online that assigns a score to a movie based on critic and/or user reviews. According to Alexa, a web traffic estimator, it is the 143rd most popular website in Canada, while IMDB, a website with user-generated reviews, is the 26th most popular (Alexa, n.d.). Media corporations emphasize the impact of the scores which websites like Rotten Tomatoes and IMDB assign to new film releases.
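As a point of reference, the kind of critic-based score an MRA reports can be modelled in simplified form as the percentage of reviews classified as positive. The sketch below illustrates that general idea only; it is not Rotten Tomatoes' actual scoring implementation, and the function name is our own.

```python
# Simplified illustration of a critic-based MRA score: the percentage of
# reviews classified as positive, reported on a 0-100 scale. This is a
# sketch of the general idea, not Rotten Tomatoes' actual scoring code.
def aggregate_score(reviews_positive: list) -> float:
    """Return the share of positive reviews as a 0-100 score."""
    return 100 * sum(1 for positive in reviews_positive if positive) / len(reviews_positive)

# Example: 7 of 10 critics rate the film positively.
print(aggregate_score([True] * 7 + [False] * 3))  # → 70.0
```

A user-generated score could be modelled the same way over audience reviews; the distinction between the two populations of reviewers is what the research questions below probe.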
In fact, Fandango, a company that sells movie tickets in the U.S., purchased Rotten Tomatoes and now displays these scores beside each movie as consumers buy tickets online. This suggests an increased market interest in MRAs and the commercial impact they have on box office results. Various researchers have found a correlation between positive critic reviews and higher box office grosses, but very few have looked directly at the relationship between the act of visiting MRAs and the decision to see a movie (Terry et al., 2004; Hennig-Thurau et al., 2012). This research aims to fill this gap in the literature.

Aims and Objectives

Filling this gap in the literature will help explain why there seems to be a commercial benefit for movies that have high reviews. This is an area of growing interest to movie companies and executives, and it can have larger implications for films' box office performance and firm valuation. Specifically, we hope to answer the following questions: Do people trust and use MRAs? What is considered a good score for an MRA? Does a "good score" affect behaviour? What factors drive people to watch a movie, either in theaters or at home?

Literature Review

Currently the literature assessing the use of MRAs by North Americans falls into three main categories: preferences for movie aggregates, box office performance based on critics' movie reviews, and the perceived quality of movie reviews.

Preferences for Movie Aggregates

The preference for MRAs has been examined by Doguel and Xiaoming (2016), who found that U.S. participants demonstrated a preference for aggregate user information, favouring movies with high star ratings, compared to the movies with positive user reviews favoured by Singaporean participants.
This research suggests that MRAs are the preferred way for Americans to look at reviews for movies (Doguel & Xiaoming, 2016). Yep et al. compared review sites, social media, personal blogs, and instant messaging sites to discover the preferred electronic word of mouth (E-WOM) platform for movie reviews. They found that review sites, specifically Rotten Tomatoes, were the preferred E-WOM platform for movies (Yep et al., 2014).

Box Office Performance Based on Critics' Movie Reviews

Critics have had a long-standing role in the success of films on release through box office performance. Terry et al. examined different factors that affect the box office performance of movies and found that a critic approval rating of at least 40% and the number of Academy Awards a film received both correlated with box office success. A 10% increase in critic approval can add approximately $7.8 million in revenue for a movie, indicating that positive critics' reviews have a positive impact on viewer decision-making (Terry et al., 2004). Similarly, Hennig-Thurau et al. (2012) found that a high number of positive critics' reviews has a notable impact on long-term film success, especially for comedies and dramas.

Movie Reviews and Their Perceived Quality

Since movie reviews are often user-generated, the perceived quality of movie reviews is an important factor to consider, and one which has been studied by Koh et al. They examined three markets – China, Singapore, and the U.S. – and the differences between the online average user reviews and the perceived views of participants from the three markets. They found that reviews from the U.S. were subject to higher rates of under-reporting bias, whereby people with stronger opinions were more likely to leave reviews than people with more moderate opinions (Koh et al., 2010). This skews the results toward extreme scores compared to viewers' total perceived quality of the film. Ullah et al. (2015) approached the presence of such biases in user reviews using sentiment analysis; they examined the impact of reviews' emotional sentiments on their helpfulness to readers, using a variety of reviews from IMDB. They found that reviews with positive emotions were more likely to be perceived as helpful, while reviews with negative emotions were less likely to be perceived as helpful (Ullah et al., 2015).

Chintagunta et al. (2010) furthered this discussion by investigating the effects of online user reviews on movies' box office performance. They focused on the user-generated film review site Yahoo! Movies and argued that these word of mouth reviews have a large impact on viewer decision-making (Chintagunta et al., 2010). The authors found that the valence of the reviews, rather than their quantity, has the greater impact, corroborating the findings of Ullah et al. (2015). Chintagunta et al. (2010) also accounted for pre-release advertisement of a film as a confounding variable and used it as a contrast to the impact of the film reviews. It is interesting to note that in all three of these studies, the reviews which had a strong influence on viewer decision-making were user-generated.

Other Factors Affecting Movie Sales

Moretti chose to focus on social learning and found that movies with better-than-expected appeal in their first week of release tend to do better in their second week as a result of positive word of mouth. This impact is directly proportional to the size of an audience's social networks and could account for up to 32% of sales of a film with a positive surprise (Moretti, 2010).
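The approval-revenue figure reported by Terry et al. (2004) lends itself to a simple back-of-envelope calculation. The sketch below assumes the relationship is linear, which the study does not guarantee; it illustrates the magnitude of the reported figure rather than reproducing the authors' model, and the constant and function names are our own.

```python
# Back-of-envelope sketch of the Terry et al. (2004) figure: a 10-point
# increase in critic approval is associated with roughly $7.8 million in
# additional box office revenue. Linearity is assumed here purely for
# illustration; it is not a claim from the original study.
REVENUE_PER_APPROVAL_POINT = 7.8e6 / 10  # USD per percentage point (assumed linear)

def estimated_revenue_gain(approval_increase: float) -> float:
    """Estimated additional box office revenue (USD) for a rise in critic approval."""
    return approval_increase * REVENUE_PER_APPROVAL_POINT

print(estimated_revenue_gain(10))  # → 7800000.0
```

Even under this crude linear reading, the implied stakes for studios help explain the growing market interest in MRAs noted in the introduction.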
The existing literature on movie reviews and MRAs clearly indicates strong correlations between positive movie reviews and box office performance, but little of it directly examines whether North American – specifically Canadian – consumers rely on these movie reviews to make their movie-watching decisions. Some studies have indicated a correlation between other types of online reviews and consumer behaviours, but very little research has been done on MRAs, despite the commercial benefits of such research. Our own study attempts to fill the void between studies of the impact of user-generated reviews and of the professional reviews on a site like Rotten Tomatoes.

Research Design

Methods

This study employed both qualitative and quantitative research methods in a mixed-methods approach. Our quantitative tool was an online survey, completed by 292 participants. This survey also had some qualitative aspects, as participants could submit their own open-ended responses. Our primary qualitative tools were two focus groups conducted concurrently, as well as two key informant interviews conducted concurrently over email.

Tools/Instruments

Online Survey: An online survey was deployed using Q-fi, an online survey tool. Respondents were asked questions about their movie-going behaviours, their views on MRAs, and how they make movie-watching decisions. Most of the questions were ranking or 5-point Likert-scale questions.

Focus Groups: Two focus groups with five participants each were conducted concurrently using a semi-structured moderators' guide. Participants were chosen through convenience sampling, with recruiting posts made on the researchers' personal Facebook accounts.
Participants were asked about their movie-going habits and how they decide what movies to see, and they were given examples of movie scores from Rotten Tomatoes and asked to give their impressions of them.

Online Key Informant Interviews: Two email interviews were conducted, one with a participant who loved MRAs and one with a participant who disliked them. These participants were sourced from the researchers' personal networks. The interviews were conducted in two parts: first, questions specific to each participant were emailed to them in a Word document, which they filled out and emailed back to the researchers; then the researchers returned the document to the participant with additional follow-up questions, which the participant answered and sent back.