
December 13th, 2016

Dear IMDb Research Team,

We have attached the report you requested on our usability study of IMDb.com. We conducted our research over the course of ten weeks, from September to December, 2016.

The main findings of the report are:
● Usability issues with the navigation to genres
● Usability issues with the clickability of certain movie page features
● Usability issues with the amount of content located within the IMDb site

If you have any questions about the report, please do not hesitate to contact us.

Regards,
HCDE 417 Usability Research Team 4

Lydia Davison, [email protected]
Monica Lee, [email protected]
Karin Vaughan, [email protected]
Andy Truong, [email protected]

University of Washington Department of Human Centered Design and Engineering


IMDb.com Final Report
HCDE 417, Fall 2016

Group 4: Lydia Davison, Monica Lee, Karin Vaughan, Andy Truong


Table of Contents

Table of Contents 3

List of illustrations 5

Executive Summary 6

Summary of Recommendations 7

Introduction 8
The Internet Movie Database 8
An Amazon.com Company 8
Testing Team 8
Definitions of Unique Terms 8
Severity Ratings 8
Overview of the Document 9

Methods 9
Participants 9
Table 1: Participant Demographics 10
Table 2: Reasons for IMDb Use 10
Study Design 10
Task Scenarios 10
Procedures 12
Test Environment 13
Types of Data Collected 13
What We Did 14
Research Questions 14
Audience Analysis 14
Heuristic Evaluation 14
Recruitment 15
Planning 15
Pilot Test 15
Test Sessions 15
Data Compilation 16
Analysis 16
Figure 1: Notes on each of the five participants, separated by color. 16
Figure 2: Affinity analysis and grouping of notes on participants. 17
Summary 17


Key Findings 17
Navigation to Genres 17
Figure 3: Auto-search results for “comedy” 19
Figure 4: The homepage and advanced search feature 19
Clickability of Movie Pages 20
Figure 5: The movie suggestions section on IMDb for Monsters, Inc. 20
Amount of Content 21

Secondary Findings 22
Information Organization 22
Search Functionality 23
CTRL + F 24
Expand the “People who also liked” section 25
Placement of Storyline 25

Specific Heuristic Evaluation Findings 26
Device Responsiveness Consistency 26
Desktop vs. Mobile Site Consistency 26
Text Style Consistency 27
Back Navigation 27
Figure 6: A history of the sequence of events on Amazon.com (Amazon, 2016). 28
Help and Documentation 29

Conclusions and Final Recommendations 29

References 30

Appendices 31
Appendix A: Consent Form 31
Appendix B: Pre-Test Questionnaire 32
Appendix C: Introductory Script 35
Appendix D: Study Script 36
Appendix E: Homepage Warmup 39
Appendix F: Post-Test Questionnaire 40
Appendix G: Post-Test Interview 41
Appendix H: Tasks 43
Appendix I: Data-Logging Forms 46


List of Illustrations

Table 1: Participant Demographics 10
Table 2: Reasons for IMDb Use 10
Figure 1: Notes on each of the five participants, separated by color. 16
Figure 2: Affinity analysis and grouping of notes on participants. 17
Figure 3: Auto-search results for “comedy” 19
Figure 4: The homepage and advanced search feature 19
Figure 5: The movie suggestions section on IMDb for Monsters, Inc. 20
Figure 6: A history of the sequence of events on Amazon.com (Amazon, 2016). 28



Executive Summary

IMDb is a valuable resource for users seeking information on movies, celebrities, and other entertainment news. That is why we, Usability Test Group 4, believe it is essential for the platform to meet users’ usability expectations in order to perform optimally. Through heuristic evaluations, our team discovered several usability problems with IMDb that we wanted to address. Our goal for this study was to improve the usability of the website. By conducting five usability test sessions, we gained a better understanding of how users interact with the IMDb website and the issues they encountered. In this document, we also recommend solutions to these issues to improve the user experience and increase the site’s retention rate.

We identified usability issues with navigating to genres, with the clickability of certain movie page features, and with the amount of content located within the IMDb site. To improve navigation to genres, we recommend that IMDb include a movie genre search function and add genres to the “Movies, TV, and Showtimes” tab. To improve the clickability of movie pages, we recommend that IMDb increase the clickability of certain features of the movie pages, such as the “People who liked this also liked” section and the reviews area. To address the amount of content located within the IMDb site, we recommend that IMDb conduct further studies specifically targeted at understanding the content that users desire on the IMDb website.


Summary of Recommendations

Based on the usability issues that we identified within our usability study sessions, we recommend that the changes below be implemented within the IMDb website. The recommendations are based on our key findings, secondary findings, and heuristic evaluation findings.

The following recommendations should be implemented within the IMDb website:
● Include movie genres as search results and list them within the "Movies, TV, and Showtimes" tab.
● Run further studies to track user behavior on the website to better understand what content users desire and to eliminate undesired, overwhelming content.
● Improve the naming conventions of category titles and labels to properly reflect the category content.
○ Place an "Actors and Actresses" title above the column of actors' names and images, and a "Characters" title above the column of the character names they played in the movie.
● Outline clickable objects in red and enlarge them when users hover over them.
● Include a search bar within the cast list section of a movie page so users can find cast members easily.
● Expand the diversity of movie recommendations found in the "People who also liked" section of each movie page.
● Place the storyline section higher on the movie page, directly below the awards and nominations section.
● Make the mobile version of the website default to the desktop view so users have all the functionality of the desktop version.
● Make all clickable links blue.
● Add breadcrumbs on the desktop website to track the pages users visit and help them navigate back to previous pages, and allow users to swipe backwards to reach their previous page on the mobile site.
● Provide a link to a page-specific help page rather than a general help page.


Introduction

The Internet Movie Database

We are a group investigating IMDb, short for Internet Movie Database. IMDb is an online database that provides a myriad of information on entertainment, including movies, TV shows, and celebrities. IMDb provides information on ratings, plot summaries, and reviews. IMDb can be accessed online or through its mobile app.

IMDb can help users find a movie they may be interested in watching. Our objective is to study and improve the usability of navigation and make the information architecture and categorization clear and consistent to create a logical and efficient flow through the website. For the purposes of this study, we tested the usability of the desktop IMDb website on a Mac computer.

An Amazon.com Company

IMDb launched in 1990 and has been a subsidiary of Amazon.com since 1998.

Testing Team

Our testing team consists of Andy Truong, Karin Vaughan, Lydia Davison, and Monica Lee. All of the members of our team are undergraduate students and usability researchers in the Human Centered Design and Engineering (HCDE) department at the University of Washington.

Definitions of Unique Terms

Severity Ratings

The following severity ratings are based on a lecture presented on November 22, 2016 in the course HCDE 417: Usability Research Techniques (Sanocki & Zhang, 2016):
● Severity 1: An issue that blocks a substantial number of users from continuing their task
● Severity 2: An issue that blocks a substantial number of users from making use of an important feature or accomplishing an important task
● Severity 3: An issue that hinders some users from accomplishing a task or goal
● Severity 4: An issue that causes some users to become mildly frustrated or express minor complaints about an element

Overview of the Document

In this document, we outline the methods, results, findings, and design recommendations for our study of IMDb. In the Methods section, we describe the participants we tested with, the study design, test procedure, testing environment, types of data collected, and the research questions we focused on. In Results, we organize our data by themes and discuss how we analyzed our findings. In the Findings section, we outline our findings and design recommendations. After these sections, we summarize our results and conclude the study. This document contains several appendices, including important test materials.

Methods

The Methods section describes the details of the following:
● Participants
● Study Design
● Task Scenarios
● Procedures
● Test Environment
● Types of Data Collected
● What We Did:
○ Developed research questions
○ Audience analysis
○ Heuristic evaluation
○ Recruitment
○ Planning
○ Pilot test
○ Test sessions (including researcher roles and how data was collected)
○ Data compilation and summary

Participants

The demographics of each participant are summarized below for a better understanding of who tested the IMDb website. Participants are labeled with the letter “P” and their participant number so they can be quickly identified within our report. Table 1 and Table 2 below show more details about our participants’ demographics and their reasons for using IMDb.

Table 1: Participant Demographics

Participant | Age Range | Gender | Frequency of IMDb Use | Last Used IMDb
P1 | 18-24 | Female | Once a month | 1 month ago
P2 | 18-24 | Female | Once a month | 1 month ago
P3 | 18-24 | Male | A few times each year | 2-3 months ago
P4 | 18-24 | Female | Once a month | 1 month ago
P5 | 18-24 | Female | Once a month | 1 month ago

Table 2: Reasons for IMDb Use

Participant | Cast Information | Movie Reviews | Movies in Theaters | Related Movies | Trivia Information | Read “Parent’s Guide” | View Trailers

P1 X X

P2 X X X

P3 X X X X

P4 X X X X

P5 X X

Study Design

For our study, all five of the participants noted that they were infrequent users of IMDb. In terms of task order, all of the participants ran through the tasks in the following order: Task 1, Task 2, Task 3, and then Task 4.

Task Scenarios

Task #1


Scenario It’s Thanksgiving night, and you and your family want to stream a comedy movie on your laptop, but cannot decide which movie to watch. Find out what kinds of comedy movies there are on IMDb.

Starting State IMDb.com homepage

Expected Steps
1. Find the comedy genre list
2. Find a movie within the comedy genre that has at least a 7.0 rating
3. Select a comedy movie to watch
4. View the plot of the movie
5. After you’ve looked over the movie’s page, go back to the IMDb homepage

Goals See if the navigation of the website is intuitive and observe the different ways a user can navigate to complete the task.

Task #2

Scenario On Halloween, you and your friend watched “ 2.” You want to find another movie to watch and remember that you really enjoyed the actress who played the daughter, Judy, in the movie. Find another movie with that actress in it.

Starting State IMDb.com homepage

Expected Steps
1. Find the actress who plays Judy Warren
2. Find another movie with that actress
3. Select the movie
4. View the actress’s character in that movie
5. Once you’ve viewed the actress’s new character, go back to the IMDb homepage

Goals Assess the content organization of the website and see if the navigation is intuitive for users.

Task #3

Scenario You heard one of your friends talking excitedly about a new movie called “Arrival.” You think it sounds interesting. Look it up yourself to see what’s so good about it.

Starting State IMDb.com homepage

Expected Steps
1. Find the movie “Arrival”
2. Look at the reviews of “Arrival”
3. View what genres the movie is labeled as
4. Based on the information provided, decide if you want to watch “Arrival” by choosing to watch the movie on Amazon
5. After you’ve looked over the movie’s page, go back to the IMDb homepage

Goals
1. Assess the detailed information for “Arrival” to understand why your friend may have found it interesting.
2. Assess the ease of use of the website’s search feature and observe the types of information users look for in a movie.

Task #4

Scenario Think of a movie that you recently watched. Find a movie similar to it in order to help you decide what movie to watch next.

Starting State IMDb.com homepage

Expected Steps
1. Find the movie that was recently watched
2. View the information on that movie’s page
3. Find a movie similar to the movie that was recently watched
4. View reviews of that movie

Goals Assess how easy it is to find related movies.

Procedures

To ensure consistency across the participant study sessions, an introductory script and a study script were created for the moderator to read and follow. The introductory script welcomed the participant to the testing area and informed them of what we would be doing during that day’s study session. The moderator then asked participants if it would be alright to take screen, audio, and video recordings, and presented them with our consent form (see Appendix A). Once the consent form was signed, we started up LookBack and Skype on the laptop. Next, the participant was asked to complete the pre-test survey. After the participant completed the pre-test survey, the moderator asked whether they had any questions before reading the study script.

Before starting the session’s tasks, the moderator asked participants to complete a short think-aloud warmup, aimed at getting participants used to thinking aloud and helping them feel more comfortable doing so during the task sessions. After practicing thinking aloud, the moderator asked the participant to explore and comment on the IMDb homepage. Participants were asked not to click on anything, but they were free to scroll around on the homepage. After participants shared their thoughts about the homepage, the moderator asked the following questions:

● What do you think this website is for?
● What do you think you can do on this page?
● What strikes you about this site?

The note-taker recorded the participants’ answers to the above questions, while the moderator focused on asking the questions and probing deeper when an opportunity presented itself. After the homepage critique, the participants were asked to complete tasks one through four, taking as much time as they needed with each. Once a task was completed, participants were asked to fill out a short survey at the bottom of each task scenario. After all of the tasks were completed, participants were asked to fill out a post-test questionnaire. Once the post-test questionnaire was completed, the moderator asked participants several questions about their overall experience, their opinions, and their thoughts on navigating IMDb. To wrap up the study, the moderator asked the participant if they had any final questions or comments. The note-taker recorded the final questions asked by the moderator as well as any final questions or comments from the participant.

Test Environment

Each of the participant study sessions was conducted in a private conference room in the Foster Business Library, Odegaard Undergraduate Library, or Sieg Hall on the University of Washington Seattle campus. Only one participant at a time was scheduled for each study session. Each study session took approximately one hour. For each of the study sessions, there was at least one note-taker and one moderator. The test computer’s screen was shared with the note-taker through Skype’s screen-sharing function. Additionally, LookBack was used to record the participant’s screen, audio, and facial expressions. Lastly, we brought snacks and refreshments for our participants so that they would be comfortable during the study session.

Types of Data Collected For each study session, we recorded information on the following types of data:


● Comments made by participants
● What the participants hovered over
● Errors made by participants
● Number of hints given
● Success of task completion
● Steps taken to complete tasks
● Number of steps taken to complete all four tasks
● Questions asked by the moderator
● Frustration levels

What We Did

Research Questions

We addressed the following research questions in our usability study:

● Can users easily search for certain features, such as movies, genres, and actors, on IMDb’s website?
● Can users easily navigate to and access different movie pages?
● Can users easily understand the information that is presented on IMDb’s website?
● How intuitive is the categorization of the IMDb website?

Audience Analysis

Because our target users are young adults between the ages of 18 and 24, members of our testing team closely resembled the participants of this study. We analyzed and edited the wording of the screening survey, consent form, pre-test questionnaire, think-aloud warmup, tasks, post-test questionnaire, and interview questions to make them more readable, understandable, and appropriate for the intended age group.

Heuristic Evaluation

During the initial phases of planning our usability study, we intended to conduct our test on the mobile platform of IMDb. Each member of our testing team submitted an individual heuristic evaluation on October 14th, 2016. On October 15th, the testing team met to discuss and compile the results of our individual evaluations into one group heuristic evaluation, identifying common usability issues encountered on the IMDb website. However, after reflecting on feedback from our usability test plans, we decided to conduct the test on the desktop version of the IMDb website. The reason for this is that the mobile and desktop versions of IMDb had inconsistent navigation and content, so testing on the desktop version would more accurately reflect the more frequent use case of the IMDb platform.

Recruitment

We recruited participants for our usability study through an online screening survey, created with Google Forms, which was posted to public Facebook groups on November 10, 2016. We advertised a chance to win a $25 Amazon gift card for participants in our study. We then emailed qualified candidates on November 13th to schedule usability test sessions from November 14th to the 19th.

Planning

We planned the dates for conducting usability tests based on the availability of our team members and eligible participants. To assess our team’s availability, we created a shared availability document via the When2meet application. To see which participants were available at those times, we also made two Excel sheets: one compiling the survey responses about participants’ availability, and another that acted as a schedule of finalized participants for specific days.

Pilot Test

We conducted a pilot test on November 13th, going through the whole process with a volunteer participant. The pilot test consisted of going over the introductory script, giving out the consent form, administering the pre-test questionnaire, and going over the think-aloud warmup. Then, our team briefly went over the four tasks we wanted participants to complete, each followed by a short post-task survey. After the tasks, our team covered the post-test questionnaire and reviewed the final interview questions. After the pilot test, we had a better understanding of the flow of the usability test and reworded some tasks to make them easier for participants to understand.

Test Sessions

Including the pilot test, we conducted a total of six test sessions on November 15, 16, 17, and 19. In each test session, we had one moderator and at least two note-takers. Each session was run on a MacBook Pro, which also recorded the screen and the participant’s face (with their consent). Each recording was saved to LookBack to allow easy access when we needed to rewatch sessions. In each session, the moderator sat next to the participant while the note-takers sat at the opposite end of the table. The note-takers used another laptop to view the participant’s progress via screen sharing with Skype. While watching the participant on the other laptop, the note-takers also took handwritten notes on data-logging sheets (see Appendix I for the data-logging forms used to collect data during the test sessions).

Data Compilation

We began to analyze and compile data by collecting all of our data in one Excel spreadsheet, where we could compare participants side by side. Then, we performed an affinity analysis using Post-it notes.

Analysis

To fully dissect and analyze the information collected during the study, we performed affinity diagramming using Post-it notes, a whiteboard, and whiteboard markers. We each focused on one participant at a time, wrote comments and problematic points that the participant ran into on Post-its, and placed them on a whiteboard, as seen in Figure 1 and Figure 2 below. We each focused individually on a participant for five minutes. After time was up, we rotated to the next participant until each participant had been covered by 2-3 group members. We did this because we anticipated that each group member would notice different things. Next, we consolidated overlapping notes within participants, and then grouped similar notes across participants to identify the most widespread issues. We then analyzed each group and identified the severity of the various problems.

Figure 1: Notes on each of the five participants, separated by color.


Figure 2: Affinity analysis and grouping of notes on participants.

Summary

To inform potential development teams and other IMDb teams of the results of our usability study, we compiled the results from our data and affinity analysis into a short preliminary report detailing our participant demographics and the biggest issues we found while completing the study.

Key Findings

Based on our compiled data and our affinity analysis, we identified the results of our study and drew connections between our heuristic evaluations and our usability study results. We then grouped the issues into themes and assigned each theme a severity rating. Based on the evidence collected from our heuristic evaluation and usability study results, we provide recommendations at the end of each section.

Navigation to Genres

Severity 2

Issue: The most noticeable issue our participants experienced was navigating to the genre page. We believe this is a Severity 2 issue because sorting by genre is an important feature within IMDb, yet participants were unable to clearly identify genre lists and had to experiment with various difficult IMDb features in order to feel that they had completed the task.
● Heuristic Evaluation Results:
○ Search results for genres were inconsistent. For example, two of our group members typed “horror” into the search bar to search for movies in the horror genre. One member received results restricted to movies with the word “horror” in their titles, while the other saw what she expected to see: movies in the horror genre.
○ We later discovered that this was a difference between selecting an auto-search result and viewing the search results page. The horror genre only appears seasonally, around Halloween. This lack of consistency between seasons and search methods is frustrating.
● Usability Study Results:
○ All of the participants resorted to using the search bar when they could not find a tab dedicated to displaying genres.
■ P2, P4, and P5 searched “comedy” to find the comedy genre.
● This does not show the comedy genre as a search result, but rather a list of titles including the word “comedy,” as shown in Figure 3 below.
■ P1 used advanced search to see genre lists.
● The advanced search location is difficult to find, as shown in Figure 4 below.
■ P3 searched “top 100 IMDb” on Google, then sorted the resulting IMDb list by genre.
■ After failing to see comedy as a genre result anywhere else, P4 searched for a movie known in the genre to find similar ones.
○ P2 suggested that it would be nice to have a “Genres” tab near the top for convenience.
■ P2 said “it was hard to find the movies by genre.”
○ P4 never pressed enter when searching “comedy” and did not see it as an auto-search result. A preview of the auto-search results for “comedy” is shown in Figure 3 below.
○ On the post-test questionnaire, three out of five participants stated that they were not confident in their ability to find the information they needed.


Figure 3: Auto-search results for “comedy”

Figure 4: The homepage and advanced search feature

Recommendation: Because of the inconsistency of search results for movie genres, we recommend including movie genres as search results as well as under the “Movies, TV, and Showtimes” tab. Many participants first checked the navigation bar for genres, then searched for the movie genre (e.g., “comedy”). Search results, including genres, should continue to appear automatically while typing, for efficiency and helpful responsiveness. We believe this is a strong design recommendation because we also learned from our pre-test questionnaire that many users visit IMDb with a specific purpose, for example, finding more information on a specific movie. Furthermore, when searching “horror,” the horror genre only appears seasonally in the results. This builds an expectation that users can search genres; however, other genres like “action” or “comedy” do not show up, which makes the search function inconsistent and frustrating.

By adding movie genres as search results, users would be able to quickly access movies by genre through a more consistent and user-friendly search function. This improved search function, along with the navigation bar addition, helps the functionality of the IMDb website better match users’ expectations.


Clickability of Movie Pages

Severity 3

Issue: We believe this is a Severity 3 issue because participants were hindered from accessing the page they desired and experienced delays.
● Usability Study Results:
○ When completing Task #4, finding a movie similar to one they watched recently, four out of five participants immediately searched for a movie they had in mind, while one participant found a movie by searching for an actor.
○ All of the participants went to the movie page of their recently watched movie and looked at the “People who liked this also liked…” section, displayed in Figure 5 below, for movie suggestions.
○ P1 and P5 noted they expected to be brought to the corresponding movie pages when they clicked the movie cover images, highlighted in red boxes within this section; however, only the synopsis was shown. This reveals that user expectations for clicking movie cover images on this page were not met.
○ Another example of a clickability issue occurred when P1 clicked on the star icon, expecting to be brought to the movie review page, but it was not clickable. This is another element that users thought was clickable but unfortunately was not.

Figure 5: The movie suggestions section on IMDb for Monsters, Inc.


Recommendation: We recommend making more aspects of the movie page clickable and making it more obvious which areas are clickable, specifically the related movies area and the reviews section on the various movie pages. One way to demonstrate clickability would be to have images or clickable objects enlarge and show a faint outline when the user hovers over them, prompting the user to interact with that element.
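To make the hover affordance concrete, the logic could be sketched as follows. This is purely our own illustration, not IMDb code: the function name, scale factor, and outline color are all assumptions chosen to match the recommendation above, and a real page script would apply the returned values to an element's inline styles.

```typescript
// Hypothetical hover-affordance helper. Given whether an element is
// clickable and whether the pointer is currently over it, return the
// inline styles that signal clickability: a slight enlargement, a
// faint outline, and a pointer cursor. Non-clickable (or non-hovered)
// elements keep their default appearance.
interface HoverStyle {
  transform: string;
  outline: string;
  cursor: string;
}

function hoverAffordance(isClickable: boolean, isHovered: boolean): HoverStyle {
  if (isClickable && isHovered) {
    return {
      transform: "scale(1.05)",     // enlarge slightly on hover
      outline: "1px solid #cc0000", // faint red outline (assumed color)
      cursor: "pointer",            // standard clickability cue
    };
  }
  return { transform: "none", outline: "none", cursor: "default" };
}
```

Keeping this as a pure function (rather than styling elements directly) makes the affordance rule easy to apply consistently across the related movies area, the reviews section, and any other clickable region.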

Amount of Content

Severity 3

Issue: All five participants mentioned that they thought there was too much information presented on the IMDb website.
● Heuristic Evaluation Results:
○ A large amount of information is presented on any given page, forcing users to spend more time scanning through pages to find the information they want.
● Usability Study Results:
○ Participants mentioned that the homepage showed more information than was needed and that they would rather view only basic information about movies, TV shows, and actors. P5 looked through the front page extensively, noting the disorganization and the overwhelming number of hyperlinks.
○ Participants thought that certain features on the homepage were unnecessary. For example, P1 said that seeing celebrities’ birthdays wasn’t vital to the front page. P4 said they don’t usually go to the front page because they are looking for something specific when they visit IMDb.
○ Some participants stated that the addition of unnecessary information made it difficult to find the features they would normally use. Every participant used the search bar multiple times during the study for information available in other forms (such as the navigation bar). This may show that desired information is buried under unnecessary information, so participants navigate the website in the way they feel gives them the most control.
○ Some participants thought that, out of all the content available on IMDb, the information they desired was still not present. For instance, P4 mentioned that they wished they could see more details on the director, such as news of upcoming movies, rather than seeing the emphasis placed on presenting extraneous and infrequently used information. This shows that IMDb has an overwhelming amount of unnecessary content but lacks some of the content users want most.

Recommendation: We recommend running further studies to better understand the specific content that users desire on IMDb. Among other things, this would include interviews with real users. Furthermore, we recommend that IMDb track what users click on (especially on the front page) to find out what undesired information can be eliminated, as well as track what users search for to see what types of information they are actually looking for. After learning more about users in these ways, we recommend strategic elimination of unwanted and overwhelming information to make it easier for users to see what they want to see.

Secondary Findings

Information Organization

Severity 3 Issue: IMDb attempts to cater to various users through the massive amounts of information that it provides on its mobile site and website. However, IMDb lacks successful categorization of this data. Many of the current labels categorizing data on the IMDb desktop site are inconsistent or misleading, making it difficult to correctly identify the information organized underneath them.
● Heuristic Evaluation Results:
○ The hamburger menu located in the heading of every IMDb mobile page presents an unorganized and illogical set of categories. While some information is organized under “Movies” and “Television,” other information in the hamburger menu, as well as the subheadings provided, is not easy to interpret or does not seem necessary.
○ Tabs such as “Special Features” give no initial indication of what sits underneath them. Once “Special Features” is selected, a list of awards and entertainment options, such as “Scary Good,” is presented; these do not, at first, appear related or relevant to each other or to the “Special Features” heading.
○ There are many subcategories within categories on the IMDb site. For example, within the “Horror” category, there are “comedy”, “drama”, and “sci-fi” subcategories, and within the “Comedy” category, there are “action”, “horror”, and “romance” subcategories. Clicking “Comedy” at the top level and clicking the “comedy” subcategory under “Horror” produce different results. This way of labelling information can confuse and frustrate users, because the results they are trying to view may not be located under the category or subcategory they select.
○ The movie pages contain a wide range of information, from soundtracks to reviews, yet one of the categories on standard movie pages is titled “Details.” Since all of the information on a movie page consists of details about the movie, it is difficult for users to initially understand what information is located under this category.
● Usability Study Results:
○ P2 called the labels “weird,” while P3 asked for more clarification.
○ P3 did not understand the difference between the categories titled “Top Rated Movies” and “Top Rated Genre.”
○ When P1 was asked whether she agreed that section headers make content easy to find, she disagreed.
○ While looking at cast lists, P5 said she could not tell which column held the actors’ names and which held the characters’ names, because the columns were not clearly labeled.
○ P2 noted that it was hard to navigate through the tabs in the menu because some of the subcategories listed under the main tabs were unrelated to the name of the main tab.
Recommendation: The similarly titled categories and subcategories that IMDb presents may initially confuse and frustrate users, because they cannot easily tell what information is located under each category or subcategory. We recommend creating category titles and labels that accurately and clearly reflect the category content. The titles need to be distinct and easy to recognize. For example, an “Actors and Actresses” title above the column of actors’ names and images, and a “Characters” title above the column of character names, would make it easy for users to scan the cast information.
By improving this aspect and applying it wherever necessary, users will be able to easily access the information they desire with fewer delays and less frustration.

Search Functionality

Severity 3 Issue: While participants relied heavily upon the search functionality of IMDb, there were several aspects of the search functionality that participants believed were lacking.


● Usability Study Results:
○ P2 stated that the search bar made things easier for them, and they utilized it in every task.
○ P5 searched for information within the search bar, but stated that the results given were not what they expected.
○ Additionally, P5 cited that they wanted more search capabilities.
○ P3 stated that they expected the search to autofill based on the characters they typed into the search bar.
○ P1 was the only participant to utilize the advanced search function.
Recommendation: We recommend expanding the search functionality so that users can search aspects of entertainment such as genres and character names. Additionally, the search function should autofill suggestions as the user types.
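A minimal sketch of what that autofill could look like, assuming a small catalogue of titles, genres, and character names; the entries below are invented examples, not IMDb data:

```typescript
// Hypothetical autofill: suggest catalogue fields (title, genre, or
// character name) that start with what the user has typed so far.
interface Entry {
  title: string;
  genre: string;
  characters: string[];
}

function suggest(catalogue: Entry[], typed: string): string[] {
  const q = typed.trim().toLowerCase();
  if (q === "") return [];
  const matches = new Set<string>();
  for (const e of catalogue) {
    for (const field of [e.title, e.genre, ...e.characters]) {
      if (field.toLowerCase().startsWith(q)) matches.add(field);
    }
  }
  return Array.from(matches).sort();
}
```

A production system would rank suggestions by popularity rather than alphabetically, but the point is that genres and character names become searchable alongside titles.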

CTRL + F

Severity 4 Issue: When looking at full cast lists, three out of five participants did not want to look through the entire cast for a specific character when that character was not one of the top ones listed. Rather than finding the information they needed through regular scrolling and navigation, participants resorted to the “CTRL+F” function on the provided Mac computer in order to jump straight to the specific information they wanted.
● Usability Study Results:
○ Three out of five participants (P1, P3, P5) did not want to look through the entire cast list for a specific character when that character was not one of the top ones listed.
○ When looking for the character “Judy”, participants searched “Judy” with CTRL+F. This shows that users are not willing to go through the amount of content listed and instead resort to shortcuts to find specific pieces of information.
○ P4 said they do not usually go to the front page because they are looking for something specific when they visit IMDb.
○ 60% of participants listed finding cast information as one of the ways they use IMDb (P1, P3, P4).
○ For P1, finding cast information was one of only two ways they used IMDb.
Recommendation: When finding cast information, users know exactly what they are looking for; they just need the answer, and they likely want it quickly and easily. Participants utilized


CTRL+F because they were not interested in the rest of the information in the cast list, but rather the specific piece they were looking for, so they used a computer shortcut to make the task go faster. Because of this, we recommend including a search bar within cast lists, so that users who may not use CTRL+F can also quickly find the cast information they are looking for.
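The in-page cast search we recommend could be as simple as a case-insensitive substring filter over the cast rows, as in this sketch; the names here are made-up examples, not IMDb data:

```typescript
// Hypothetical in-page cast search, replacing the need for CTRL+F:
// keep only rows whose actor or character name contains the query.
interface CastRow {
  actor: string;
  character: string;
}

function filterCast(cast: CastRow[], query: string): CastRow[] {
  const q = query.trim().toLowerCase();
  if (q === "") return cast; // empty query shows the full list
  return cast.filter(
    (row) =>
      row.actor.toLowerCase().includes(q) ||
      row.character.toLowerCase().includes(q),
  );
}
```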

Expand the “People who also liked” section

Severity 4 Issue: While the “People who also liked” section is helpful for finding additional movies to watch, there is room for expansion and diversity among the recommended movies.
● Usability Study Results:
○ P1 did not like the suggestions provided in the “People who also liked” section and wanted independent film suggestions added to the selections.
○ P4 stated that they did not think the “similar” movies were similar enough to the movie page they were on.
○ Additionally, P4 expected to see other Hayao Miyazaki films, rather than Disney movies, when looking at a Miyazaki film’s page. They would have liked the recommended movies to be based on the producer and director of the film whose page they were on.
Recommendation: The “People who also liked” section should be re-tailored to include additional options that expand the diversity of recommendations based on the selected movie page.

Placement of Storyline

Severity 4 Issue: Participants want to read the storyline of movies; however, the storyline is buried under other information, forcing participants to work in order to access it.
● Usability Study Results:
○ P4 cited that they actively read and rely upon the storyline in order to determine the quality of a movie.
○ P5 suggested that the storyline of the movie should be placed higher, “so you don’t scroll to see it.”
Recommendation: The storyline section should be placed higher within the organizational hierarchy of movie pages, directly below the awards and nominations section. By doing this, the storyline will be easier to access and users will not have to do as much scrolling.


Specific Heuristic Evaluation Findings

We believe that it is important to include the full results of our study within this report. Thus, we have included this section to explain several of the findings we discovered while completing our original heuristic evaluations of the IMDb mobile website. Based on the results from our individual evaluations, we found the following major problem areas in the usability of IMDb’s mobile website:

Device Responsiveness Consistency

Severity 3 Issue: The mobile website is not responsive enough to the devices used to access it. The IMDb site interface was tested on mobile phones using mobile browsers instead of the IMDb mobile application, because press releases stated that IMDb is accessed more on phones and tablets than on desktops (Internet Movie Database [IMDb], 2012).
● Heuristic Evaluation Results:
○ Although the front page of IMDb on phone browsers is a mobile version, after a quick search the desktop site appeared, which was difficult to read and navigate on a small phone screen compared to the mobile home page. This was extremely frustrating, because the site makes it difficult for users to understand which functionalities force them onto the desktop site and how they can return to the mobile site on their phones.
○ The desktop site was built and structured for computer screens rather than mobile screens, making it hard to navigate and understand, further increasing the frustration and the feeling of being lost.
Recommendation: The lack of consistency in responsiveness across devices may frustrate users and deter them from accessing the website on certain devices. Because of this, IMDb should make more of its pages responsive to different devices, specifically smartphones. Having the mobile website retain the full functionality of the desktop website would allow users to access IMDb in a format tailored to their phones without sacrificing any features or functions of the desktop website.

Desktop vs. Mobile Site Consistency

Severity 3 Issue: The desktop and mobile versions of IMDb’s website differ greatly in the information and functions available.


● Heuristic Evaluation Results:
○ The desktop version provides users with the option to view movies by genre from the home page.
○ The mobile version of the home page, on the other hand, does not show any function for browsing by genre. Because of this, users on the mobile version have to switch to the desktop version of the website in order to see movies by genre.
Recommendation: This inconsistency can frustrate users, who must go through a tedious amount of work just to find a simple function. Similar to our recommendation in the previous section about device responsiveness, we recommend having the information hierarchy of the desktop site be reflected more accurately in the mobile site. This will reduce confusion and make content easier to find for users who switch between the desktop and mobile versions of the site.

Text Style Consistency

Severity 4 Issue: The text styles of categories and clickable pages are not consistent between the different pages of the IMDb mobile site.
● Heuristic Evaluation Results:
○ On the main page of the mobile site, some of the titles are depicted in black text. These titles do not initially appear clickable; however, they can be pressed and will take the user to an additional IMDb page concerning the information referenced in the title.
○ This is inconsistent with other pages, where clickable links are highlighted in blue text.
Recommendation: Because the lack of consistency in text style may frustrate users and leave them unsure of which features are clickable, we recommend that clickable links be clearly marked in blue. This would contrast with the plain, unclickable text, allowing links to stand out and clearly marking the pages that users are able to click through to.

Back Navigation

Severity 4 Issue: Currently, users do not have the ability to view the pages they have visited.


● Heuristic Evaluation Results:
○ This results in users getting lost, because they are unable to view the flow of their actions.
○ Once users have ended up on an unintended page, they are forced to rely upon the browser to take them back to their starting point. IMDb does not have any built-in undo, redo, or “emergency exit” functionality within its mobile site.
Recommendation: In order to correct the lost feeling that users may experience, two design changes could be implemented. First, IMDb could display the flow of the user’s actions, tracking and depicting the places the user has been. An example of this functionality can be seen in Figure 6, a screenshot of an Amazon product page.

Adding this sequence of events, known as breadcrumbs, would also create built-in back functionality, allowing users to be directed quickly to the specific page they want to move back to.

Figure 6: A history of the sequence of events on Amazon.com (Amazon, 2016).

Second, the ability to swipe backwards from one page to the previous one could be added to the IMDb mobile site. Currently, the browser must be used to go back. The ability to swipe backwards would make IMDb more intuitive, as well as providing an easy way for users to view previous pages and remove themselves from unintentionally clicked pages.
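Both recommendations, the breadcrumb trail and the built-in back behaviour, can be sketched as a small history stack. This is a simplified illustration under our own assumptions, not IMDb's implementation:

```typescript
// Hypothetical history stack backing a breadcrumb trail: every visited
// page is recorded, and the user can jump back to any earlier page.
class Breadcrumbs {
  private trail: string[] = [];

  visit(page: string): void {
    this.trail.push(page);
  }

  // Render the trail, e.g. "Home > Comedy > Arrival".
  render(): string {
    return this.trail.join(" > ");
  }

  // Jump back to a page already in the trail, discarding later entries.
  // Returns null if the page was never visited.
  backTo(page: string): string | null {
    const i = this.trail.lastIndexOf(page);
    if (i === -1) return null;
    this.trail = this.trail.slice(0, i + 1);
    return page;
  }
}
```

Rendering the trail at the top of each page gives users the visible "flow of actions" described above, and `backTo` is the built-in back function that the swipe gesture could invoke.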


Help and Documentation

Severity 4 Issue: There is one “Help & FAQ” section, but it only provides help for registering as a user and a list of general frequently asked questions.
● Heuristic Evaluation Results:
○ The content of the help page does not change to tailor itself to each page. For example, there was no help visible when we needed assistance on the movie genre page.
○ The help button located at the top right of the website only has FAQs regarding account information, which has no relevance for users who use IMDb without an account.
Recommendation: It would be more useful if the help documentation were responsive to the page the user is on and could provide specific help. For example, at the bottom of each page, IMDb could add a section asking whether users need help with the page they are on. IMDb does have a help forum specifically for asking questions and receiving answers from other helpful users. This is a nice feature, but it is not especially convenient, since it may take a while to receive responses.

Conclusions and Final Recommendations

Overall, this usability study of the IMDb website provided us with clear results about specific usability issues that could be improved in order to optimize the overall flow of the website. The main usability issues indicated by our results were: navigation to genres, clickability of certain movie page features, and amount of content. For navigation to genres, we recommend creating a clearly visible “Genres” tab for easy access, so that users will not have to go through multiple pages just to reach the genres page. For the clickability of movie page features, we recommend making it more obvious to users which features are clickable. Lastly, for the amount of content, we suggest further research to find out what kind of content people actually want to see.

We hope that IMDb will take our recommendations into consideration in order to improve the website’s navigation, the clarity of its information architecture and categorization, and its overall consistency and logical flow, encouraging users to continue using the website.


References

Amazon (2016). Modems. Retrieved from https://www.amazon.com/b/ref=s9_acss_bw_en_WT_d_1_2?_encoding=UTF8&node=284715&pf_rd_m=ATVPDKIKX0DER&pf_rd_s=merchandised-search-top-2&pf_rd_r=V6B31NCWVPM8G8BT5DST&pf_rd_r=V6B31NCWVPM8G8BT5DST&pf_rd_t=101&pf_rd_p=7e4c1778-5f33-4d51-80ac-789b566669d2&pf_rd_p=7e4c1778-5f33-4d51-80ac-789b566669d2&pf_rd_i=172504

Internet Movie Database, Press Room (2012). IMDb Mobile Pull Quotes: About IMDb’s Mobile Momentum. Retrieved from http://www.imdb.com/pressroom/mobile_pull_quote

Sanocki, L., & Zhang, T. (2016). HCDE 417 | Autumn 2016: 22 Nov 2016 [PowerPoint slides]. Retrieved from https://canvas.uw.edu/courses/1107482/files/39149967/download?wrap=1


Appendices

Appendix A: Consent Form

I agree to participate in the study conducted by the IMDb.com Usability Test Group at the University of Washington, Human Centered Design and Engineering Department.

During this study:
● I will be asked to perform certain tasks on a computer
● I will be interviewed regarding the tasks I’ve performed
● I will be recorded through audio and video during the session

I understand and consent to the use and release of the recording by IMDb.com Usability Test Group at the University of Washington, Human Centered Design and Engineering Department. I understand that the information and recording is for research purposes only and that my name and image will not be used for any other purpose. I relinquish any rights to the recording used by IMDb.com Usability Test Group at the University of Washington, Human Centered Design and Engineering Department without further permission.

Participation in this usability study is voluntary. All information and recordings will remain strictly confidential. The descriptions and findings may be used to help improve the IMDb.com website design. At no time will my name or any other identification be used. I have the ability to withdraw consent to the experiment and stop participation at any time.

Below is my signature indicating that I have read and understood the information on this form and that any questions I might have about the session have been answered.

Date: ______

Participant’s printed name: ______

Participant’s signature: ______

Thank you!

We appreciate your participation.


Appendix B: Pre-Test Questionnaire ​

1. How often do you watch movies or TV shows?
___ Only a couple times a year
___ Once a month
___ Once a week
___ Every day

2. What helps you determine what movies or TV shows to watch? Please explain.

3. Have you used IMDb before?
___ Yes
___ No


Previous IMDb user:

How often do you use IMDb?
___ Every day
___ About once a week
___ Once a month
___ A few times each year
___ Never

When was the last time you used IMDb?
___ This week
___ About a month ago
___ 2-3 months ago
___ More than 3 months ago

What do you use IMDb for?
___ Finding movies in theaters now
___ Finding movies/TV shows related to ones you like
___ Watching movie trailers
___ Looking up the cast of movies/TV shows
___ Finding trivia information about movies/TV shows
___ Reading movie reviews
___ Finding the latest news
___ Other: Please explain below

What devices or platforms do you check IMDb on?
___ Phone browser
___ Mobile application
___ Tablet
___ Laptop
___ Desktop

In what environment do you usually look at IMDb?
___ At home
___ Outside of the house (mall, school, etc.)
___ Alone
___ Around other people
___ Other: Please explain below

Do you prefer IMDb over other movie databases like or Wikipedia? Explain:


Non-IMDb user:

a. What do you think it can be used for?

b. What do you use to look at movies and entertainment?


Appendix C: Introductory Script
[Before reading Introductory Script]

❏ Welcome participant to testing area
❏ Have IMDb.com opened in a Google Chrome window on the laptop
❏ Clear browsing data
❏ Make sure recordings are prepared
❏ Separate moderator vs. ppt docs

Start time: ______

“Thank you for taking time to participate in this study, we really appreciate it.

I’m ______ and this is ______. We are students in the Human Centered Design and Engineering (HCDE) major at the University of Washington. ______ will just be taking notes today, and I will be facilitating the study.

We are studying the usability of the IMDb website. IMDb is a popular online movie database to find more information about movies and TV. During this study, we will ask you to complete 4 tasks on the website. You may take a break at any time. Please remember that we are testing the site, not you. Any difficulties you may have are because it wasn’t designed in a way that makes sense to you.

During the tasks, I can clarify what you are asked to do, but otherwise will remain silent. Try to complete the tasks as if you were at home. You can spend as little or as much time as you want on each of them. Again, we are not testing you, we are testing the site. Also, as we go along the tasks, I'm going to ask you to think out loud as much as possible. By that, we mean that we’d like you to speak your thoughts as often as you can so that we know what’s going through your mind.

If it’s okay with you, we will be taking screen, audio, and video recordings for a better understanding of behavior on the website. Please take a look at this consent form and sign if you agree with the conditions.

❏ Give participant the consent/video form to sign
❏ Once signed, start up Lookback and Skype on the laptop
❏ Connect with the note-taker’s laptop and set up Skype’s screen-sharing

Before we begin, do you have any questions?


Appendix D: Study Script
[Before reading Task Scenarios]

❏ Read Introductory Script
❏ Read Think Aloud Warmup Script

Think Aloud Warmup Script
During this study, we ask that you think aloud as you complete each task. Tell us what you like or dislike, what frustrates you, what you’re looking for, etc. To get a better idea of the level of detail in thinking aloud that we would like to hear, I have a short think aloud warmup: count how many windows are in your house or apartment, and walk me through your process. I’m not really interested in how many windows you have, but I am interested in how you go about doing this task.

● If level of detail is satisfactory, say:
○ “Great, thank you. It would be really helpful if you walked me through your thinking during the tasks with the same level of detail.”

Homepage Warmup
Okay, now we can get you started on this laptop here. On the laptop screen, you will see the homepage of the IMDb website.

Now, take a few moments to observe the IMDb homepage. Feel free to scroll around, but please do not click on anything just yet.

❏ Give the participant 2-3 minutes to explore the homepage and answer the following questions

● What do you think this website is for?
● What do you think you can do on this page?
● What strikes you about this site?

Thank you. Now I’ll proceed to ask you to try doing some tasks on this website, and I will read each task out loud. Again, as much as possible, it will help us if you can try to think out loud while you perform these tasks so that we know what you’re thinking about.

When you feel you’re done with each task, please fill out the short survey at the bottom of the task sheet. At this point, I may ask you a couple questions, or you may continue.
● Moderator: If the ppt did not actually complete the task, ask them questions to see why they thought they were done.

❏ Read task scenario out loud (from script)


❏ Allow participant to proceed until they accomplish the task or they get very frustrated
❏ After each task, participant should complete the post-task questionnaire
❏ Ask any additional questions based on your observations
❏ Repeat for each task or until time runs out

Task 1
It’s Thanksgiving night, and you and your family want to stream a comedy movie on your laptop, but cannot decide which movie to watch. Find out what kinds of comedy movies there are on IMDb. Choose one that you know you and your family will enjoy.

Task 2
On Halloween, you and your friend watched “The Conjuring 2.” You want to find another movie to watch and remember that you really enjoyed the actress who played the daughter, Judy, in the movie. Find another movie with that actress in it.

Task 3
You heard one of your friends talking excitedly about a new movie called “Arrival.” You think it sounds interesting. Try to look it up yourself to see what’s so good about it.

Task 4
Think of a movie that you recently saw. Find a movie similar to it in order to help you decide on what movie to watch next.

Thank you, that was very helpful.

Now that we’ve finished exploring the IMDb website, I have a brief questionnaire that I would like you to complete. The information that you provide is for our use only, and will not be stored with the questionnaire data.

❏ Have participant fill out Post-Test Questionnaire

Lastly, I’d just like to ask you a couple questions about your experience.

● How did you feel about the tasks overall in terms of difficulty?

● Are there any areas, specifically, where you struggled?
○ How do you think IMDb can be improved to help you with this?

● If users previously said “yes” to knowing about IMDb in the pre-test questions, we will ask:

○ How has testing IMDb changed your view of IMDb?
○ How do you feel about IMDb as a whole?
○ Has your opinion of IMDb changed now that you’ve tested out the desktop site?

● If users previously said “no” to knowing about IMDb in the pre-test questions, we will ask:
○ Now that you’ve heard about it, do you think you would use it in the future?
■ If so, how would you use it?
■ If not, why not?

● What was the navigation through the website like? Please describe your experience.

Now that we’re done, do you have any comments or questions for me about today’s session?

❏ Remind participant about the incentive for participating in this usability test (they’re entered in a drawing for a $25 Amazon gift card)
❏ Stop recording on Lookback, end Skype screen-sharing, and save the files
❏ Thank participant and escort them out
❏ Note end time


Appendix E: Homepage Warmup
Homepage Warmup - Note Taking

● What do you think this website is for?

● What do you think you can do on this page?

● What strikes you about this site?


Appendix F: Post-Test Questionnaire ​

Please indicate your level of agreement/disagreement with each of the following statements.
Statement | Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

I was able to complete the tasks without having to remember the information on previous screens.

I felt confident in my ability to understand where I was within the website.

I felt confident in my ability to locate the information I needed.

The navigation bar was helpful in completing my tasks.

The section headers made the content easy to identify.

I feel the content is organized in a way that makes sense.

How satisfied are you with the overall experience of the IMDb.com website?

1 - Not at all satisfied
2 - Slightly satisfied
3 - Moderately satisfied
4 - Very satisfied
5 - Completely satisfied


Appendix G: Post-Test Interview

After participants fill out the questionnaire, we will also ask them several questions including:

● How did you feel about the tasks overall in terms of difficulty?

● Are there any areas, specifically, where you struggled?

○ How do you think IMDb can be improved to help you with this?

Previous Users:
● If users previously said “yes” to knowing about IMDb in the pre-test questions, we will ask:
○ How has testing IMDb changed your view of IMDb?

○ How do you feel about IMDb as a whole?

○ Has your opinion of IMDb changed now that you’ve tested out the desktop site?


Non-IMDb Users:
● If users previously said “no” to knowing about IMDb in the pre-test questions, we will ask:
○ Now that you’ve heard about it, do you think you would use it in the future?

■ If so, how would you use it?

■ If not, why not?

● What was the navigation through the website like? Please describe your experience.


Appendix H: Tasks

Task 1: It’s Thanksgiving night, and you and your family want to stream a comedy movie on your laptop, but cannot decide which movie to watch. Find out what kinds of comedy movies there are on IMDb. Choose one that you know you and your family will enjoy. Once you’ve completed the task, please navigate back to the IMDb homepage.

Please indicate your level of agreement/disagreement with each of the following statements.
Statement | Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

I was able to complete the task without having to remember the information on previous screens.

I felt confident in my ability to understand where I was within the website.

I felt confident in my ability to locate the information I needed.

The navigation bar was helpful in completing my task.


Task 2: On Halloween, you and your friend watched “The Conjuring 2.” You want to find another movie to watch and remember that you really enjoyed the actress who played the daughter, Judy, in the movie. Find another movie with that actress in it. Once you’ve completed the task, please navigate back to the IMDb homepage.

Please indicate your level of agreement/disagreement with each of the following statements.
Statement | Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

I was able to complete the task without having to remember the information on previous screens.

I felt confident in my ability to understand where I was within the website.

I felt confident in my ability to locate the information I needed.

The navigation bar was helpful in completing my task.


Task 3: You heard one of your friends talking excitedly about a new movie called “Arrival.” You think it sounds interesting. Try to look it up yourself to see what’s so good about it. Once you’ve completed the task, please navigate back to the IMDb homepage.

Please indicate your level of agreement/disagreement with each of the following statements.
Statement | Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

I was able to complete the task without having to remember the information on previous screens.

I felt confident in my ability to understand where I was within the website.

I felt confident in my ability to locate the information I needed.

The navigation bar was helpful in completing my task.


Task 4: Think of a movie that you recently saw. Find a movie similar to it in order to help you decide on what movie to watch next. Once you’ve completed the task, please navigate back to the IMDb homepage.

Please indicate your level of agreement/disagreement with each of the following statements.
Statement | Strongly Disagree | Disagree | Neither Agree nor Disagree | Agree | Strongly Agree

I was able to complete the task without having to remember the information on previous screens.

I felt confident in my ability to understand where I was within the website.

I felt confident in my ability to locate the information I needed.

The navigation bar was helpful in completing my task.


Appendix I: Data-Logging Forms ​

Tasks | Successful Completion (yes/no) | Number of Hints | Number of Errors | Number of Backtracks | Number of times Search Function used | Number of Positive Comments | Number of Negative Comments

1) Find a movie

Comments

2) Find an actress

Comments

3) Find “Arrival”

Comments

4) Find a related movie

Comments
