Fakulteten för teknik och samhälle Datavetenskap

Examensarbete 15 högskolepoäng, grundnivå

On the development of an educational math game

En studie i skapandet av ett lärandespel i matematik

Sebastian Hovenberg

Ehab Okal

Examen: Kandidatexamen 180hp Handledare: Mia Persson

Huvudområde: Datavetenskap Andrabedömare: Gion Koch Svedberg

Program: Spelutveckling

Datum för slutseminarium: 2016-04-14

Abstract

There are many different ways to implement learning activities in the modern educational system as seen in schools today. But in an era that is rapidly being digitized, the educational system has arguably not managed to catch up or to implement this digitalization in a meaningful and effective way. This shortcoming has become more apparent with each passing year, with a drop in mathematical performance, as shown in the latest results of the tests carried out by PISA [1].

In light of these findings, the Swedish government set out to fund research projects in order to understand and examine the reasons behind this drop. This study, however, does not focus on the PISA results themselves. Instead, we investigate whether educational games could be a way to negate or halt this drop in performance.

In particular, the main aim of this study is to design and implement an educational game centered on solving mathematical problems in a new way not often seen in educational games, by implementing design choices mostly seen in highly developed video games.

The results of this study provide some evidence that more highly developed games could be an excellent way of learning, and possibly the next step in the evolution of the educational system seen in schools. However, our results also show that the educational part of such an educational math game has to be flexible and challenging enough for the players or students to keep them coming back to the game and learning more.

Sammanfattning

Det finns många olika sätt att skapa inlärningsaktiviteter i det moderna utbildningssystemet. Men i en tid som snabbt digitaliseras har utbildningssystemet kanske inte alltid lyckats med att implementera digitaliseringen på ett meningsfullt och effektivt sätt. Denna oförmåga att digitalisera har blivit alltmer utbredd för varje år som passerar med en minskning i matematisk prestanda hos studenterna, vilket tydligt framgår av senaste resultaten i PISA [1].

Utifrån ovannämnda observationer utarbetade den svenska regeringen ett finansierat forskningsprojekt för att förstå och undersöka orsakerna till denna nedgång i matematisk prestanda. Denna studie kommer dock inte att fokusera på PISA-resultatet utan istället kommer vi att undersöka huruvida pedagogiska spel kan vara ett möjligt sätt att hantera ovannämnda nedgång i matematiska prestanda.

Syftet med denna studie är att skapa ett pedagogiskt spel som är centrerat på att lösa matematiska problem på ett nytt sätt, närmare bestämt på ett sätt som oftast inte ses i pedagogiska spel; detta kommer att göras genom att implementera designmönster som oftast ses i högutvecklade videospel.

Resultaten av denna studie visar att det finns bevis som bekräftar att mer högutvecklade spel kan vara ett bra sätt att lära sig, och därmed eventuellt kanske nästa steg i skolsystemets utveckling. Resultatet visar dock även att den pedagogiska delen av spelet måste vara flexibel och utmanande nog för att få spelaren eller studenten att komma tillbaka till spelet och lära sig mer.

Table of Contents

1. Introduction
1.1 Background
1.2 Main aim of our study
1.3 Audience
1.4 Delimitations
2. Theoretical background
2.1 Important elements for our study
2.1.1 Previous results
2.1.2 Mechanics-Dynamics-Aesthetics (MDA) framework
2.1.3 Design patterns
2.1.4 On the relevance of aesthetics in video games
2.1.5 Dynamics
2.1.6 Mechanics
3. Method
3.1 Worldview
3.2 Design science
3.3 Problem identification
3.3.1 Research question and hypotheses
3.3.2 Literature study
3.4 Solution design
3.4.1 Artefact design and implementation
3.4.2 Choice of game engine
3.4.3 Modeling tools and software
3.4.4 Development model
3.4.5 Version handling
3.4.6 Platform
3.5 Evaluation of the artefact
3.6 Evaluation
3.6.1 Mixed methodology
3.6.2 Data collection
3.6.3 Selection of test participants
3.6.4 Data collection -- biases and pitfalls
3.6.5 Data analysis
4. Analysis, design and implementation of our game
4.1 Analysis of educational games
4.2 Setting up the Game
4.2.1 Design Pattern: Fail to learn (forgiving gameplay)
4.2.2 Design Pattern: Pseudo unlimited movement space (freer gameplay)
4.2.3 Design Pattern: Predictable Consequence nr. 1 [19]
4.3 Introduction
4.3.1 Design Pattern: Teacher NPC (or Sture)
4.3.2 Design Pattern: Sture's Movement (or privileged movement)
Design Pattern: Sture Weenie
4.4 Design Pattern: Give the player a goal
4.5 Design pattern: Pseudo choices
4.6 Design pattern: Transition
4.7 The cube solving puzzle part
4.7.1 Design Pattern: Predictable Consequence nr. 2
4.7.2 Design pattern: Filter
4.8 Challenges: Jump solving puzzle part
4.9 Variety
4.10 Missing parts
4.11 MDA framework
5. Qualitative results and analysis
5.1 Observational results and qualitative data analysis
5.1.1 Whole Game
5.1.2 The introduction part of the game
5.1.3 The cube solving puzzle part
5.1.4 The jump solving puzzle part
5.2 Survey results and qualitative data analysis
5.2.1 The test participants' perceptions of the educational part of our game
5.2.2 Would our test participants use a similar game?
5.2.3 Usually played game genre
5.2.4 How would you categorize (w.r.t. genre) our game?
6. Quantitative results and analysis
6.1 Some basic definitions
6.1.1 Arithmetic mean
6.1.2 Variance
6.1.3 Standard deviation
6.2 Sample result and analysis
6.3 Average number of hours spent playing video games per week
6.4 Completion time
6.5 How clear was the educational part of our game?
6.6 How interesting was our proposed game?
6.7 How unique was the game?
6.8 How do our test participants view other educational games versus our proposed educational game?
6.8.1 Our test participants' views of other educational games
6.8.2 Our test participants' views of our proposed educational game
6.9 Inferential statistics
6.9.1 Inferential statistics: age versus completion time
6.9.2 Inferential statistics: average time spent versus completion time
7. Convergent analysis
8. Discussion
9. Conclusion and future studies
References
Appendix A
Appendix B

1. Introduction

1.1 Background

Of the 64 countries with trend data from PISA between 2003 and 2012, 25 countries improved in mathematical performance, while Sweden is one of the countries whose performance dropped during this comparison [1]. In light of these results, the Swedish government set out to fund research projects in order to find out the reasons for this drop.

In light of the aforementioned issue, the idea was born to investigate whether it is possible to create an educational math game that could serve as a helping tool for students. There are already thousands of learning games available for all kinds of platforms, yet almost none of them are suitable for free-form use in education. Moreover, almost every learning game is designed for a specific topic, mostly math and language. There are, however, some entertaining games that combine stories and adventure with the gameplay, which has been shown to drive players to want to learn more [2, 4, 7]. But these games are still very restricted and can only be used in a few areas.

In an earlier study, Rowe et al. [6] created an educational game called Crystal Island, focused on narrative-centered learning environments and designed to help students learn more about biology. By using both pre- and post-tests, the research team could see whether the students had learned something by playing the game. The findings of the study show that designing broadly effective gameplay activities for narrative-centered learning environments, such as investigations, being able to talk to non-playable characters (NPCs) and trying out science equipment, contributes to an effective problem-solving mentality and greatly improves the learning outcome and sustained engagement of the students.

Moreover, in [3] Gee argues that a good game does not have to be explicitly educational to work as an educational game. Instead, the game only has to be self-explanatory in such a way that the player can understand the fundamentals of how the game is played, and can thus acquire the skills and knowledge needed to complete progressively harder tasks [3, 4].

In another study (see [7]), Gee argues that it is possible to create educational games that could benefit schools, and that the first and foremost contribution video games can make here is that they can be more forgiving when a player fails to complete a task. It is much quicker to restart a video game and retry until it is completed than to hand in an essay or a test that could take up to a week to be graded. Or, as Gee formulates the conclusion in [7]: "The cutting edge is realizing the potential of games for learning by building good games into good learning systems in and out of classrooms and by building the good learning principles in good games into learning in and out of school whether or not a video game is present."


1.2 Main aim of our study

In this study, we investigate whether educational math games could be a solution for negating the aforementioned drop in the Swedish PISA results.

In particular, the main aim of this study is to identify and obtain a deeper understanding of appropriate design choices while creating an educational math computer game, and how our choices here actually impact a player's overall satisfaction and interest to continue playing the game.

Specifically, in this study we will seek the answer to the following question:

- Is it possible to develop an educational math computer game that is both satisfying and interesting for the player to play?

Note that from the aforementioned research question, it follows that one of the objectives of our study will be to propose suitable design patterns for creating a game that is satisfying, interesting and educational at the same time for the player.

In particular, note that in order to achieve the aforementioned aim, we will need to develop a new educational math game, implementing appropriate design patterns that are well known for facilitating the educational, satisfying and interesting aspects of a game, and then, as a next step, evaluate whether our proposed game actually fulfills the aforementioned so-called non-functional requirements. During this evaluation phase, we will collect both qualitative and quantitative data from observations of how the educational elements in our proposed game were perceived by a set of test participants, and how they affected their overall judgment with respect to the educational, satisfaction and interest aspects of the game. We will collect the aforementioned data by observing and recording the behavior of the participants testing our new educational math game, and also by collecting additional data in the form of a survey.

1.3 Audience

The intended audience for this study is video game developers and video game researchers, but also researchers in education who are interested in the implementation of video games as a possible learning tool in classrooms. Our intention is that the reader of this study should be able to understand the structure, methods, models and frameworks used for the design of the research process, how and why the game was implemented in a certain fashion, and finally how the results of the study were collected and evaluated, in order to integrate these findings into future work and iterations of the created educational game.


1.4 Delimitations

Our objective is not to go through all kinds of game designs, but rather to propose one possible design promoting the aforementioned aspects in an educational math game. Also, our proposed educational game is restricted to PC, since the possible restrictions in game controls on other platforms would not allow our game to be played as we intended (this will be discussed in detail in upcoming chapters).

Worth mentioning here is also that we are not investigating any gains in mathematical proficiency among our test participants, since the main focus of our study is to investigate how our test participants reacted to and perceived the design choices and patterns of our proposed educational game.


2. Theoretical background

2.1 Important elements for our study

Since our study is aimed both at video game developers and at teachers who are interested either in the game development process or in the results generated from the evaluation of our proposed educational game, it is important to address the elements and frameworks that were used to structure our study and the educational game developed herein. In the rest of this chapter, we describe three elements that are particularly important for the rest of our study.

2.1.1 Previous results

The first important element used in our study is a set of related research results by Gee [4, 7, 20]. In particular, Gee has dedicated a lot of research to establishing a design framework that can be used in video game development to enhance the learning experience of a game. Moreover, Gee also argues that video games can be a useful tool supplementing the traditional educational system. In our study, we will use two earlier research studies by Gee: one that addresses six reasons why video games are good for learning [7], and another that provides 16 guidelines recommended to follow when creating an educational game [4].

2.1.2 Mechanics-Dynamics-Aesthetics (MDA) framework

The second important element used in our study is the Mechanics-Dynamics-Aesthetics framework (MDA framework for short), described as follows in [16]: "MDA is a formal approach to understanding games - one which attempts to bridge the gap between game design and development, game criticism and technical game research."

The MDA framework was developed at the Game Design and Tuning Workshop at the Game Developers Conference in San Jose 2001-2004, and the main aim of this methodology is to have a more formal approach to the subject of game research [16]. In particular, the purpose of this methodology is to clarify and strengthen the iterative processes for developers, scholars and researchers alike, which allows the studying, decomposing and finally designing of a broad array of game designs and game artifacts [16]. MDA is an abbreviation for:

- Mechanics: The components that stand for data representation and algorithms. For example, pressing the space bar results in the character jumping a certain height and landing back on the ground through an algorithm that simulates gravity (a small code sketch of this example is given after this list).

- Dynamics: The run-time behavior of the mechanics acting on the player's inputs and the outputs they generate over time. For example, when the player presses the shoot button, the game character fires the gun.

- Aesthetics: The emotional responses the game evokes in the player. For example, game scenery such as a lush forest or a post-apocalyptic wasteland evokes an emotional response.
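To make the distinction between mechanics and dynamics concrete, the following is a minimal Unity C# sketch of the jump example above. It is purely illustrative and not taken from our actual game code: the mechanic is the rule "space bar adds upward velocity, gravity pulls the character back down", while the jump arc the player sees at run time is the resulting dynamic.

```csharp
using UnityEngine;

// Minimal illustration of the jump mechanic described above (hypothetical,
// simplified; the real game relies on Unity's physics and a full controller).
public class SimpleJump : MonoBehaviour
{
    public float jumpSpeed = 5f;   // upward speed applied when jumping
    public float gravity = 9.81f;  // constant downward acceleration

    private float verticalSpeed;   // current vertical velocity
    private bool grounded = true;

    void Update()
    {
        // Mechanic: pressing space while on the ground starts a jump.
        if (grounded && Input.GetKeyDown(KeyCode.Space))
        {
            verticalSpeed = jumpSpeed;
            grounded = false;
        }

        // Dynamic: gravity acts over time, producing the visible jump arc.
        if (!grounded)
        {
            verticalSpeed -= gravity * Time.deltaTime;
            transform.position += Vector3.up * verticalSpeed * Time.deltaTime;

            if (transform.position.y <= 0f) // landed back on the "ground"
            {
                transform.position = new Vector3(transform.position.x, 0f, transform.position.z);
                verticalSpeed = 0f;
                grounded = true;
            }
        }
    }
}
```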


2.1.3 Design patterns

The third important element used in this study is the implementation of design patterns [19], a formal approach used for developing and testing different game designs during the development phase of a video game. Design patterns can be described as follows [19]: "Patterns are simple collections of reusable solutions to solve recurring problems."

Design patterns are traditionally expressions of problem-oriented thinking, with each pattern being described as a problem which occurs often in a certain environment. The aim of design patterns is to find a solution to the described problem in a way that can be reapplied “millions of times”, without ever doing it the same way twice [19].

The reason behind using design patterns is to establish and provide the development team with a shared design vocabulary, which allows for [19]:

- Better communication among the development team.

- Documentation of the insights of the development team, and organization of individual experience as written knowledge.

- The analysis of the design choices of the development team, as well as the design choices of others, with the purpose of conducting comparative criticism, re-engineering or maintenance.

The documentation of design patterns is usually performed by using a template containing the following four essential elements [19]:

1. Name: In order to make the shared vocabulary among the development team easy to understand, it is important to name the created pattern.

2. Problem: A description of the problem, including the inherent trade-offs and the context in which the problem occurs.

3. Solution: A description of the solution to the problem, and the general arrangement of entities and mechanisms that can be used to solve the problem.

4. Consequences: Each solution has its own trade-offs and consequences, with some solutions either causing other problems or amplifying existing ones. Therefore, it is important to balance the costs and benefits, and also to compare alternative solutions, before making a design decision.

In this study, design patterns promoting the aspects satisfying and interesting in the context of video games are of particular interest to us. Note that the definitions of the aforementioned aspects satisfying and interesting for this study are derived from a series of design patterns [19] commonly seen in well-renowned and highly developed video games, and from the 16 design aspects listed in James Paul Gee's article Good Video Games and Good Learning [4]. Additionally, the MDA framework [16] will also be used to define the aforementioned terms, since this framework is used to develop and improve video games in a way that is both satisfying and interesting for the player playing the game.


The main focus of the aforementioned design patterns is to catch the attention of the players and drive them to continue playing the game until completion (note the relation to the aforementioned design aspects satisfaction and interest).

The following design patterns are well known for promoting the aforementioned aspects [16]:

- Well implemented and meaningful game mechanics are considered to be the most important design pattern when developing video games, since video games as a medium are centered on the interaction between the player and the game. Therefore, it is important to convey a meaning behind each action.

- Interesting and satisfying game story without any plot holes. Video games are in many ways just like books. They have a beginning, protagonist, antagonist, clash of interests, transitions and an ending, but unlike the more passive nature of books (as described by Plato in Phaedrus [4, 45]), games reflect the actions and decisions of the players and thus allow the player to become both the reader and the writer of the story.

- Immersive and eye-catching game world. This design pattern leans toward the aesthetic part of video game development and plays a significant role in setting the scene for the story of the game. It is the style in which the game projects itself to the player, e.g., 3D or pixel graphics.

- Clever and intuitive gameplay is closely linked to game mechanics but instead addresses the design of the puzzle elements and their solutions.

For a detailed description on how we actually deploy design patterns in our study, see chapter 3.

2.1.4 On the relevance of aesthetics in video games

The aesthetics in video games is an important aspect when it comes to developing a game that is considered "fun" by the end user [16]. But instead of using the word "fun" in our study, we will use the more directed vocabulary from [16]:

1. Sensation: Game as sense-pleasure
2. Fantasy: Game as make-believe
3. Narrative: Game as drama
4. Challenge: Game as obstacle course
5. Fellowship: Game/companionship as social framework
6. Discovery: Game as uncharted territory
7. Expression: Game as self-discovery
8. Submission: Game as pastime

The educational game in this study will build on Sensation, Fantasy, Discovery, Companionship and Challenge; see section 2.1.5 below for a more detailed explanation of these keywords. Note that the aforementioned keywords will be used as a compass to define our gameplay and to describe the dynamics and mechanics aspects of our gameplay.

2.1.5 Dynamics

The design choices for the educational game need to be evaluated in order to establish their usefulness and their fit with the general vision and design plan of the game; the dynamics work to create the aesthetic experiences of the game. Therefore, we will continuously evaluate our developed artefacts against the following aesthetics:

- Sensation: The theme of the game world needs to catch the player's interest directly, which is why the design of the game world will focus on immersing the player in a world filled with mysteries and interesting visual effects that serve as eye-catchers [16].

- Fantasy: In order to supplement the sensation aspect of the gameplay, the setting and design of the game will be placed in a fantasy world. This allows more freedom, since the game world does not need to adhere to a certain set of real-world rules and laws.

- Discovery: One way to catch the player's interest is to create an environment that encourages the player to explore and discover a mysterious and uncharted game world [16]. Our assumption here, which will later be evaluated, is that this will drive the player to continue playing the game and not walk away after realizing that it is a math game.

- Companionship: This is just like Fellowship, but instead of interacting with fellow players in the game world, the game will have a companion that follows and helps the player.

- Challenge: The challenge aspect of the game will be the mathematical questions that need to be solved in order to progress in the game. The math problems will not be solved in the traditional fashion, i.e., by inputting numbers directly. Instead they will be solved in a puzzle fashion, by lifting or moving cubes with numbers on them, in order to make the gameplay a little more interesting (a small code sketch of this idea is given after this list).
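As an illustration of the cube-based challenge, the following is a minimal, hypothetical Unity C# sketch; class and field names such as NumberCube and targetSum are our own illustrative choices and not the actual implementation. The idea is that each cube carries a numeric value, and a puzzle counts as solved when the cubes placed in a goal zone add up to a target number.

```csharp
using System.Collections.Generic;
using System.Linq;
using UnityEngine;

// Hypothetical sketch: a cube carrying a numeric value.
public class NumberCube : MonoBehaviour
{
    public int value = 3; // the number printed on the cube
}

// Hypothetical sketch: a goal zone (trigger collider) that checks whether
// the cubes currently placed inside it sum up to the required answer.
public class CubePuzzle : MonoBehaviour
{
    public int targetSum = 7; // the answer to the math problem

    private readonly List<NumberCube> cubesInZone = new List<NumberCube>();

    void OnTriggerEnter(Collider other)
    {
        var cube = other.GetComponent<NumberCube>();
        if (cube != null)
        {
            cubesInZone.Add(cube);
            CheckSolution();
        }
    }

    void OnTriggerExit(Collider other)
    {
        var cube = other.GetComponent<NumberCube>();
        if (cube != null) cubesInZone.Remove(cube);
    }

    void CheckSolution()
    {
        // The puzzle is solved when the cube values add up to the target.
        if (cubesInZone.Sum(c => c.value) == targetSum)
        {
            Debug.Log("Puzzle solved - open the path to the next platform.");
        }
    }
}
```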


2.1.6 Mechanics

The mechanics of the educational game are its various actions, behaviors and control mechanisms. The mechanics available to the player will be the standard control functions seen in nearly all modern first-person shooters, i.e., the ability to move around, jump, and interact with game objects by lifting or pushing them. The controls are based on a first-person perspective in a 3D world, using a keyboard and mouse, just as in the games that inspired this one.


3. Method

Our study will rely on the following elements and methods/approaches:

- Worldview
- Design science
- Mixed methods methodology

3.1 Worldview

Worldview (or philosophical foundation) has been described as "a basic set of beliefs that guide action" [21]. In particular, it is how the researcher and author view the world around them, and also how they react to and perceive the different elements in that world. In essence, these views draw from a mixture of different assumptions and variables such as predetermined facts, past experience, academic standards and ideals [14]. Many different views have been formed to better understand and address the different problems at hand, and in our study we will deploy the well-known, so-called pragmatic worldview, which can be described as follows:

“Instead of focusing on the methods, researchers emphasize the research problem and use all approaches available to understand the problem (Rossman et al. [13]).”

Furthermore, the pragmatic worldview is generally used to address “The consequences of a series of actions, problem-centred or real-world practice oriented” [14].

Based on our analysis and planning for this study, we realized that the aforementioned pragmatic worldview was the most suitable option for our study since, as mentioned above, it allows the mixture of both quantitative and qualitative methods, conducted in either a sequential or a concurrent manner [13]. This gives us the freedom to choose any method, technique or procedure of research that best suits the needs and purpose of the study. In other words, we will be able to focus on the mixture of methods that provides the best understanding needed to investigate and answer the research question under consideration [14].

However, there are some disadvantages with using a pragmatic worldview. One is how to come up with a reasonable mixture of quantitative and qualitative assumptions, because of the differences in their data collection, interpretation and conclusions. Another problem arises when trying to connect quantitative data, which is numeric, with qualitative data obtained from observations or interviews, and still manage to develop an acceptable analysis and interpretation of the combined data. Moreover, this view is open-ended, which can become an issue when planning the research [14].

3.2 Design science

The approach and structure of our study will be based on design science [12] for information systems, with an MDA view or lens [16] for evaluating our design choices during the development of our educational game. Furthermore, in order to collect and analyze the necessary test data obtained from the evaluation of our proposed game, a concurrent mixed methodology [14] will be employed.

Van Aken et al. describe design science research as follows in [9]:

"Design science research can be defined as research based on the approach of the design sciences, that is, research that develops a valid general knowledge to solve field problem."

Design science research (see, e.g., [9, 12, 13, 25, 26]) is a problem-solving process, often used in the development and improvement of information systems, with its framework mainly used for conducting research in areas such as engineering and computing [13]. In particular, the design science framework focuses on the relevance of the research problem under consideration and its subsequent contribution to the related subject area, since the main outcome of this framework is to forward the development (or improvement) of a certain information system or a process related to a technical domain [25]. Since the goal of design science is utility, and since our study is tailored towards an information system such as a video game, this approach is considered suitable for our study.

In [26], Peffers et al. were the first to provide the following general guidelines for design science practitioners:

1. Design of artifact
2. Problem relevance
3. Design evaluation
4. Research contributions
5. Research rigor
6. Design as a search process
7. Communication of research

Building on the aforementioned guidelines, Offermann et al. [25] further defined and evolved the design science process into two work packages, one intended for publication and one self-contained; together they are summarized in the following three subparts: problem identification, solution design, and evaluation.

In our study, we will deploy the aforementioned recommended structure by Offermann et al. [25], together with a specialized publication work package with the MDA framework and design patterns.


See Fig. 1 below for a summary of our deployed research method.

Figure 1: The specialized publication work package with MDA framework and design patterns, which summarizes and illustrates the deployed research method of our study.

3.3 Problem identification

3.3.1 Research question and hypotheses

As stated above, the research question of our study can be formulated as follows:

- Is it possible to develop an educational math computer game that is both satisfying and interesting for the player to play?


From the aforementioned research question, we derive the following hypotheses:

- Is there a relationship between the average time spent playing video games per week and the completion time of our game? (Relationship)

Note that the aim of this hypothesis is to investigate whether our developed educational game requires any past gaming knowledge in order to understand how to play it. The intention of our developed game is to be playable without any gaming experience.

- Does the age of a participant play a role in the overall performance and ability to complete the game? (Relationship)

Note that the aim of this hypothesis is quite similar to the previous one, but here we investigate whether the age of the test participant plays a role. This is in order, as mentioned above, to investigate whether our developed game is playable regardless of age.

- Have we managed to create a demonstration game that takes approximately 15-25 minutes to complete?

Since a testing/presentation session generally lasts between 15 and 25 minutes, the aim of this hypothesis is to investigate whether it is possible for us to develop such a game demo while fulfilling all the aforementioned constraints (i.e., a satisfying and interesting educational math game).

- How did the participants experience our game in comparison with their past experiences of other educational games? (Relationship)

Note that this hypothesis is included in order to investigate our test participants' experiences with our proposed game, in relation to their past experiences with educational games.

- Did the choice of theme, design and gameplay style make our game unique and interesting for the players? (Supporting the research question.)

Note that the main aim of this hypothesis is to investigate whether our selected design patterns actually contribute to an interesting and satisfying educational math game. Also note that since our game is developed in a fashion akin to the games that inspired it, it is important for us to establish that our proposed game is viewed as a unique contribution alongside existing games, even though it employs similar game mechanics.

Furthermore, it is also important to note that the answers to the aforementioned hypotheses will provide us with valuable information on whether the age and earlier gaming experience of our test participants have an impact on their performance in, and experience of, our proposed game. Knowledge about the general composition of our test sample, and about any relationships between certain factors within our study, is important when it comes to handling potential error sources in our obtained primary data.

3.3.2 Literature study

The main aim of our literature study is to provide us with an in-depth insight into previous research results related to our posed research question (see section 3.3.1).

Our literature study was conducted as follows. We started by searching for research studies relevant to the subject under consideration, using the following online databases: Google Scholar, the ACM Digital Library, and IEEE Xplore. We used the following search terms: games, game and math, mathematical games, design pattern, educational games, learning, students, and digital education. Then, by skimming the abstract, introduction and discussion sections of the retrieved articles, we decided whether an article was relevant for our research study. If so, we continued from the reference list of the article in order to obtain more information about the subject the study was addressing.

In particular, our literature study also provided us with several articles addressing design science [9, 13, 25, 26], hence providing us with valuable guidelines on how to structure the development process of an information system (such as a video game) in a research study.

Moreover, in order to develop an educational game that is perceived as both satisfying and interesting by a player, articles addressing the development (and improvement) of design patterns were selected for further investigation. In particular, design patterns [19] and the MDA framework [16] were used for our study. Moreover, several articles by James Paul Gee (see, e.g., [3, 4, 7]) were studied to understand why and how video games should be used as an educational helping tool for students.

Finally, the books Research Design: Qualitative, Quantitative and Mixed Methods Approaches and Educational Research: Planning, Conducting, and Evaluating Quantitative and Qualitative Research by Creswell [14] provide us with an invaluable framework for our study, since they cover every step from planning and research design, to how to collect the necessary data, how to link together and interpret the collected data, and finally how to conclude and validate our findings.

See our list of references for a complete list of the related research studies and books we found. In summary, our literature study provided us with several related research studies confirming the relevance of our posed research question. Moreover, it also provided us with a solid understanding and background (or, more specifically, a framework) on the subject of developing, testing and evaluating educational math games.


3.4 Solution design

3.4.1 Artefact design and implementation

The artefact design, i.e., the design of our educational game, was decided based on the information and ideas obtained during our aforementioned literature study, but also on our earlier experience of game design and game development. In particular, when it comes to the overall design of our proposed educational game, we drew inspiration from three well-renowned games, namely Portal [27], Anti-Chamber [28] and Amnesia [29]. Very early in the planning phase of our study, we decided that our proposed educational game would be inspired by and designed in the same fashion as the aforementioned games. The first reason is that all three are so-called puzzle games, i.e., their gameplay mechanics revolve around a certain set of puzzle-related actions that need to be used in order to advance further in the games. Another reason is that all three games are played from a first-person perspective in a 3D world space, which requires a higher degree of production quality compared to regular educational games, which are usually based on flash animation or still pictures. Moreover, all three games were developed using specialized in-house versions of the MDA framework (which will also be deployed in our study), and last but not least, all three games have an interesting and gripping story, which is one of the main reasons why they are considered to be good games by players and video game reviewers alike [46, 47, 48, 49].

In our opinion, the most spectacular aspect of the three aforementioned games is the way their development teams managed to implement design choices and patterns promoting our two desired aspects of a game, namely that it is interesting and satisfying. In particular, the design patterns relating to the puzzle elements, the control mechanics and the way the story of the game is told are of interest in our study. Therefore, several design patterns inspired by these three games will be implemented during the development of our educational game; see chapter 4 for a more detailed discussion.

The implementation of the artefact of our study consists of creating an educational game by following the important elements (e.g., design patterns) mentioned in chapter 2, together with the 16 guidelines set by Gee in [4]. Finally, in order to fine-tune and improve our developed educational game as a whole, the MDA framework will also be used. The implementation of the artefact and its design choices and patterns will be discussed in further detail in upcoming chapters of this study.

3.4.2 Choice of game engine

The aforementioned three games (i.e., Portal [27], Anti-Chamber [28] and Amnesia [29]) were all developed with the help of so-called game engines. A game engine is a software framework that enables the effective creation of video games and other related media. There are many different game engines used in the video game industry, and the choice of game engine is usually based on the type/genre of the game, the cost and availability of the engine and, finally, the past experience of the development team.

The game in our study will be developed using the game engine Unity [30]. The main reason for choosing Unity is our previous experience working with this engine, but also its use of C# for writing scripts. Moreover, its versatility and ease of use compared to other game engines (such as, for example, the Unreal Development Kit [31]) were also important for our choice. Another advantage of Unity is the cost and availability of the engine: Unity is free to download and use if the developed game earns less than $100,000. If the game earns more than this sum, the development team has to subscribe to a monthly service (Unity Professional edition [30], at a cost of $75/month).

3.4.3 Modeling tools and software

Since the Unity game engine only provides the framework for programming the logic and rules of the game, additional software is needed to develop (or, in this case, model) the 3D models used in our game. The software used to model the 3D world and its contents in our study was 3D Studio Max [32] and Blender [33]. Moreover, for coloring (or texturing) the 3D models, Photoshop was used together with Quxiel [34], an add-on for Photoshop that makes texturing easier.

3.4.4 Development model

When developing any software, video game or other application, it is important to structure the development process around a software development model. The model used for developing the educational game in our study is the well-known agile development model [41], combined with some Extreme Programming (XP) [41] practices. The choice of the aforementioned models was based on our need for rapid implementation and testing of different designs for our proposed educational game.

3.4.5 Version handling

In our study, we used Dropbox [35] for storing and version handling of our educational game and its related assets. Dropbox is a cloud storage service that allows the uploading and sharing of program files and other file types between different users, thus allowing us to work on the game in parallel: for example, while one of us programmed the scripts, the other modeled the 3D objects for the game.


3.4.6 Platform

Our proposed educational game was developed for PC/Mac, mostly because of the planned control scheme, which relies on a keyboard and mouse, but also because PCs/Macs have better computational and graphical performance than smartphones.

3.5 Evaluation of the artefact

During the development process of our educational game, we will continuously test and evaluate our design choices. In particular, this will be conducted in-house (i.e., at the alpha stage). Note that this early and continuous involvement of the end user (in our case, our selected test participants) in the development process is in line with modern agile development methods. There are several well-known benefits of such early involvement of the end users; e.g., it provides us with invaluable feedback in the process of developing a game that better fits our end users' wants and needs for an interesting and satisfying educational math game.

3.6 Evaluation

3.6.1 Mixed methodology

There are many different types of methods and design frameworks for conducting a research study. The method of choice is generally based on the specific type of research problem and the kind of data it relies on, but also on how the researchers decide to conduct their study and on the expected outcome.

There are several different method designs at hand. One example is the performance-centered study, which relies mostly on numeric, quantitative data in order to analyze and understand whether a series of changes affected the performance of a certain subject. Another well-known method design relies instead on observations and interviews (i.e., qualitative data) in order to find relationships between a subject's or participant's behavior and their historical and social background.

However, some studies require both a quantitative and a qualitative approach, and the researchers then need to apply a so-called mixed methods research design [14, 24] in order to have a distinct framework for addressing the needs and requirements of both qualitative and quantitative research. Since a mixed methodology relies on both quantitative and qualitative research methods, a study applying this approach must also state the order in which the quantitative and qualitative data are collected: for example, in a sequential order (i.e., first a qualitative data collection and later a quantitative one, or vice versa), in a transformative order (i.e., first collecting either qualitative or quantitative data as the basis for a secondary data collection of the opposite kind), or, lastly, with both qualitative and quantitative data collected at the same time (also denoted concurrent data collection) [14, 24].


Since our study relies on both qualitative and quantitative assumptions in order to answer our aforementioned research question (i.e., whether our design choices impacted our test participants' overall satisfaction and interest in continuing to play our educational math game), a mixed methods approach will be implemented in our study. Regarding the emphasis of the study, it will be divided equally between the two approaches.

3.6.2 Data collection

In our study, a traditional behavioral science research approach (see, e.g., [8, 14, 24]) will be used to collect the necessary primary data. The primary data will be collected in a concurrent fashion, first by silently observing our test participants playing our proposed educational math game (assisting them only when they get stuck). Furthermore, in order to obtain first-hand qualitative data, our test participants were encouraged to think aloud, i.e., to speak their thoughts while playing our game. After completing the game, our test participants were also asked to fill out a survey (see Appendix A) based on a mixture of qualitative and quantitative questions about their enjoyment, educational level and perception of our proposed game. Finally, a short interview, based on the aforementioned observational data, was conducted with a selected group of our test participants. The main aim of the interview is to flesh out our test participants' experience of our proposed game and in this way gain a better understanding of their thoughts and impressions of the game they had just played.

3.6.3 Selection of test participants

As mentioned above, the evaluation of our proposed educational game will be performed by a selected group of test participants, who simulate the end users of our proposed educational game. Since the intended sample group for our study was elementary school students between 8 and 12 years old, we contacted some local schools and presented the idea of our study. However, due to complications in arranging such testing sessions with the local schools (owing to school regulations), we instead had to include test participants from outside our intended main end-user group. Specifically, we set up an inquiry about participating in our study, and from this we ended up with a group of 14 test participants between 18 and 48 years old, consisting mostly of family members and close friends (mostly video game development students and LAN gamers).


3.6.4 Data collection -- biases and pitfalls

There are some important aspects that must be taken into consideration while conducting the data collection process. First, possible biases that our test participants may have while testing the game and answering the questions should be minimized. For example, a high score in a test case does not necessarily mean that a test participant enjoyed our game for the intended reasons; they may have rated it highly for other reasons.

Another important aspect is the gender diversity of our data sample. Specifically, there is a lack of female participants in our study, which unfortunately diminishes the validity of the test data since the female perspective and feedback are unknown for the moment. The lack of female participants is mainly attributed to the low number of women in the video game development programme, or to the fact that we simply missed them while conducting the data collection.

Last, but not least, the qualitative data collection was conducted in Swedish, and the obtained data was later translated into English by us. This is important to consider, since language differences may have consequences; e.g., concepts in one language may be understood differently in another. Hence, in order to translate the qualitative data correctly (i.e., without losing meaning in the process), we followed the recommendations set out by van Nes et al. in [15], who discuss the challenges of such language differences in qualitative research studies.

3.6.5 Data analysis

The analysis of the collected data in our study is conducted using a convergent concurrent design [37, 50], an analysis process that first analyzes the quantitative and qualitative datasets separately in order to compare or relate the results and interpret whether they support or contradict each other. In our study, the convergence is performed by quantifying the qualitative data, i.e., translating textual information into numerical data so that it can be compared with the quantitative dataset. The quantification of the qualitative data was performed with the help of MAXQDA 12 [38], a qualitative data analysis software used to assign codes to the textual information in order to later analyze the frequency of each code in the collected dataset. Furthermore, the statistical analysis of the quantitative dataset was performed with the help of IBM SPSS [39], a quantitative data analysis software used to analyze the relationships between different variables in a dataset. Finally, the two datasets of our study will also be directly compared with each other, by supporting their results with statistical tendencies or qualitative themes. (A small sketch of the quantification step is given below.)
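The following is a purely illustrative C# sketch of what "quantifying" qualitative data means in practice; in the study itself this step was performed in MAXQDA, not in custom code, and the code labels below are invented for the example. Coded observation segments are grouped and counted, turning textual codes into frequencies that can then be related to the quantitative survey data.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative only: counting how often each qualitative code occurs.
class CodeFrequencyExample
{
    static void Main()
    {
        // Hypothetical coded observation segments (labels are invented).
        var codedSegments = new List<string>
        {
            "confused_by_controls", "enjoyed_puzzle", "enjoyed_puzzle",
            "asked_for_help", "enjoyed_puzzle", "confused_by_controls"
        };

        // Quantification: translate the textual codes into frequencies,
        // which can then be compared with the quantitative dataset.
        var frequencies = codedSegments
            .GroupBy(code => code)
            .OrderByDescending(group => group.Count());

        foreach (var group in frequencies)
            Console.WriteLine($"{group.Key}: {group.Count()}");
    }
}
```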

See Fig. 2 below for a summary of our convergent concurrent design.


Figure 2: Visual presentation of the convergent concurrent design used in our study.


4. Analysis, design and implementation of our game

In this chapter, we describe our initial analysis of the educational math game developed in our study. We also break down the design of the developed educational game into a set of conceptual entities, where each component was implemented by deploying one or more well-known design patterns. Each of the aforementioned design patterns is well known for facilitating one or more attributes; in particular, we explain why the implemented design patterns are necessary to make the educational game interesting and satisfying. Moreover, in order to provide a general picture of our proposed game design, we also describe how the aforementioned patterns work as a whole.

4.1 Analysis of educational games

There are many kinds of educational games on the market today, but very few of them have succeeded in being perceived by the player as both interesting and satisfying. One possible reason for this is the way these games are developed, with restrictive design choices, a lack of replayability, and dull and uninspiring game worlds and stories [6].

Therefore, we decided to break the norm of how these games are normally developed, i.e., as pure puzzle elements, quizzes, point-and-click games, etc. Instead, we focus on creating a game that uses the standard first-person perspective seen in modern first-person shooter games (FPS for short), such as Counter-Strike [39], in combination with simulation gameplay to enhance the educational aspect of the game.

Specifically, to identify the “how” to create an educational game we chose to go through Gee’s 16 game mechanical principles on what makes a game “A good video game” for educational purposes [4]. A good game for learning, according to Gee, is a simulation game that will reflect the area it affects. Here, the players will be able to explore, test, examine and experiment just to get a hang of what they can do and what they cannot do in the game. However, note that creating a simulation game does not mean that it is relevant by its own unless it is constructed in a user-friendly fashion which enables a coherent interaction between the player and the game.

Furthermore, one of the key points when developing a game is to give the players a clue about the role they have in the game, i.e., who they are and what they can do to influence an outcome in the game [4]. In particular, we found that the theme of the game is a very important and quite critical part, since it affects the player's first impression of the game. Knowing what the game is about and how it should be played, the game should have an appropriate corresponding theme; however, this theme should not be too revealing, or it will ruin some of the undiscovered excitement of the game.

Rai and Beck argue that instead of creating a learning game, it can be helpful to create a learning environment with game-like elements, as learning games have had difficulty keeping students' interest for longer periods, which limits their long-term learning [5].

As discussed above, everything that makes up a so-called identity is also what makes a game. Hence, the structure of our developed game will combine a presented way of solving a puzzle (on demand), through a narrative environment that provides the information needed to solve the puzzle, with the freedom for the players to try to solve it in their own way (just in time). This conveys a degree of freedom, which is a key factor in making the gameplay of our developed game more interesting, since it involves an element of exploration. Gee summarizes the above discussion as follows:

“Good games give information on demand and just in time, not out of the contexts of actual use or apart from people's purposes and goals, something that happens too often in schools.” [3]

Note that while the design pattern process can be used to evaluate the implemented design choices of the developed game, the MDA framework can be used to improve and fine-tune the developed game as a whole.

4.2 Setting up the Game

We started by brainstorming a theme that would fit the game, with one important rule: the theme should not mimic a school environment. Instead, the theme needs to be more interesting and eye-catching in order to keep the player (student) coming back for more; and the more they come back, the more they will eventually learn.

The theme of this game is inspired by the scientific theory behind the Big Bang [42]. The motivation for picking this theme is its mystery and scientific value. The Big Bang theme also serves as the premise for the story of our developed game, with the story revolving around what caused the Big Bang to happen. Answering this question is the goal presented to the player, who has to explore, examine and solve mathematical puzzles in order to find out what could have caused the Big Bang.

In addition, we believe our gameplay needs to be "freer" and more "forgiving" than several earlier attempts, in order to encourage the player to experiment and explore more, and hence learn from their mistakes without being punished (as a failed exam would be). In order to achieve this gameplay, the following design patterns are implemented in our educational math game:


4.2.1 Design Pattern: Fail to learn (forgiving gameplay)

Problem: A lot of games penalize the player for failing to complete a presented task, and this can discourage some players since it reminds them of the feeling of failing a test in school. This problem occurs throughout the game, in the puzzle sections but also in the exploration and examination sections.

Solution: In order to solve this problem, we do not implement any penalty system in our developed game. Hence, there are no life systems, time limits or similar mechanisms such as those seen in the aforementioned games (i.e., Portal [27], Anti-Chamber [28] and Amnesia [29]). Instead, the gameplay allows the player unlimited tries to solve the presented puzzles, and the player is rescued by a so-called NPC when falling off a platform (explained in further detail in the Teacher NPC design pattern later in this chapter).

Consequences: The consequence of this design choice is that some players will either find the gameplay "too easy" or, more importantly, solve the mathematical puzzles by "brute force", i.e., trying different solutions without thinking in order to solve the puzzle as quickly as possible.

4.2.2 Design Pattern: Pseudo unlimited movement space (freer gameplay)

Problem: Unfortunately, all games have the necessary evil of so-called invisible walls, which are used to limit the player's movement inside the game. This pattern is implemented in different ways, ranging from simple rooms in which the player moves around to debris blocking a certain path. The reason behind this pattern is to direct the player's focus and attention to the essential elements and mechanics of the game, but also to limit the game world (and the subsequent work needed to develop it).

Solution: The solution we implemented in our developed game is to let the player move around on floating platforms, placed in a game world reminiscent of a starry night sky or galaxy dust. This aesthetic design choice limits the player's interest in leaving the platforms, since they are surrounded by the same horizon in every direction, and if they decide to walk off and fall from a platform, the Teacher NPC design pattern will rescue them.

Consequences: The consequence of this design choice is that some players will eventually find out that the freedom offered in our developed game is fake or pseudo, and this realization will unfortunately diminish their interest in exploration. But this design pattern should still be better received by the players, since invisible walls are not really present in our game thanks to the platform structure of the game world.
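As a sketch of how the rescue behavior behind the two patterns above could be implemented, a large trigger volume below the platforms can catch a falling player and return them to a safe position without any penalty. This is a hypothetical Unity C# illustration under our own naming assumptions (RescueZone, respawnPoint), not the actual project code.

```csharp
using UnityEngine;

// Hypothetical sketch of the "rescue instead of punish" idea:
// a large trigger volume placed below the floating platforms catches a
// falling player and simply moves them back to a safe respawn point,
// without lives, scores or any other penalty.
public class RescueZone : MonoBehaviour
{
    public Transform respawnPoint; // a safe spot on the last visited platform

    void OnTriggerEnter(Collider other)
    {
        if (!other.CompareTag("Player")) return;

        other.transform.position = respawnPoint.position;
        Debug.Log("Player rescued and returned to the platform.");
    }
}
```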


4.2.3 Design Pattern: Predictable Consequence nr.1 [19]

Problem: The player has to perceive failure in movement (or input) as a consequence of their mistakes, and not as a random or a predestined event in the game.

Solution: The player cannot make a meaningful decision to act (or not to act) if the result of the possible action cannot be anticipated. Therefore, it is important to convey meaning behind every reaction our developed game has to the player's actions, since a meaningful player decision is an informed decision: the player has to be able to guess the result of their action before performing it. The goal of the game mechanics implemented in our game is to communicate these predictable behaviors in the following ways:

Earth-like physics: The game world and all the objects residing in it are influenced by gravity and Newton’s three laws of motion [43]; the reason for using these mechanics is that they are based upon real-life experience that the player already has some knowledge about.

Standard first-person shooter (FPS) control schema: Our developed game will follow the control schema and mechanics used in several modern FPS games, with the following controls:

Keyboard:
 W = Move forward.
 S = Move backwards.
 A = Move left.
 D = Move right.
 Space = Jump.
 E = Change the object’s value +/-, i.e., 3 to -3.

Mouse:
 Pan up = Look up.
 Pan down = Look down.
 Pan left = Look left.
 Pan right = Look right.
 Holding left mouse button = Interaction with a game object.

The game world is projected through the player’s “eyes”, i.e., the player plays as the game character and sees what the game character sees. Moreover, our developed game is a so-called single player game, i.e., there is only one player in a game instance.
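As a purely illustrative sketch of the control schema above (the names below are hypothetical and do not come from our actual implementation, which depends on the chosen game engine), the mapping from inputs to player actions can be expressed as follows:

# Hypothetical mapping of keyboard and mouse inputs to player actions,
# mirroring the standard FPS control schema described above.
KEY_BINDINGS = {
    "W": "move_forward",
    "S": "move_backward",
    "A": "move_left",
    "D": "move_right",
    "Space": "jump",
    "E": "toggle_cube_value_sign",  # e.g., changes a cube's value from 3 to -3
}

MOUSE_BINDINGS = {
    "pan_up": "look_up",
    "pan_down": "look_down",
    "pan_left": "look_left",
    "pan_right": "look_right",
    "hold_left_button": "interact_with_game_object",  # e.g., lift or throw a cube
}

def action_for_key(key):
    """Look up the player action bound to a pressed key, if any."""
    return KEY_BINDINGS.get(key)

Keeping all bindings in one place like this would also make it easier to expose them in an options menu in a future iteration.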

Consequences: Any faulty implementation of the two aforementioned solutions will break the immersion in our game and only tend to confuse the player, since the game’s reactions will be unpredictable, and this will greatly damage both the gameplay and the satisfaction level of our game. Furthermore, it becomes harder to surprise the player with other design patterns that temporarily distort the aforementioned design pattern, since the distortion will be received by the player either as a fun and interesting supplement or as just confusing and strange.


4.3 Introduction

An important part of all games is the introduction level, which describes how and what to do in order to play the game. In our view, this part of our game must be very similar to what is done in school, i.e., when a teacher introduces the students to a new subject such as mathematics. But the learning principles are different and vary with factors such as motivation, recognition and learning environment. A skillful teacher is defined as a person who, for example, can easily show how to solve a certain problem in many different ways, while a good game designer creates a number of different aids for the player to use in order to solve the problem [2].

For our developed educational game, the companionship aesthetic of the MDA framework will be used together with the following design pattern:

4.3.1 Design Pattern: Teacher NPC (or Sture)

Problem: The decision to include a so-called teacher NPC was made out of the insight that most earlier educational games lacked this kind of feature, which made those games feel empty and lonely. Moreover, the effectiveness of this design pattern when it comes to informing the player about what and how things work in the game world, and how fun this feature can be, can be seen in the aforementioned game Portal [27]. In that game, the main antagonist GLaDOS encourages the player to continue solving deadly puzzles, with the main reason being “it is for science” [27].

Solution: Our solution is to introduce a little fictional gas ball by the name of “Sture", a non-playable character (NPC), see figure 3 below. Sture’s task is to be the player's teacher, guide and friend. Note that the implementation of our character Sture is in line with the following recommendation on how to create a successful learning environment:

“Narrative-centered learning environments offer the potential to be effective tools for promoting content and problem-solving learning gains by providing students with engaging, interactive learning experiences.” [6]

Consequence: If Sture is poorly implemented, the entirety of our game would suffer, and both interest and satisfaction levels would diminish. Since Sture is one of the key pillars of our educational game, any erratic behavior of Sture could possibly damage or amplify other design patterns.


Figure 3: The companion “Sture” acts as a guide and a friend during the whole game.

4.3.2 Design Pattern: Sture’s Movement (or privileged movement)

Problem: It is important to note that the player of our developed game should not be able to interfere with Sture’s movement or with other in-game entities such as the platforms the player stands on. This design pattern is implemented in order to keep the aforementioned Predictable consequence nr.1 design pattern in check, and also to hinder the player from finding game-breaking exploits [19].

Solution: Our solution to this problem is, first of all, to lock all static game objects (such as the doors the player walks through, the platforms the player stands on and the structures of the puzzles) after they appear. The second solution is to give Sture the ability to fly around unhindered by the player and other game objects, without falling down as the player would.

Consequence: This implementation can sometimes be viewed as a very heavy-handed way of protecting the game world and its game objects. But it is unfortunately a necessary evil in order to ensure the game’s structure and consistency. The player might also perceive this constraint of their movement to the platforms as annoying and disappointing, compared to Sture who can fly around without any problems.


4.3.3 Design Pattern: Sture Weenie

Problem: The player might lose their sense of direction with respect to how our game world unfolds, which is a problem often seen in player-driven gameplay such as in our developed game [19].

Solution: Our solution to this problem is to have Sture become the guide of the player throughout the game world and also to direct the player’s focus towards the important elements presented inside our game. This is achieved by implementing audiovisual cues often seen in other games, such as: a glowing point which the player has to move towards, a sound bite confirming the player’s action, or the level itself unfolding along a certain path to show where the player needs to go.

Consequence: The player may become dependent on this guidance system (i.e., Sture Weenie), and the player will be confused if this design pattern is poorly implemented or even suddenly omitted inside our game. Therefore, it is important to first focus on developing a number of solid Weenie design patterns, which are both coherent and consistent with the presented game world, and to later chain these Weenies together to form a clear and uninterrupted path through our game world.

Note that, according to Clarke-Wilson [44], the term Weenie was coined by Walt Disney. He suggested that when designing massive 3D environments, such as theme parks, it is important to lead the visitors through these environments the same way as one trains a dog - by holding a wiener and leading the dog by its nose.

4.4 Design Pattern: Give the player a goal

Problem: In most learning games, you already know what the goal will be. But this knowledge will in fact make the game less interesting, just like knowing the ending of a book or movie.

Solution: Our solution for this is to hide the main goal as much as possible, in order to encourage exploration among our players when playing the game. Hence, we will not immediately reveal that the game is a math solving game. Therefore, we created goals that are not so obvious, even if our players know that the game is going to be about solving math questions. The aforementioned design pattern is also discussed in [3]:

“Good games confront players in the initial game levels with problems that are specifically designed to allow players to form good generalizations about what will work well later when they face more complex problems. Often, in fact, the initial levels of a game are in actuality hidden tutorials.” [3]


Consequence: The consequence of the aforementioned design pattern depends on how obvious the initial goal of the game is for the player, since a failure in conveying the meaning and purpose behind that goal can lead to confusion, which greatly diminishes the player’s interest in and satisfaction with the game. Therefore, it is important to have other well-implemented design patterns supporting this one (such as the Teacher NPC/Sture and Weenie design patterns previously mentioned, see figure 4).

Figure 4: Sture tells the player to move to the glowing point.

4.5 Design pattern: Pseudo choices

Problem: We included a small experiment in our game; more specifically, we would like to investigate whether it is possible to trick the player by presenting an illusion of choice and options in our game. The main aim here is to break up the linear structure of our developed game, i.e., a structure where there is only one way to get from point A to point B.

Solution: This design pattern occurs in our game when the player is confronted by two doors, with the choice of going through either the green or the blue door in order to get to the next part of the game. By deploying this design pattern, we let the player believe that the game world is filled with mysteries and that their choices mean something in the game world. This gives the player a feeling that anything can be explored in the game world, thus minimizing the feeling of linearity. Another reason why we chose to implement this feature was the lack of any character customization in our game, which, according to Gee [3], is a necessity to make the player feel more like a producer than a viewer of the game. In other words, the player must have the ability to play the game in their own way (see figure 5 below).

Consequence: The aforementioned design pattern can, however, become a so-called double-edged sword when implemented in our game, since we can easily lose the illusion of choice if this design pattern is not well implemented. This can be prevented to some extent by implementing randomization together with this design pattern. For example, the door which actually leads to the next part of the game can be the green one in one play session, whereas in another play session it is the blue one, and so on (see figure 5).
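As a minimal sketch of the randomization described above (illustrative only; the names are hypothetical and not taken from our actual implementation), the leading door can be chosen once per play session:

import random

def assign_leading_door(doors=("green", "blue")):
    """Pick which of the two doors leads to the next part of the game
    for this play session; the other door preserves the illusion of choice."""
    return random.choice(doors)

# Decided once when a new play session starts.
leading_door = assign_leading_door()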

Figure 5: Choices are given to the player to let them make their own decisions.

4.6 Design pattern: Transition

Problem: The transition in a game is important and has to be smooth for maximum stimulation between the stages of the game. By “smooth” we mean that the player has to be able to identify themselves as part of the transition, rather than just jumping from one part of the game to another without knowing how it happened. Another benefit of a smooth transition is that it helps the player to mentally put a part of the game behind them and to recognize where they are at the moment.

Solution: The transitions in our game are implemented as flashy tunnel sections, each of which is unique. This transition idea comes from how movies and books handle transitions, i.e., how they manage to increase the curiosity of the reader (or viewer) about what will happen next if they proceed further (see figure 6).


Consequence: The transition can be done at any part of the game, whenever the game changes from one state to another. But the most important part, as mentioned above, is that the player must be a part of that transition. As in movies or books, if you lose the reader (or viewer) during a bad transition, they will lose interest.

Figure 6: A great transition will give the player the motivation to continue exploring.

4.7 The cube solving puzzle part

When the players have completed the basic tutorial and the introductory part of our game, and have familiarized themselves with the controls, they will be introduced to the first task of the game. The first task of our game consists of a simple puzzle which uses multiple cubes and a platform. The purpose of the platform is to provide the player with a number, which the player has at their disposal in order to complete the puzzle. Furthermore, the role of a cube is to hold a number. On a platform, several such cubes are placed, and the main aim for the player is to select a number of cubes in such a way that the total sum of the numbers on the selected cubes is equal to the sum written on the platform. Note that before the player can do anything, the teacher NPC, i.e., Sture, approaches and informs the player that it is possible to pick up the cubes and to put them on the platform. Specifically, the first task includes the following elements:

 Multiple cubes, where each cube has a number (or digit) printed on it,
 a platform, where each platform has a number (or digit) printed on it,
 the power to lift and throw the cubes at will, by aiming at a cube and then holding the left mouse button.


Note that the main aim of this first puzzle task is to select a number of cubes in such a way that the total sum of the numbers written on the selected cubes is equal to the number written on the platform under consideration.
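As a purely illustrative sketch of the core rule of this puzzle (hypothetical names, not our actual code), the check performed when the player places cubes on the platform could look as follows:

def puzzle_solved(selected_cube_values, platform_target):
    """Return True when the numbers on the cubes placed on the platform
    add up exactly to the number printed on the platform."""
    return sum(selected_cube_values) == platform_target

# Example: cubes with the numbers 2, 3 and 4 placed on a platform labelled 9.
print(puzzle_solved([2, 3, 4], 9))   # True
print(puzzle_solved([2, 3], 9))      # False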

It is important to note that the first task (i.e., puzzle) should be as easy as possible; otherwise, the player will most likely lose interest in continuing to play.

The design patterns that will be used in conjunction with the development and implementation of the first task will be discussed below.

4.7.1 Design Pattern: Predictable Consequence nr. 2

Problem: This design pattern handles the recurring problem of audiovisual feedback when completing the puzzle elements in our game.

Solution: Our solution to this problem is to implement a series of audio and visual feedback corresponding to the player’s actions, in order to communicate the predictable behavior of our game. This feedback is based on how other games have solved this problem, e.g., by using colors and sound bites that are widely accepted as confirmation indicators, such as the following (an illustrative sketch of this feedback logic is given after the list):

Blue text above the solving platform: Indicates that the solving platform has not been touched or used by the player, and no calculation is shown underneath the required platform sum to be solved.

Green text above the solving platform: Indicates together with a positive sound bite that the player has managed to solve the puzzle with the correct calculation shown underneath the required platform sum, and moreover, the cubes that formed the correct calculation are then locked to the platform while the unused cubes shrink and disappear.

Red and yellow text above the solving platform: The red color indicates that the answer the player has provided is incorrect, or that the cubes are not stationary on the platform. The yellow color indicates that the player has overshot the required sum of the puzzle. After solving each puzzle element, Sture flies to the next area in the puzzle section of our game in order to create and show the next puzzle element, following the aforementioned design patterns of Sture.
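As an illustrative sketch of the feedback logic described in the list above (hypothetical names; not our actual implementation), the color shown above the solving platform could be selected as follows:

def feedback_color(placed_cube_values, platform_target, cubes_stationary=True):
    """Choose the feedback color shown above the solving platform."""
    if not placed_cube_values:
        return "blue"    # platform not yet touched or used
    total = sum(placed_cube_values)
    if total == platform_target and cubes_stationary:
        return "green"   # correct calculation; also play a positive sound bite
    if total > platform_target:
        return "yellow"  # the player has overshot the required sum
    return "red"         # incorrect answer, or cubes not stationary on the platform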

Consequence: It is important not to break the consistency of the visual language used in our game, since this may confuse the player because the game no longer reacts in a predictable manner. Moreover, as mentioned in the Predictable consequence nr. 1 pattern above, it becomes harder to surprise the player in a good and well-implemented way, since they have developed a reliance on how our game usually reacts to their actions.

4.7.2 Design pattern: Filter

Problem: If the player is given the means to create or combine existing game objects, the result of this freedom may become overwhelmingly complex to develop and maintain for the designer of the game [19].

Solution: The reason for removing the unused cubes after each puzzle element (as mentioned above) is to minimize the complexity of each puzzle element and to not allow the player to reuse leftover cubes that might solve a new puzzle element without any effort (see figure 7 below).

Consequences: The player’s freedom is restricted by either implicit or explicit means, with accidental filtering sometimes resulting in unsolvable puzzle elements. Therefore, it is important for us to balance the filtering in a meaningful way without damaging the player’s sense of freedom in our game; this is why we chose to implement this design pattern in a way that is perceived by the player as a trivial action of the game after each completed puzzle element. Just like a question on a test in school has a set of alternatives for solving the problem, our puzzle element has a set of cubes for reaching the required sum of the platform under consideration.
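A minimal sketch of the filtering step described above (illustrative only; the names are hypothetical): when a puzzle element is completed, the cubes used in the correct calculation are locked to the platform while the unused cubes are removed so they cannot trivialize later puzzle elements.

def filter_cubes(all_cubes, used_cubes):
    """Split the cubes of a completed puzzle element into those locked to the
    platform (part of the correct calculation) and those removed (unused)."""
    locked = [cube for cube in all_cubes if cube in used_cubes]
    removed = [cube for cube in all_cubes if cube not in used_cubes]
    return locked, removed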

Figure 7: The first puzzle illustrates the freedom of choice on how the player wants to solve it.

4.8 Challenges: Jump solving puzzle part

After each completed part of new content in a video game, which includes newfound skills or knowledge gains, the player has to go through some kind of challenging part, which puts the player’s newfound knowledge and skills to the test. The main rule when creating a challenge is that it has to be simple; otherwise the challenge will require too much time and effort to be completed.


To make a challenge more interesting it is important to create a different approach that builds on the same principles as the previous task, rather than making a harder version of the same task. As for our game, instead of asking for a higher sum of numbers to be combined with cubes, we created a new area that includes multiple platforms with different numbers to be used by the player; but instead of lifting cubes (as done in the first task of our game), the player is now asked to jump on platforms in order to activate/add a number to, or deactivate/remove a number from, the required sum that solves the puzzle. Note that what actually makes this challenge harder than the first task is the number of platforms that need to be activated in order to reach the required sum, which is greater than the number of cubes needed in the first task. Moreover, there is the possibility of the player falling down if they miss a platform, which in turn is not really a punishment since the player is rescued by Sture and lifted back up again. Another characteristic of a challenge is that it does not explain how and why as much as the first task does; the sole purpose of this challenge section is to see if the player understood the principles from the previous tasks.

Note that the aforementioned challenge section of the cube solving puzzle part will also use the Predictable Consequence nr. 2 design pattern, in order to communicate our game’s reactions to the player’s actions (see figure 8).
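As an illustrative sketch of the jump puzzle rule described above (hypothetical names; not our actual code), jumping on a platform toggles its number in or out of the running sum, and the puzzle is solved when the active numbers add up to the required sum:

def toggle_platform(active_values, platform_value):
    """Activate a platform's number on the first jump, deactivate it on the next;
    return the running sum of all currently active platforms."""
    if platform_value in active_values:
        active_values.remove(platform_value)
    else:
        active_values.append(platform_value)
    return sum(active_values)

# Example: the required sum is 7.
active = []
toggle_platform(active, 3)  # running sum is 3
toggle_platform(active, 5)  # running sum is 8 (overshoot)
toggle_platform(active, 5)  # back to 3
toggle_platform(active, 4)  # running sum is 7, the puzzle is solved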

Figure 8: Create a challenge that reflects the new knowledge, to see if the player has really learned it.


4.9 Variety

A video game should include numerous implementations of variety in its content, otherwise the game will become monotonous and self-destructive due to the high risk of low stimulation. This is often found in the majority of released applications on the online stores for tablets, computers and smartphones, in particular when it comes to educational games. In order to solve the aforementioned problem of variety we implemented the following variations in our game:

 Floor pieces: The platforms in the different sections of our developed game are modelled differently, with different colors and shapes, in order to communicate to the players where in the game they actually are, and also what they are supposed to do at that particular point. Therefore, we have implemented the following aesthetic design choices:

The platforms in the introduction section in our game are see-through platforms colored in light fluorescent pink. The main reason behind this design choice is to create an eye-catching and interesting introduction to our developed game world. Note that this is important since the initial sections of a game are crucial in capturing the player’s interest and willingness to continue playing the game. Another reason behind this design choice is to also showcase our technical and development expertise when presenting our developed game to future video game companies.

The other platforms implemented in the introduction section of our game are floating, glowing, white-colored platforms. The main aim of these platforms is to teach the player the movement controls, in particular the jump ability. Moreover, the reason behind this aesthetic design choice is to clearly indicate to the player where to move next, and also what they are supposed to do next.

Moreover, note that the platforms in both the cube and jump solving puzzle parts are colored grey and modeled in a simple fashion; this in order to shift the player’s focus towards the puzzle elements (i.e., not to distract them with other elements found in our game).

 Color of the cubes: Later in our developed game the player is informed by Sture that they can change the value of the cubes from positive to negative or vice versa; this can be done by first aiming at (or holding) the cube and then pressing the E key on the keyboard. This gives the player more freedom in solving the presented puzzle element in our game.

Note that the coloration of the cubes when pressing the E key follows the aforementioned Predictable consequence nr. 2 design pattern, since the color of a cube changes from positive green to negative red in order to indicate that the value of the number inside a cube has changed.

 Transition part: The transition parts in our developed game are first of all colored differently in order to reflect variation. Moreover, they are also animated differently, and the main aim here is to surprise the player by how the game world unfolds itself.

After the players have completed the first two sessions (which focused on learning addition and subtraction) we will change our game’s transition stage to imply that something completely different will occur if the players proceed. Namely, the next step in our gameplay will be to introduce the multiplication puzzle element of our game. The motivation behind this design choice is to pique the player’s interest, since this alteration breaks the player’s notion of how our game previously handled the transition, which prompts the player to think “what is behind the other side of this strange transition?” (see figure 9 below).

Figure 9: The final transition path.

 Different kinds of puzzles and game world structures: Our developed game consists of two cube puzzle parts, two challenge jump puzzle parts and an altered cube puzzle part for multiplication arithmetic (which is also the final part of our game). The placement of the different puzzle elements, and other important elements, of our game is shown in figure 10 below.


Figure 10: An illustration depicting the structure of the game world used in our developed game.


4.10 Missing parts

Unfortunately, during the work with our study we only managed to complete about 80% of our educational math game, a shortcoming mainly attributed to time constraints. We estimate that our game is only around 80% complete due to the following three major reasons:

 Complete voice acting for the NPC, for more interaction possibilities, is not fully implemented.
 Some bugs are still unsolved (but do not affect the overall gameplay).
 Our game is missing an end scene that tells the player that the game has ended.

Regarding the aforementioned voice acting, we think that our game feels empty without a proper companion that can communicate with the player. The player should not feel lonely in an educational game and therefore it is important for the game to offer multiple ways to interact (as mentioned in [6]).

Moreover, we have had a major issue with some bugs that are still unsolved. More specifically, there are some bugs in the event system that Sture controls, and because of this, some of the events do not execute properly. This problem has been bypassed, but not in a well-implemented fashion, since Sture is no longer able to stay close to the player.

Finally, there is the overall theme and story of our game, which is about how the player finds themselves, at the start of the game, in the beginning of “the big bang”. Our initial intention here was to implement changes to our game world as the player proceeds through the game, with the game world slowly deteriorating and becoming unstable, leading to a “black hole”. This black hole would then start sucking in everything around it (including the player and Sture), turning the screen black. Then, after a moment of silence and darkness, the player “wakes up” to find themselves transported to a forest on Earth set during the Viking age. Our motivation behind this design was to see if it is possible to make a smooth transition between school subjects, and to hint that our game is bigger than it initially appears (see figure 11 below).


Figure 11: Forest game world.

4.11 MDA framework

Since our educational math game is not fully developed, improvement and fine-tuning with the MDA framework will not be used in detail in our study, since this framework (as mentioned previously) is mainly used for fully developed games. Therefore, in this study we only rely on the aforementioned framework as a guideline and as a classification umbrella for the presented results and analysis of our collected data. The MDA framework will instead be used when our game is fully developed and in future development iterations of our game, where it will be used to address the shortcomings of our game mentioned by our test participants in this study.


5. Qualitative results and analysis

As mentioned above, the evaluation session (also denoted our testing session) of our study was performed by 14 selected test participants testing our developed educational game. Moreover, the aforementioned evaluation session was conducted in two different iterations with 7 test participants participating in the first iteration and 7 test participants participating in the second iteration.

The first iteration consisted of participants from the LAN party, whereas the second iteration consisted of participants from a video game development program at the university. Note that no changes to our game were made between the two iterations, in order not to affect the evaluation of our proposed educational game and end up with inconsistent data.

The rest of this chapter is organized as follows. First, we will present and analyze the observational results obtained from our aforementioned testing session (see section 5.1) and finally, we will present and analyze our obtained survey results (see section 5.2).

5.1 Observational results and qualitative data analysis

In this section, we will present our 14 test participants’ feedback, interaction, behavior and remarks obtained while they were testing and evaluating our proposed educational game. Note that the test participants’ feedback, interaction, behavior and remarks will also be referred to as observational data. Our observations will be divided into three parts, namely the introduction part of the game, the cube solving puzzle part and, finally, the jump solving puzzle part. Note that the main aim of this division is to examine each game part alone and, in this way, isolate our observations and analysis of each part of our game. Moreover, during the aforementioned evaluation session of our proposed educational game we use a ranking system consisting of the ranks Good, Neutral and Bad (see below) in the following way:

 Rank Good: Is attributed to the participant expressing views that are interpreted as being positive towards the game.

 Rank Bad: Is the reverse of the good ranking, i.e., when views are interpreted as being negative towards the game.

 Rank Neutral: Attributed to views that are interpreted as neither good nor bad, but a mixture of both (or interactions that cannot be interpreted). For example, 50% of participants clicked the Options menu in order to check the controls for the game.

Furthermore, the assigned ranks Good, Neutral and Bad will be further divided into the following four subcategories: Design, Educational, Aesthetics and Gameplay. Note that the aim of these subcategories is to investigate which of the four aspects of our game actually affected a particular assigned rank.
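As a purely illustrative sketch of how the coded observational data is tallied into the rank and subcategory distributions reported in the figures below (the data shown is hypothetical, not our actual records), each observation can be stored as a (rank, category) pair:

from collections import Counter

# Hypothetical coded observations as (rank, category) pairs.
observations = [
    ("Good", "Aesthetics"), ("Good", "Design"), ("Good", "Gameplay"),
    ("Neutral", "Design"), ("Bad", "Design"), ("Good", "Educational"),
]

# Overall rank distribution (compare figures 12-15).
rank_counts = Counter(rank for rank, _ in observations)
for rank, count in rank_counts.items():
    print(f"{rank}: {100 * count / len(observations):.0f}%")

# Subcategory distribution within one rank, e.g. Good (compare figures 13.1-15.1).
good_categories = Counter(cat for rank, cat in observations if rank == "Good")
for cat, count in good_categories.items():
    print(f"Good/{cat}: {100 * count / sum(good_categories.values()):.0f}%")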

5.1.1 Whole Game

Figure 12: The distribution of the collected observational data for the “whole game” part.

The distribution of the different ranks in our collected observational data for the “whole game” part is depicted in figure 12. The evaluation of our proposed educational game when it comes to the so-called Whole game aspect shows that approximately 55% of the total collected observational data (i.e., obtained feedback and remarks, and observed behavior and interaction) is ranked as Good, while approximately 25% is ranked as Bad, and approximately 20% as none of the others (i.e., Neutral).

The aforementioned result indicates that our test participants mostly enjoyed our proposed educational game. From this we conclude that a significant number of the design choices we made for the educational game hit their mark with the participants, but it is important to note that these results do not tell us much beyond that.

Therefore, our gathered observational data will now be divided into three different categories, where the first category holds data that corresponds to the introduction part of our game (see section 5.1.2), the second category holds data that corresponds to the cube solving puzzle part of our game (see section 5.1.3), and the third and last category holds data that corresponds to the jump solving puzzle part of our game (see section 5.1.4). Note that the main aim of this division of data is to first isolate and then examine the different parts of our educational game independently.


5.1.2 The introduction part of the game

Figure 13: The distribution of the collected observational data for the introduction part of our game.

The distribution of the different ranks in our collected observational data for the introduction part of the game is depicted in figure 13. The evaluation of our proposed educational game when it comes to the introduction part shows that approximately 73% of the total collected observational data (i.e., feedback, behavior, remarks and interaction) is ranked as Good, while approximately 13% is ranked as Bad, and approximately 13% as Neutral (i.e., none of the others).

The aforementioned result clearly indicates that our test participants mostly enjoyed the introduction part of our educational game. From this we conclude that a significant number of the design choices we made for the introduction part of our educational game hit their mark with the participants.

As mentioned above, the three aforementioned different ranks are now further divided and categorized into four distinct subcategories (also aspects), namely educational, design, gameplay and aesthetics. Note that the main aim here is to investigate which of the aforementioned aspects the participants liked or disliked the most. Moreover, we will also present and analyze our test participants’ feedback and remarks in order to better understand their feelings when playing our educational game.


5.1.2.1 Good

Figure 13.1: The pie chart shows the category distribution for the rank good. Note that we in this chart only consider the observational data, obtained from our 14 test participants during their evaluation of the introduction part of our educational game. Also note that the introduction part of the game did not have any educational elements present, hence the educational category is not listed in this chart.

Figure 13.1 shows what the Good ranks, obtained during the evaluation of the introduction part of our game, actually are “made up of”. Specifically, figure 13.1 shows that out of the 73.3% obtained Good ranks, approximately 52% went to aesthetics, 46% went to design, and 2% went to gameplay. These results show that our test participants mentioned the aesthetic part of the introduction the most, followed by the design of the introduction part of our game. This shows that our proposed educational game has succeeded in capturing the interest of the participants, which is a nice observation since the first impression is important when designing a video game (i.e., in order to capture the interest of the player so that they continue playing the game). Also note that the least mentioned aspects of the introduction part are the gameplay and educational aspects. However, this is understandable since there is not a lot of gameplay in this part, nor are there any educational elements, since this part of the game mainly focuses on introducing the player to the game world.

Below we provide some representative feedback and remarks, and observed behavior and interaction, categorized as Good, obtained from our test participants.

Aesthetics:
 Nice explosion in the start of the game
 Nice lightshow
 I like the animations
 I like the theme of the game
 Interesting colors
 Funny name for the NPC

Design:
 Uplifting feedback
 Normal controls
 Clear on what to do
 Looks like there is an element of choice
 Freedom to explore
 Went to the other door

Gameplay:
 Uplifting feedback
 Interesting gameplay choice, having to play an educational game from a first-person perspective.

5.1.2.2 Neutral

Figure 13.2: The pie chart shows the category distribution for the rank Neutral. Note that we in this chart only consider observational data obtained from our 14 test participants during their evaluation of the introduction part of our educational game.

Figure 13.2 shows what the Neutral ranks, obtained during the evaluation of the introduction part of our game, actually are “made up of”. Specifically, figure 13.2 shows that out of the 13.3% obtained Neutral ranks, approximately 88% went to design and approximately 12% went to aesthetics. We note that one possible explanation for the high percentage regarding the design aspect is that 50% of our test participants went to the Options tab in the menu screen in order to check the control schema. We anticipated that some of the participants would do this and therefore programmed the game to start instead, since the control schema is the first thing presented in the introduction part of the game.

Below we provide some representative feedback and remarks, and observed behavior and interaction, categorized as Neutral, obtained from our test participants.

Design:
 Clicked on the options tab in the menu screen.
 Labyrinth-structured platforms.

Aesthetics:
 “Okay, what is this funny thing supposed to be”, referring to the NPC Sture.


5.1.2.3 Bad

Figure 13.3: The pie chart shows the category distribution for the rank Bad. Note that we in this chart only consider the observational data obtained from our 14 test participants during their evaluation of the introduction part of our educational game.

Figure 13.3 shows what the Bad ranks, obtained during the evaluation of the introduction part of our game, actually are “made up of”. Specifically, figure 13.3 shows that out of the 13.3% obtained Bad ranks, approximately 88% went to design and approximately 12% went to gameplay.

We note that one possible explanation for the high percentage regarding the design aspect is that 29% of our test participants could not see the crosshair, since it was too small and obscured by the game world.

Another bad design choice was the inability to customize the control schema. Finally, another bad design choice was that the mouse sensitivity was considered inverted by one of our test participants.

However, we note that all of the aforementioned bad design choices are easy to fix and improve. The most notable aspect of these results is that none of them addressed any core design choices for the introduction part of the game. This observation implies that, in general, the introduction part of our proposed educational game is well designed and was enjoyed by our test participants.


5.1.3 The cube solving puzzle part

Figure 14: The distribution of the collected observational data for the cube solving puzzle part of our game.

One of our biggest concerns while developing the educational game was how the participants would react to and perceive the cube solving puzzle part of the game, since it is one of the core design elements of our educational game. Moreover, since this design element has never (or at least rarely) been seen together with an educational element, it was easy for this design choice to become either a hit or a miss among the participants.

However, figure 14 above shows that our proposed cube solving puzzle was actually perceived quite well among our test participants, with approximately 64% of the total collected observational data (i.e., feedback and remarks, and behavior and interaction) ranked as Good, while approximately 22% is ranked as Bad, and approximately 14% as Neutral (i.e., none of the others).

As mentioned above, the three aforementioned different ranks are now further divided and categorized into four distinct subcategories (also aspects), namely educational, design, gameplay and aesthetics. Note that the main aim here is to investigate which of the aforementioned aspects the participants liked or disliked the most. Moreover, we will also present and analyze our test participants’ feedback and remarks in order to better understand their feelings when playing our educational game.


5.1.3.1 Good

Figure 14.1: The pie chart shows the category distribution for the rank Good. Note that we in this chart only consider the observational data obtained from our 14 test participants during their evaluation of the cube solving puzzle part of our educational game.

Figure 14.1 shows what the Good ranks, obtained during the evaluation of the cube solving puzzle part of our game, actually are “made up of”. Specifically, figure 14.1 shows that out of the 63.9% obtained Good ranks, approximately 35% went to aesthetics, 28% went to gameplay, 24% went to design, and 13% went to educational. This distribution shows, among other things, that our test participants mentioned the educational part of the puzzle the least. This fact could possibly indicate either that the educational element was not viewed as that special among our test participants, since they already understood basic arithmetic, or that they perceived the cube solving puzzle part more as a puzzle element than an educational element (although it was actually designed to be 50/50). Below we provide some representative feedback and remarks, and observed behavior and interaction, categorized as Good, obtained from our test participants.

Aesthetics:
 Nice glowing solving platforms.
 Nice animation when building up the plus/minus model after finishing all three puzzles.
 Nice feedback when solving the puzzles.

Gameplay:
 No life system, no penalty for failing.
 It is always fun to have some physics affecting the cubes.
 The gameplay feels intuitive.
 Fun puzzles.

Design:
 It is clear on what to do.
 Cool puzzle element.
 The freedom to solve the puzzle in different ways.

Educational:
 Cool that math worked.
 Good explaining.
 Correct presentation of the math.
 Interesting take on solving math.

5.1.3.2 Neutral

Figure 14.2: The pie chart shows the category distribution for the rank Neutral. Note that we in this chart only consider the observational data obtained from our 14 test participants during their evaluation of the cube solving puzzle part of our educational game.

Figure 14.2 shows what the Neutral ranks, obtained during the evaluation of the cube solving puzzle part of our game, actually are “made up of”. Specifically, figure 14.2 shows that out of the 13.9% obtained Neutral ranks, 90% went to design and 10% went to educational. This distribution shows, among other things, that our test participants mentioned the design part of the puzzle the most. One possible explanation for this is the following observation that we made during our testing session: 6 out of our 14 test participants (i.e., approximately 43%) jumped down on the newly animated plus model in order to see if something would happen. Moreover, another possible explanation is how the solving cubes spawned, sometimes forming high towers, which prompted some of the participants to respond by saying: “well that a lot of cubes” or “wow! Look it is a tower of number cubes”. To summarize, the most interesting result here is the large number of participants that jumped down on the plus model; this result can be built upon by hiding a fun piece of feedback (or Easter egg) for the players in future installments of our educational game.


5.1.3.3 Bad

Figure 14.3: The pie chart shows the category distribution for the rank Bad. Note that we in this chart only consider the observational data obtained from our 14 test participants during their evaluation of the cube solving puzzle part of our educational game.

Figure 14.3 shows what the Bad ranks, obtained during the evaluation of the cube solving puzzle part of our game, actually are “made up of”. Specifically, figure 14.3 shows that out of the 22.2% obtained Bad ranks, approximately 37% went to educational, 25% went to gameplay, 19% went to design, and 19% went to aesthetics. This result indicates that the educational aspect of the cube solving puzzle was not perceived as favorably as we presumed it would be. One possible explanation is that our test participants found the mathematical level too easy and unchallenging for them. This is understandable since the age group of our test participants is higher than the intended target group of 8-12 years old.

Regarding the Bad ranks associated with the gameplay aspect, we observe that the main annoyances the participants expressed were falling off the platforms when trying to advance to the next level of our game, and the cubes falling off the solving platform, thus breaking the game flow (and, more importantly, disturbing the problem-solving process). However, these annoyances are actually implemented on purpose by us in order to keep the players on their toes and make the gameplay a little bit more challenging by adding some so-called frustration-inducing elements.

When it comes to the design aspect, we observed that some of our test participants did not like the design of the cube solving puzzle, with most complaints being about why the gameplay is designed to have the aforementioned small frustration-inducing elements. However, it is worth noting that our test participants continued playing the game and did not constantly complain about the design; this indicates that the design of the frustration-inducing elements did not break the enjoyment of the game. Instead these elements only made the game a little bit more challenging for the player, which is considered a good thing in the game development community.

The only complaint we received about the aesthetic aspect of the cube solving puzzle part of our game is that some of the feedback for completing the entire puzzle part was either too weak or annoying. However, this is easy to improve in future installments of our game.


5.1.4 The jump solving puzzle part

Figure 15: The distribution of the collected observational data for the jump solving puzzle part of our game.

The distribution of the different ranks in our collected observational data for the jump solving puzzle part is depicted in figure 15. Just like for the cube solving puzzle, we had big concerns while developing this part of our educational game, since it is also one of the core design elements of our educational game. However, compared to the cube solving puzzle, this design element has been seen in other games, some even together with an educational element. Figure 15 shows that our test participants also viewed our proposed jump solving puzzle favorably; approximately 68% of the total collected observational data (i.e., feedback, behavior, remarks and interaction) is ranked as Good, while approximately 15% is ranked as Bad, and approximately 17% as Neutral (i.e., none of the others).

The aforementioned result indicates that the jump solving puzzle part of our game is well designed and implemented in a familiar way; i.e., in a way that our test participants are familiar with from other games with similar puzzle elements.


5.1.4.1 Good

Figure 15.1: The pie chart shows the category distribution for the rank Good. Note that we in this chart only consider the observational data obtained from our 14 test participants during their evaluation of the jump solving puzzle part of our educational game.

Figure 15.1 shows what the Good ranks, obtained during the evaluation of the jump solving puzzle part of our game, actually are “made up of”. Specifically, figure 15.1 shows that out of the 67.6% obtained Good ranks, approximately 39% went to design, 26% went to educational, 22% went to gameplay, and 13% went to aesthetics. We observe that our test participants viewed this puzzle as having higher educational value than its cube solving puzzle counterpart. One possible explanation for this is that our jump solving puzzle is designed as a challenge level (or a so-called “boss” level in video game terminology) with a challenging problem to solve, aiming at testing and challenging a player’s newfound skills. Moreover, this could possibly also explain our obtained result for the design aspect, since our test participants expressed that they liked the design of the challenge in the jump solving puzzle part of our game.

Regarding the gameplay aspect, we observe that our test participants mostly liked that there was no so-called penalty system implemented in the jump solving puzzle part, which allowed them to solve the jump puzzle without fearing any major repercussions (except falling off the platform and being picked up by our NPC Sture).

Finally, when it comes to the aesthetic aspect of our jump solving puzzle part, we obtained some minor comments like “nice colors” and “cool animation for completing the jump solving puzzle”.

Below we provide some representative feedback and remarks, and also observed behavior and interaction, categorized as Good, obtained from our test participants.


Design:
 Challenging puzzle.
 Fun puzzle.
 It is clear what to do.
 There is a freedom to test around without being punished.

Educational:
 Good explaining of the math.
 Challenging math in a good way.

Gameplay:
 No penalty system.
 Sture saves you when falling off the platform.

Aesthetics:
 Nice colors when jumping on the platform.
 Nice completion animation for the door leading to the next level.
 Nice animation of the plus/minus model in the middle of the level.

5.1.4.2 Neutral

Figure 15.2: The pie chart shows the category distribution for the rank Neutral. Note that we in this chart only consider the observational data obtained from our 14 test participants during their evaluation of the jump solving puzzle part of our educational game.

Figure 15.2 shows what the Neutral ranks, obtained during the evaluation of the jump solving puzzle part of our game, actually are “made up of”. Specifically, figure 15.2 shows that out of the 17.6% obtained Neutral ranks, approximately 83% went to gameplay and 17% went to educational.

The reason for the high percentage being gameplay related is the following: we observed that 5 out of our 14 test participants (approximately 36%) stopped and looked at the numbers on the platforms before jumping on them. When we asked these participants to explain their actions in a post-interview, some of them answered that they wanted to calculate the correct jump path to solve the puzzle without jumping around and guessing. This indicates that some of our test participants actually viewed this part of the game differently and added another layer of rules in order to make the gameplay a bit more challenging. As for the educational feedback on this part of our game, one of our test participants expressed it as follows: “It isn’t the puzzle that is bad, it is just my past knowledge that is failing me right now”.

5.1.4.3 Bad

Figure 15.3: The pie chart shows the category distribution for the rank Bad. Note that we in this chart only consider the observational data obtained from our 14 test participants during their evaluation of the jump solving puzzle part of our educational game.

Figure 15.3 shows what the Bad ranks, obtained during the evaluation of the jump solving puzzle part of our game, actually are “made up of”. Specifically, figure 15.3 shows that out of the 14.7% obtained Bad ranks, 80% went to design and 20% went to gameplay. Regarding the design aspect, we observe that 4 of our 14 test participants (approximately 29%) found the design of the jump solving puzzle annoying to complete, and indeed they mostly complained when falling off the platforms or when undershooting (or overshooting) the solution number. Furthermore, the complaints related to the gameplay aspect for this part were only about the placement of the scoreboard in this level; it actually became hard to see when the player was too close. However, note that the most interesting part of our obtained results here is the absence of any comments regarding the educational aspect of the jump solving puzzle part of our game. This could possibly indicate either that the educational aspect is well implemented for this part of our game, or that our test participants did not view this aspect as that special, perhaps since it has already been seen in other games.


5.2 Survey results and qualitative data analysis

Our obtained survey results are divided into two parts. The first part presents information relating to our sample, such as age and gender demographics, average hours spent playing video games per week, whether our test participants have played other educational games before and what their perception of these games is, and completion times for the educational game. The second part presents our obtained results on how our test participants perceived our proposed educational game, with questions about their perception of the educational part of our proposed game, how interesting our educational game was for them overall, what genre our educational game could be categorized as, and whether they would like to use similar games as an aiding tool for their future studies and, if so, whether this is because of knowledge renewal or just for the fun of playing.

Our handed-out survey contained two open-ended questions, which allowed the participants to explain in their own words how they perceived the educational part of our proposed educational game and whether they would consider using a similar game in the future. Both of these questions were coded for analysis in a fashion similar to the analysis of our obtained observational data in 5.1 above. Note that the main purpose of these two questions is to either support or contradict our observational analysis (see 5.1).

5.2.1 The test participants’ perceptions of the educational part of our game

The bar chart in figure 16 below shows the coded answers (i.e., coded either as Good or Bad) on the posed open-ended survey question “Did the educational part of the game feel good or bad (please explain)”.

Figure 16: This bar chart illustrates how our test participants perceived the educational part of our proposed game.

Note that the main purpose of the aforementioned question is to isolate our test participants’ perceptions on the educational part of our proposed game; this in order to later on examine which parts they actually consider to be good or bad.

From figure 16 above we observe that approximately 77% of the coded answers could be interpreted as Good, whereas 23% of the coded answers could be interpreted as Bad. This result shows that our test participants mostly had positive things to say about the educational part of our proposed educational game.

In a similar fashion to our analysis in 5.1 we will now further divide our obtained answers coded (or ranked) as Good (or Bad, respectively) into three different categories, namely the design aspect, the educational aspect and the gameplay aspect; this in order to find out which aspects of the educational part of our game the test participants actually liked (or disliked).

5.2.1.1 Good

Figure 16.1: The pie chart shows the category distribution for the obtained answers coded (or ranked) as Good. Note that we in this chart only consider our test participants’ perceptions on the educational part of our proposed game.

Figure 16.1 shows what the Good ranked answers, obtained from our survey study, actually are “made up of”. Specifically, figure 16.1 shows that out of the 77.3% obtained Good ranked answers, approximately 47% went to the design, 29% went to the educational, and finally 24% went to the gameplay.

From figure 16.1 we can see that what our test participants liked most was how the educational part was designed, which indicates that our design choices regarding the educational part of the game mostly hit their mark with the participants. Below we provide some representative feedback and remarks, categorized as Good, obtained from our survey study.

 The game allowed the player to fail in order to learn how to succeed.
 It was clear on what to do and engaging.
 Good since it had simple math questions.


 It felt good, you could really feel that you were advancing through the game.
 In a good way, you do not feel that you are solving math.
 Good, because of how figuratively the game explained the math solving, it was all in front of you.

5.2.1.2 Bad

Figure 16.2: The pie chart shows the category distribution for the obtained answers coded (or ranked) as Bad. Note that we in this chart only consider our test participants’ perceptions on the educational part of our proposed game.

Figure 16.2 shows what the Bad ranked answers, obtained from our survey study, actually are “made up of”. Specifically, figure 16.2 shows that out of the 22.7% obtained Bad ranked answers, approximately 38% went to design, 37% went to educational, and finally 25% went to gameplay. Below we provide some representative feedback and remarks, categorized as Bad, obtained from our survey study.

 You need to have some previous math knowledge.
 It was not challenging.
 The math was too easy.
 It felt educational, but you did not learn a lot since you did not reflect on what you just did. Instead you played the game out of sheer enjoyment of what the game had to offer. So if there would be a way to reflect on what you just did, maybe you could learn more out of solving the puzzles.

Note that the last remark from our test participants is a really interesting one, since we had not thought about the importance of reflection in the learning process, i.e., how we could implement a design pattern facilitating reflection among the players in our gameplay. This aspect will definitely need to be taken into account in future implementations of our educational game.


5.2.2 Would our test participants use a similar game?

The pie chart in figure 17 below shows the distribution of the obtained answers (partly coded either as Yes, No or Maybe) on the posed open-ended survey question “Would you consider using a similar game in the future as a helping tool for your: studies, knowledge renewal or as a fun thing to play? (Please explain.)”

Figure 17: The pie chart shows the distribution of the obtained answers for open- ended question: Would you consider using a similar game in the future as a helping tool for your: studies, knowledge renewal or as a fun thing to play? (Please explain).

Note that the pie chart in figure 17 above shows whether an educational game like ours is of interest among our test participants, i.e., whether they would consider using educational games like the one we propose.

Figure 17 shows that 80% of our test participants would consider using a similar game, and 20% would maybe use a similar game, if certain conditions are met. This result shows that there is indeed an interest among our test participants in using a game similar to our proposed educational game, which indicates that the design of our educational game (and similar ones) is desired by our test participants.

In a similar fashion to our analysis in 5.1 and previous sections, we will now further divide the three categories (i.e., the aforementioned Yes, No and Maybe categories) into four subcategories identified by us, namely “utility”, “fun thing”, “educational”, and “design and interaction”, in order to explain why our test participants would consider using an educational game like our proposed game. Note that these four subcategories were identified as representative of the obtained answers to the aforementioned question.


5.2.2.1 Yes

Figure 17.1: The pie chart shows the category distribution for the affirmative replies on our posed survey question “Would you consider using a similar game in the future …”

Specifically, figure 17.1 shows that out of the 80% obtained Yes answers,

 35% went to Utility (i.e., would use it to renew knowledge about a certain subject),

 29% went to Fun thing (i.e., would use it as an entertainment product),

 18% went to Educational (i.e., would use it to learn a new subject),

 18% went to Design and interaction (i.e., would use it to enhance mental and spatial abilities).

Below we provide some representative feedback and remarks on the question why our test participants would consider using a similar game in the future:

 Yes - It is a good way to enhance your spatial capacity.
 Absolutely. The game is fun, and I learn the most when I have fun.
 It would be fun to have a similar game like this one, not just as a fun thing but also as an aid to my studies. Because you actually learn things by using the game.

 Yes - Since the interaction with the math task had a good mental and practical combination.
 Absolutely - It appeals to both children and adults.

5.2.2.2 Maybe

Figure 17.2: The pie chart shows what conditions our test participants have before they would consider using a similar game.

Specifically, figure 17.2 shows that the reasons why our test participants would “Maybe” consider using a similar game are:

 50% Customizable: The player should have the ability to add other subjects into the game.

 25% More advanced: The game should be able to cover more advanced levels of a certain subject.

 25% Cover other subjects: The player should have the ability to choose between an array of different subjects, such as physics or chemistry.

By analyzing the aforementioned data, we learn that some of our test participants would like some kind of customization feature in order to have better control of what they want to learn, and also the ability to change the difficulty level of the game. These two feature requests will be taken into account in future implementations of our educational game.

Finally, our survey also contained two close-ended questions: one asking our test participants to name three video game genres they usually enjoy playing, and another asking our test participants what genre our educational game could be categorized as. Below we provide the obtained results for the aforementioned two questions.


5.2.3 Usually played game genre

Figure 18: The bar chart shows what 3 genres of games our test participants usually play. For a description of the different game genre abbreviations above, see Appendix B.

The chart in figure 18 above shows the number of times a certain genre was mentioned by our test participants. Note that this result is good to know in order to see what gaming background our test participants have.

As mentioned above, one purpose of the first survey question is to understand what kind of gaming background our test participants come from. Moreover, we also wanted to investigate whether our design choice of playing the game from a first-person perspective adhered to the majority of usually played game genres. From figure 18 we see that approximately 57% of our test participants actually mentioned playing shooters from a first-person perspective, abbreviated FPS (First-person shooter) in figure 18.

5.2.4 How would you categorize (w.r.t. genre) our game?

The main purpose of this question is to see through which game genre lens they (i.e., our test participants) viewed our proposed game; this in order to help developers better classify our developed game in future implementations. The obtained result shows that our proposed game was viewed mostly as an educational puzzle platform game played from a first-person perspective, which actually fits the view we had while developing our educational game.


6. Quantitative results and analysis

A statistical analysis of our collected quantitative data will be performed by calculating the descriptive statistics of the different questions posed in our survey; this in order to investigate the general tendencies in the collected data (i.e., by calculating the mean, maximum, and minimum). Moreover, we will also calculate the spread of the score data with the variance, standard deviation and range, and out of this information we can later on describe the variables as independent, dependent or control variables. Some of the independent variables will be compared by calculating their inferential statistics, in order to answer some of our aforementioned hypotheses and our research question.

6.1 Some basic definitions

We start by providing some basic definitions in probability theory and statistics which will be used throughout our analysis of the data.

6.1.1 Arithmetic mean

Given a set $\{x_i\}$ of $n$ values, the arithmetic mean $\mu$ of the set is calculated as follows:

$$\mu = \frac{1}{n}\sum_{i=1}^{n} x_i$$

6.1.2 Variance

Given a set $\{x_i\}$ of $n$ values and a (sample) mean denoted by $\mu$, the (sample) variance $\sigma^2$ is calculated as follows:

$$\sigma^2 = \frac{1}{n}\sum_{i=1}^{n}(x_i - \mu)^2$$

6.1.3 Standard deviation

The standard deviation $\sigma$ of a set of $n$ values (with (sample) variance $\sigma^2$) may then be calculated as follows:

$$\sigma = \sqrt{\sigma^2}$$
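To make the definitions above concrete, the following minimal Python sketch (using hypothetical placeholder values) computes the descriptive statistics reported in the tables of this chapter. Note that SPSS-style tables typically report the sample variance with an n - 1 denominator, so the sketch exposes both that form and the 1/n form defined in 6.1.2.

```python
import math

def describe(values, sample=True):
    """Descriptive statistics for a list of numeric values.

    sample=True divides by (n - 1), which is what SPSS-style tables usually
    report; sample=False follows the 1/n definition given in section 6.1.2.
    """
    n = len(values)
    mean = sum(values) / n                                         # 6.1.1 arithmetic mean
    denominator = (n - 1) if sample else n
    variance = sum((x - mean) ** 2 for x in values) / denominator  # 6.1.2 variance
    std_dev = math.sqrt(variance)                                  # 6.1.3 standard deviation
    return {
        "N": n,
        "Range": max(values) - min(values),
        "Minimum": min(values),
        "Maximum": max(values),
        "Mean": mean,
        "Std. Deviation": std_dev,
        "Variance": variance,
    }

# Hypothetical example values (e.g. ages of a small test group).
print(describe([18, 21, 22, 25, 31, 48]))
```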


6.2 Sample result and analysis

As mentioned previously, the main aim of the first questions of our survey was to gather information about our test participants (i.e., our sample). Note that the test participants of our study consisted of only male participants (i.e., 100% male in the gender demography), and that all of them have played some sort of video game before (see figures 19 - 20 below).

[Figures 19 and 20: pie charts “Age demography of the sample” (age groups 16-20, 21-25, 26-30 and 31-35 or older; reported shares 50%, 29%, 14% and 7%) and “Have played educational games before” (13 played, 1 not played).]

Figure 19: The pie chart shows the age distribution of the test participants of our study. Note that our test participants are divided up and categorized into 4 age groups; this in order to maintain 4 age groups without saturating the data. Also note that this result will be a variable when analyzing relationships with other variables.

Figure 20: The pie chart shows the distribution of our test participants’ earlier experience with educational games. Note that our test participants had to choose between answering either “Yes” (i.e., played an educational game before) or “No” (i.e., not played any educational games before). Note that this question is related to another question which asks what perception our test participants have about educational games in general. Hence, this question is a gatekeeper which opens up if answered “Yes”.

We begin by analyzing the age demography (depicted in figure 19 above) of our test participants, see figure 21 below.

                      N    Range   Minimum   Maximum   Mean      Std. Deviation   Variance
Age                   14   30.00   18.00     48.00     24.8571   7.50238          56.286
Valid N (listwise)    14

Figure 21: Descriptive statistics of the age demography of our test participants.


Out of the aforementioned descriptive statistics we see that the range of ages is 30, since our youngest test participant is 18 years old (the minimum) and our oldest test participant is 48 years old (the maximum). Moreover, note that the calculated mean value of our sample group is approximately 25 years, and that the age spread of our sample group around the mean is approximately ±7.5 years.

6.3 Average number of hours spent playing video game per week

The average number of hours spent playing video game per week among our test participants is divided up and categorized as shown in figure 22 below.

[Figure 22: bar chart “Average amount of hours spent playing video games per week”, with the categories 0-10, 11-20, 21-30, 31-40, 41-50 and above; the reported shares are 29%, 21%, 21%, 21% and 8%.]

Figure 22: Bar chart that shows the average number of hours spent playing video games every week among our test participants. Note that our obtained results are categorized into five groups. Also note that the aforementioned result will be used as a variable when analyzing relationships with other variables.

                                       N    Range   Minimum   Maximum   Mean      Std. Deviation   Variance
Average weekly hours spent
playing video games                    14   49.00   1.00      50.00     24.7857   16.60481         275.720

Figure 23: Descriptive statistics of the number of hours spent playing video games among our test participants.


Out of the aforementioned descriptive statistics we see that the range of the number of hours spent playing video games per week is 49, since one of our test participants plays video games one hour per week (the minimum) while another test participant plays video games for 50 hours per week (the maximum). Moreover, note that the calculated mean value of our sample group is approximately 25 hours, and that the spread of our sample group around the mean is approximately ±17 hours.

6.4 Completion time

In order to answer our aforementioned hypothesis “Have we managed to create a demonstration game that takes approximately 15-25 minutes to complete?”, we observed the time it took each of our test participants to complete our proposed game, and then plotted the obtained results (see figures 24 - 25).

[Figure 24: scatter chart of game completion time in minutes (Y-axis, roughly 16-37 minutes) per test participant (X-axis, participants 1-14). Figure 25: bar chart “Completion times” with the groups 15-25 min (79%), 25-35 min (14%) and 35-45 min (7%).]

Figure 24: This chart plots the game completion time (Y-axis) against each test participant (X-axis). Note that the aforementioned completion times were recorded during the observation phase. Also note that this result will be used as a variable when analyzing relationships with other variables.

Figure 25: This bar chart categorizes the observed game completion times of our test participants into three different groups. Moreover, note that the result depicted in this bar chart will be used to answer our hypothesis whether it takes approximately between 15-25 minutes to complete our proposed educational game.


                      N    Range   Minimum   Maximum   Mean      Std. Deviation   Variance
Completion time       14   21.00   16.00     37.00     21.0714   5.92860          35.148

Figure 26: Descriptive statistics of the game completion times among our test participants.

Out of the aforementioned descriptive statistics depicted in figure 26 above, we see that the range of the completion times is 21 minutes, since the completion time for one of our test participants is 16 minutes (the minimum) while the completion time for another test participant is 37 minutes (the maximum). Moreover, note that the calculated mean value of our sample group is approximately 21 minutes, and that the spread of our sample group around the mean is approximately ±6 minutes.

Note that the aforementioned result confirms the hypothesis that we (virtually) succeeded in developing an educational game that takes approximately 15-25 minutes to complete. This follows from the observations that the aforementioned calculated mean value of our sample group is approximately 21 minutes, and that the spread of our sample group from the mean is approximately ±6 minutes.

Moreover, from figure 25 we observe that 79% of the collected completion times landed in the “15 - 25 minutes” category. Hence, we conclude that we actually managed to keep the 15-25 minute time window for a playthrough of our proposed demonstration game. Recall that the reason for collecting this data in the first place is that most demonstration games have a completion time of around 15 - 25 minutes.
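As a sanity check of the shares reported in figure 25, the sketch below bins a set of completion times into the three intervals used above. The times are approximate values read off the chart in figure 24, so treat them as an illustration rather than the exact raw data.

```python
# Approximate completion times in minutes, read off figure 24 (illustrative only).
times = [16, 17, 17, 18, 18, 18, 18, 18, 19, 20, 24, 27, 28, 37]

# Half-open intervals [low, high) so that each time is counted in exactly one group.
groups = [("15-25 min", 15, 25), ("25-35 min", 25, 35), ("35-45 min", 35, 45)]

for label, low, high in groups:
    share = sum(low <= t < high for t in times) / len(times)
    print(f"{label}: {share:.0%}")   # roughly 79%, 14% and 7%
```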

6.5 How clear was the educational part of our game?

The test participants of our study were also asked to rate how clearly the educational part of the game was presented; the rating went from 0 (unclear) to 10 (clear). The result of this question is shown in figure 27 below. Note that the main aim of this question is to examine whether our test participants understand the process and design of the educational part of our proposed game.


[Figure 27: bar chart “How clear was the educational part of the game?”; the obtained scores range from 5 to 10.]

Figure 27: This bar chart shows how clear the educational part of the game was perceived by our test participants, where 0 is unclear and 10 is clear. Note that the result depicted in the aforementioned bar chart will later be used to back up (or contradict) other variables in our study.

                      N    Range   Minimum   Maximum   Mean     Std. Deviation   Variance
Educational rating    14   5.00    5.00      10.00     7.5000   1.22474          1.500

Figure 28: Descriptive statistics for the test participants’ experienced clearness of the educational part of our proposed game.

Out of the aforementioned descriptive statistics depicted in figure 28 above, we see that the range of the scores is 5, since one of our test participants provided the score 5 (the minimum) while another test participant provided the score 10 (the maximum). Moreover, note that the calculated mean value of our sample group is approximately 8, and that the spread of our sample group around the mean is approximately ±1.2.

This result shows that most of our test participants understood the educational part of our proposed game, which indicates that the design of the educational part is clear enough.


6.6 How interesting was our proposed game?

The participants were asked to rate how interesting our proposed educational game is; this in order to examine whether our selected combination of design choices and aesthetic elements succeeded in capturing the test participants’ interest. The rating scale for this question is 0 (uninteresting) - 10 (interesting). The obtained result is shown in figure 29 below.

[Figure 29: bar chart “Interest score”; the obtained scores range from 6 to 10.]

Figure 29: The bar chart shows how “interesting” the concept of our game (as a whole) was perceived by our test participants, where 0 is uninteresting and 10 is interesting. Note that the result depicted in this bar chart will be used to back up (or contradict) other variables in our study.

                      N    Range   Minimum   Maximum   Mean     Std. Deviation   Variance
Interest rating       14   4.00    6.00      10.00     8.2857   1.32599          1.758

Figure 30: Descriptive statistics for the test participants’ interest in playing our proposed educational game.

Out of the aforementioned descriptive statistics depicted in figure 30 above, we see that the range of the scores is 4, since one of our test participants provided the score 6 (the minimum) while another test participant provided the score 10 (the maximum). Moreover, note that the calculated mean value of our sample group is approximately 8, and that the spread of our sample group around the mean is approximately ±1.3.

This result shows that our selected design choices and aesthetic elements actually succeeded in capturing the participants’ interest quite well. Hence, the aforementioned result supports our hypothesis “Did the choice of theme, design and gameplay style make the game unique and interesting for the participant?”


6.7 How unique was the game?

Our test participants were also asked to rate how unique they perceived our educational game to be; this in order to examine whether the idea around which the educational game was created is considered a unique one according to our test participants. The rating scale for this question is 0 (not unique) - 10 (unique). The obtained result is shown in figure 31 below.

[Figure 31: bar chart “Uniqueness score”; the obtained scores range from 6 to 10.]

Figure 31: The bar chart shows how unique the game as a whole was perceived by our test participants, where 0 is not unique and 10 is unique. Note that the result shows whether our test participants have seen (or played) an educational game similar to our proposed one. Note that our obtained result here will also be used later in order to back up (or contradict) other variables in our study.

                      N    Range   Minimum   Maximum   Mean     Std. Deviation   Variance
Unique score          14   4.00    6.00      10.00     8.7857   1.31140          1.720

Figure 32: Descriptive statistics for the test participants’ perceived uniqueness of our proposed educational game.

Out of the aforementioned descriptive statistics depicted in figure 32 above, we see that the range of the scores is 4, since one of our test participants provided the score 6 (the minimum) while another test participant provided the score 10 (the maximum). Moreover, note that the calculated mean value of our sample group is approximately 9, and that the spread of our sample group around the mean is approximately ±1.

The aforementioned result shows that our design choices and aesthetic elements actually succeeded in conveying a certain uniqueness about our game. Hence, this result actually supports our hypothesis “Did the choice of theme, design and gameplay style make the game unique and interesting for the participant?”


6.8 How do our test participants view other educational games versus our proposed educational game?

In order to answer our aforementioned research question, we now continue by investigating how our test participants experienced our game compared to their past experience with other educational games.

Specifically, we start by asking our test participants to rank their views about other educational games they have played before; we use the ranks Boring, Childish, Fun, Interesting and Challenging. See figure 34 below for the obtained result. Note that this question will assist us in answering the research question of our study.

Furthermore, the aforementioned ranks were quantified according to the following scoring system: Boring = 1, Childish = 2, Fun = 3, Interesting = 4 and Challenging = 5; this in order to establish the result for this question as a control variable to be compared with the result of “How the participants viewed our educational game” (see figure 36 below).
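As a small sketch of how this quantification can be carried out, the snippet below maps coded answers onto the scores above and computes the mean score; the list of answers is hypothetical and only demonstrates the mechanics of the coding.

```python
# Scoring system for the coded answers (Boring = 1, ..., Challenging = 5).
SCORES = {"Boring": 1, "Childish": 2, "Fun": 3, "Interesting": 4, "Challenging": 5}

# Hypothetical coded answers from a handful of participants (not the study data).
answers = ["Boring", "Childish", "Fun", "Boring", "Interesting"]

quantified = [SCORES[answer] for answer in answers]
mean_score = sum(quantified) / len(quantified)

print(quantified)    # [1, 2, 3, 1, 4]
print(mean_score)    # 2.2
```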

Before continuing on with the aforementioned analysis we posed the following so-called “gatekeeper” question to our test participants: “Have you played any educational games before?” (See figure 33 below for the obtained result).

[Figure 33: pie chart “Have played educational games before” (13 played, 1 not played).]

Figure 33: The pie chart shows the distribution of the obtained answers to the question whether our test participants had played educational games before. Note that our test participants had to choose between played (Yes) and not played (No). Moreover, this question is actually related to another question which asks our test participants about their perception of educational games in general. Hence, this question is a so-called “gatekeeper” question which opens up if answered affirmatively. We further note here that out of our 14 test participants, one of them had not played any educational game before, and therefore N = 13 (i.e., the aforementioned test participant is not included in this analysis).


6.8.1 Our test participants’ views of other educational games

Figure 34: The bar chart shows our test participants’ views and experience of other educational games.

                                          N    Range   Minimum   Maximum   Mean     Std. Deviation   Variance
How they viewed other educational games   13   4.00    1.00      5.00      2.3846   1.19293          1.423

Figure 35: Descriptive statistics for the test participants’ views and experience of other educational games.

Out of the descriptive statistics for the aforementioned control variable “our test participants’ views on other educational games” (see figure 35), we see that the range of the scores is 4, since one of our test participants provided the score 1 (the minimum) while another test participant provided the score 5 (the maximum). Moreover, note that the calculated mean value of our sample group is approximately 2, and that the spread of our sample group around the mean is approximately ±1.

This result shows that the majority of our test participants viewed other educational games as boring and childish.


6.8.2 Our test participants’ views of our proposed educational game

Figure 36: This bar chart shows how our test participants viewed and experienced our proposed educational game.

                                          N    Range   Minimum   Maximum   Mean     Std. Deviation   Variance
How they viewed our educational game      13   2.00    3.00      5.00      4.0769   0.86232          0.744

Figure 37: Descriptive statistics for our test participants’ views and experience of our proposed educational game.

Out of the descriptive statistics for the aforementioned variable “our test participants’ views on our proposed educational game” (see figure 37), we see that the range of the scores is 2, since one of our test participants provided the score 3 (the minimum) while another test participant provided the score 5 (the maximum). Moreover, note that the calculated mean value of our sample group is approximately 4, and that the spread of our sample group around the mean is approximately ±1.

This result clearly shows that our proposed educational game was viewed fairly well, and clearly better than other educational games that our test participants had played before; this answers our posed question “How did the participants experience our game compared to their past experience with other educational games?”


6.9 Inferential statistics

After calculating all the collected quantitative data and analyzing it in a descriptive manner, we can continue on and compare or relate two or more of these variables in order to answer some of the aforementioned hypotheses of this study.

The inferential statistic that will be used in order to determine whether there is a relationship between two or more variables is the Pearson product-moment correlation [40], which can be used for calculating both the strength of the relationship r and how significant the correlation is, expressed by the p-value (or alpha level). The significance level for this study is set to 0.05.
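As a sketch of how such a test can be run in practice, the snippet below uses scipy.stats.pearsonr to obtain both the correlation coefficient r and the two-tailed p-value, and compares p against the 0.05 significance level. The two input lists are hypothetical placeholders for a pair of variables such as age and completion time.

```python
from scipy.stats import pearsonr

# Hypothetical paired observations (e.g. age in years and completion time in minutes).
age = [18, 20, 22, 25, 27, 31, 48]
completion_time = [16, 18, 17, 20, 24, 28, 37]

r, p = pearsonr(age, completion_time)   # strength r and two-tailed p-value
print(f"r = {r:.3f}, p = {p:.3f}")

ALPHA = 0.05                             # significance level used in this study
print("significant" if p < ALPHA else "not significant")
```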

6.9.1 Inferential statistics: age versus completion time

“Does the age of the participant play a role in their overall performance and ability to complete the game?” The aforementioned hypothesis was included in order to see if the age of the participant played a role in understanding and completing our proposed educational game; this in order to examine whether the game became harder to understand for either older or younger participants.

                                          Completion time   Age
Completion time   Pearson Correlation     1                 0.725**
                  Sig. (2-tailed)                           0.003
                  N                       14                14
Age               Pearson Correlation     0.725**           1
                  Sig. (2-tailed)         0.003
                  N                       14                14

Figure 38: Correlations. **. Correlation is significant at the 0.01 level (2-tailed).

The results depicted in figure 38 above show that there is a very strong relationship between age and completion time, with an r value of 0.725 out of 1.0. The result is also statistically significant, with p = 0.003 for the correlation between the two variables. Hence, it is clear that the age of the participant plays a role in how fast our proposed educational game is completed.


6.9.2 Inferential statistics: average time spent versus completion time

“Is there a relationship between the participants’ average time spent playing video games per week and their completion time?” The aforementioned hypothesis was included in order to see if the participants’ gaming experience played any role in completing the game quicker; this in order to see whether our proposed educational game requires any past gaming knowledge in order to complete it in an acceptable time.

                                          Completion time   Average hours
Completion time   Pearson Correlation     1                 -0.233
                  Sig. (2-tailed)                           0.423
                  N                       14                14
Average hours     Pearson Correlation     -0.233            1
                  Sig. (2-tailed)         0.423
                  N                       14                14

Figure 39: Correlations.

The results depicted in figure 39 above show that there is a weak relationship of r = -0.233, where one variable, average hours, increases in size while the other variable, completion time, decreases in size. Hence, there is some truth in our aforementioned hypothesis, but the significance value, p = 0.423, is way above the accepted level of 0.05, and the correlation is therefore considered insignificant for the moment. The reason for the low significance may be the low number of participants in our sample, or the spread of the collected data regarding average hours per week.


7. Convergent analysis

As mentioned above, the analysis of the collected data in this study is performed by following the so-called convergence design process [37, 50], with the qualitative and quantitative data sets first analyzed separately and then merged together. The merging of the two aforementioned data sets will be performed either by comparing them side by side or by quantifying the qualitative data; this in order to either contradict or support the aforementioned research question and hypotheses of our study.

Specifically, in this chapter we will merge, relate or compare the two aforementioned data sets in order to answer the research question of our study:

 Is it possible to develop an educational math computer game that is both satisfying and interesting for the player to play?

The evaluation of the research question will be performed by scoring each result as either supporting/confirming with a value of 1, contradicting/disconfirming with a value of -1, or inconclusive with a value of 0. Later, the scores are summed together in order to calculate a so-called confirmation percentage for our aforementioned research question.
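A minimal sketch of this scoring scheme is given below; the score list corresponds to the fifteen comparisons walked through in the remainder of this chapter (+1 supporting, -1 contradicting, 0 inconclusive), and the confirmation percentage is simply the sum of the scores divided by their number.

```python
# Scores assigned to the compared results below:
# +1 = supporting/confirming, -1 = contradicting/disconfirming, 0 = inconclusive.
scores = [+1, +1, -1, 0, -1, +1, 0, +1, +1, +1, +1, +1, +1, +1, +1]

confirmation_percentage = sum(scores) / len(scores)   # 9 / 15
print(f"{confirmation_percentage:.0%}")               # -> 60%
```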

We begin by comparing, merging and relating the analyzed results below:

- 4.4.1 Educational part of our proposed game

We observe that out of the obtained (coded) answers on the posed question whether the educational part of our proposed game felt good, as much as 77% of the answers of our test participants could actually be ranked as “Yes”. Only 23% were ranked as “No”. Hence, this result supports our research question, thus +1.

- 4.4.1.1 Good

Recall that out of the obtained “Good” ranks, 48% went to the design part, 28% went to the educational part, and 24% went to the gameplay part. From this result we can see that our test participants enjoyed the design and the educational aspects of our proposed educational game the most. Hence, this result supports our research question, because it shows that the design of this part is well implemented, thus +1.

- 4.4.1.2 Bad

Recall that out of the obtained “Bad” ranks, 38% went to the educational part, 37% went to the design part, and 25% went to the game play part.


Moreover, out of this result we also observe that our test participants mainly disliked the educational and design aspects of the educational part of the game. Hence, this result actually contradicts our research question, thus -1.

Below we provide some of our test participants’ comments, ranked as “Bad”, on the educational part of our proposed game. Note that the comments below are selected due to their representativeness.

 It was not challenging.
 The math was too easy.
 It felt educational, but you did not learn a lot since you did not reflect on what you just did. Instead you played the game out of sheer enjoyment of what the game had to offer. So, if there would be a way to reflect on what you just did, maybe you could learn more out of solving the puzzles.

- 4.3.3.1 The cube solving puzzle (“Good”)

We observe that out of the 64% “Good” ranks for the cube solving puzzle, only 13% went to the educational aspect. This result indicates (as we discussed earlier in our analysis) that the educational element was not viewed as that special by our test participants, which follows from the fact that most of our test participants actually understand basic arithmetic. Instead, our test participants appreciated the puzzle element of this part of the game more.

Hence, this result supports our idea here that at least the design of the cube solving puzzle is well implemented. However, since only 13% of the “Good” ranks went to the educational aspect of the cube solving puzzle part of our game, we interpret this result as inconclusive, thus 0.

- 4.3.3.3 The cube solving puzzle (“Bad”)

We observe that out of the 22.2% “Bad” ranks for the cube solving puzzle, 38% went to the educational aspect. This result indicates that the mathematics was possibly perceived as too easy and unchallenging by most of our test participants. Note that this contradicts the research question, thus -1.

- 4.3.4.1 The jump solving puzzle (“Good”)

We observe that out of the 67.6% “Good” ranks for the jump solving puzzle, 26% went to the educational aspect and 39% went to the design aspect. This result indicates (as we mentioned earlier in our analysis) that the educational element of the jump solving puzzle part was perceived as more challenging than that of the cube solving puzzle. Since the jump solving puzzle was actually designed as a challenge level, this result confirms that the design is well implemented and that a significant share of our test participants also perceived the educational aspect positively here. Note that this result supports the research question, thus +1.

- 4.3.4.3 The jump solving puzzle (“Bad”)

We further observe that out of the 14.7% “Bad” ranks for the jump solving puzzle, 80% went to the design aspect and 0% went to the educational aspect. This result indicates (as mentioned earlier in our analysis) that the educational element is well implemented in the jump solving puzzle part of our game, which indeed is very positive. But the design was not perceived as that good by some of our test participants. However, since 80% of the “Bad” ranks went to the design aspect of the jump solving puzzle part of our game, we interpret this result as inconclusive at the moment, thus 0.

- 5.7.2 How did our test participants perceive our game?

The statistics show that the mean of how the participants viewed our game landed on the value 4, with a spread of approximately ±1 points. Out of this we can conclude that this result actually supports our research question, thus +1.

Now, instead of considering the following three questions separately, we will merge them together into a new variable denoted “the general score of the game”.

- 6.4: How clear was our proposed educational part?

From our obtained result on this question, we observe that the mean is 8, with a spread of approximately ±1 points. Hence, this result supports the research question, thus +1.

- 6.5: How interesting was our proposed educational game?

From our obtained result on this question, we observe that the mean is 8, with a spread of approximately ±1 points. Hence, this result supports the research question, thus +1.

- 6.6: How unique was the game?

From our obtained result on this question, we observe that the mean is 9, with a spread of approximately ±1 points. Hence, this result supports the research question, thus +1.

Out of the aforementioned three results, we observe that the mean of the new variable “the general score of the game” is (7.5 + 8.3 + 8.8)/3 ≈ 8.2, thus, +1.

- 4.3.1 The whole game aspect

We observe that 55% of the total collected observational data was ranked as “Good” feedback, behavior, remarks and interaction. Moreover, we also observe that 25% of the total collected observational data was ranked as “Bad” feedback, behavior, remarks and interaction. Out of these results we see that the majority of the total collected observational data was ranked as “Good”, and therefore, this result supports the research question, thus +1.

- 4.3.3 The cube solving puzzle part

The collected observational data shows that the cube solving puzzle part of our educational game was actually perceived quite positively among our test participants, with 64% of the feedback being ranked as “Good”, followed by 22% of the feedback ranked as “Bad”, followed by 13.9% of the feedback ranked as “Neutral”. These results show that we have managed to develop an overall well implemented game section in our proposed educational game, which confirms our research question, thus +1.


- 4.3.4 The jump solving puzzle part

The collected observational data shows that also the jump solving puzzle part of our proposed game was perceived quite positively among our test participants, with 68% of the feedback being ranked as “Good”, followed by 17% of the feedback ranked as “Neutral”, followed by 15% of the feedback ranked as “Bad”. These results indicate that the jump solving puzzle of our educational game is well designed and implemented in a way familiar to our test participants (i.e., similar in the sense that our test participants obviously have seen similar puzzle elements in other games), which actually is one of the primary goals when designing our educational game. Hence, this confirms our research question, thus +1.

Out of the aforementioned results we get, by summing up the aforementioned +1s, 0s, and -1s, a so-called confirmation percentage of 60% (i.e., 9/15), which confirms our research question “Is it possible to develop a computer game that is both satisfying and interesting for the player to play and educational in mathematics?” with a so-called medium probability.


8. Discussion

In our study, we have proved that it was possible for us to develop a computer game that is both interesting and satisfying and, at the same time, somewhat educational.

In particular, we succeeded in developing a game that is far bigger than what we have usually created before. Moreover, our obtained results also show that the majority of our design choices for the game hit their mark with the test participants, and this supports the notion that we are capable of creating games that are generally both well implemented and viewed positively by the player.

Specifically, the information obtained from our 14 test participants provided us with valuable knowledge regarding the positive aspects of our proposed game, but also regarding the negative aspects our test participants experienced when playing it. It is interesting to note that it was actually possible for us to extract some representative feedback (both positive and negative) from our test participants’ answers. In particular, the positive feedback about our proposed game mostly concerned the aesthetic aspects of the game, with praise for how “pretty” the animations and the colors were, but a lot of praise also went to the design aspect of our game, with comments such as “challenging” and “fun” puzzles. Another identified positive aspect was the freedom to move around and the absence of a penalty system when failing.

Summarizing the discussion above, we conclude that we have managed to implement the aforementioned design patterns (see chapter 4) in a reasonable way, as our proposed educational game was generally perceived as both interesting and satisfying by our test participants.

Moreover, the results of our study also show that the educational part of our proposed educational game needs some revision. This follows from our observation that the negative scorings were mostly attributed to the educational aspect of our developed game (see chapter 5). However, there is still some evidence that the design of our proposed game can be used for educational purposes since, as mentioned in the aforementioned analysis of our posed question “Would you consider using a similar game in the future as a helping tool for your: studies, knowledge renewal or as a fun thing to play? (Please explain)”, 80% of our 14 test participants would consider using a game similar to ours, and moreover, the remaining 20% would perhaps use a similar game if it were further customizable (i.e., if the player could add or remove questions ranging from computational to textual subjects, together with the ability to change the difficulty level of the game).

Regarding the generality of our obtained results, it is important to note that the data of our study was collected from a fairly small sample group of university students (i.e., our aforementioned 14 test participants) with an average age of 25 years. In this context, it is important to note that the intended target group of our proposed educational game is elementary school students (i.e., students aged 8 - 12 years old). However, as mentioned above, it was not possible to conduct our empirical study in the local schools.


Another important observation in this context is the lack of female participants in our sample group; this fact actually diminishes the validity of our obtained result since there is no data collected for about 50% of the intended users of our educational game.

The last important point is the size of our selected sample group, which sometimes resulted in problems when trying to establish relationships between two or more variables: the spread of the collected data gives a higher chance of insignificance, since there are not enough data points (i.e., test participants) to form a pattern.

Another thing that may impact our derived results (and their related analysis) is the fact that there are some missing elements in our developed game; for example, the voice of the NPC Sture is not yet implemented in our game, and neither is the final (or end) scene of our game. Hence, our obtained results only address the current version of our educational game, which is approximately 80% completed. It is an interesting open problem to implement the aforementioned missing elements in our educational game.

Recall that we initially made the decision not to develop our educational game for tablets and smartphones. Our main motivation behind this decision is the limitations the aforementioned devices have in both processing and graphical capabilities in comparison with a PC. Specifically, a PC is needed in order to develop a video game with a higher level of quality, such as our developed game. Note also that the aforementioned inspirational games also require a PC. Moreover, it is also easier to develop and implement the aforementioned design choices and patterns on a PC, since they were originally developed for PC video games and would need to be re-engineered to fit both smartphones and tablets.

Another reason for not developing our educational game for the aforementioned devices is how our game is presented and played, namely through the so-called first-person perspective and the control scheme used for this perspective, which is a combination of mouse and keyboard that the player uses to move around in our game. Note that the aforementioned design choice implies that further development towards the aforementioned devices would be hard to implement in a way that is perceived by the player as both interesting and satisfying. The aforementioned design choices were also originally developed for PC video games, with the first games implementing them being Wolfenstein 3D [51] and Doom [52], which are both revered as the “grandfathers” of the first-person perspective genre.


9. Conclusion and future studies

Our initial literature study shows that several earlier research studies conclude that games are a good motivational concept to be used in schools, and should somehow be implemented in order to better contribute to the learning process [13]. In particular, previous field studies and surveys clearly indicate that students highly desire to have some kind of game in schools as a helping tool for their studies [3, 4, 6, 7]. Hence, the aforementioned discussion warrants a study to investigate whether it is possible to develop an educational game in a fashion that is both interesting and satisfying for a player.

Moreover, the results of this study are considered sufficient to support the further development of our educational math game, with the main focus on first addressing the aforementioned identified problems regarding the design aspect of the educational part of the game, reevaluating some of the design choices and discarding others. Another important issue is also to focus on improving the educational aspect of the game, e.g., by implementing a higher level of mathematical problems to be solved by the player.

Regarding the implementation of further testing sessions of our educational math game, the main focus should be on including a larger number of test participants, with an even gender distribution. Furthermore, including more than one test sample, aiming at investigating possible differences between the sample groups, could be of interest in future evaluations of our educational math game.

The important thing to conclude from the results of our study is that even though the results came from a small (somewhat unintended) sample group, they still present valuable information about how our educational math game was perceived in general; in particular, they made it possible for us to distinguish several interesting positive and negative aspects of our educational math game, which is invaluable knowledge for future work within the research area under consideration.

In particular, it is also on our results that future studies and play testing will be based; this in order to first of all see whether one can actually manage to find solutions to the aforementioned problems of this version of our game. Specifically, it is an interesting open problem to investigate new design choices that can improve the general perception of our educational game among a set of test participants.

Below we summarize our obtained results, i.e., we provide the answer to the posed research question of our study and, at the same time, we either reject or confirm our aforementioned formulated hypotheses:

 Is it possible to develop an educational math computer game that is both satisfying and interesting for the player to play?

o Yes, but the educational part of our proposed educational math game needs some more work since the confirmation percentage is approximately 60%.


Hypotheses:

 Is there a relationship between the participants’ average time spent playing video games per week and their completion time?

o Our obtained results (see figure 39 above) show that there is a weak relationship of r = -0.233, where one variable (i.e., average hours) increases in size while the other variable (i.e., completion time) decreases in size. Hence, there is some truth in our aforementioned hypothesis, but the significance value, p = 0.423, is way above the accepted level of 0.05, and the correlation is therefore considered insignificant for the moment. One possible reason for the low significance may be the low number of participants in our test sample, or the spread of the collected data regarding average hours per week.

 Have we managed to create a demonstration game that takes approximately 15-25 min to complete?

o Out of the aforementioned descriptive statistics (see figure 26 above), we see that the range of the completion times is 21 minutes, since the completion time for one of our test participants is 16 minutes (the minimum) while the completion time for another test participant is 37 minutes (the maximum). Moreover, note that the calculated mean value of our sample group is approximately 21 minutes, and that the spread of our sample group around the mean is approximately ±6 minutes. Out of these results we confirm that we nearly succeeded in keeping the game between 15-25 minutes (i.e., 27 minutes is still an acceptable result).

 How did the participants experience our game compared to their past experience with other educational games?

o The comparison between how the participants viewed our game versus other games shows that our game was viewed better by the participants, with a mean value of 4 (out of 5 in total, see figure 37). This should be contrasted with the mean value of 2 for how our test participants viewed other educational games.

 Does the age of the participant play a role in their overall performance and ability to complete the game?

o Our obtained results (see figure 38 above) show that there is a very strong relationship between age and completion time, with an r value of 0.725 out of 1.0. These results are also statistically significant, with a p value of 0.003 for the correlation between the two variables. Hence, it is clear that the age of the participant plays a role in how fast our proposed educational game is completed.


 Did the choice of theme, design and gameplay style make the game unique and interesting for the participant?

o Here we observe that the calculated mean value of our sample group is approximately 9 (of 10 total), and that the spread of our sample group from the mean is approximately ±1. These results clearly show that our selected design choices and aesthetic elements actually succeeded in conveying a certain uniqueness about the game (see figure 32 above). Thus, our results support the aforementioned hypothesis and our posed research question, since the aforementioned results show that we have managed to develop an educational math computer game that was perceived by the test participants as highly interesting.


References

[1] PISA Results in Focus: What 15-year-olds know and what they can do with what they know, 2012 (http://www.oecd.org/pisa/keyfindings/pisa-2012-results-overview.pdf).

[2] Linderoth, J. Why gamers don’t learn more: An ecological approach to games as learning environments. University of Gothenburg; 2010.

[3] Gee, J.P. What video games have to teach us about learning and literacy? Palgrave Macmillan, New York; 2003.

[4] Gee, J.P. Good video games and good learning: collected essays on video games, learning, and literacy, University of Wisconsin-Madison; 2013.

[5] Rai, D & Beck E.J, Math Learning Environment with Game-Like Elements: An incremental approach for enhancing student engagement and learning effectiveness, Springer-Verlag Berlin, Heidelberg; 2012.

[6] Rowe J.P, Shores L.R, Mott B.W & Lester J.C, Individual differences in gameplay and learning: a narrative-centered learning perspective: 2010.

[7] Gee, J.P. Why Are Video Games Good For Learning? Tashia Morgridge Professor of Reading University of Wisconsin-Madison, Department of Curriculum and Instruction Department of Educational Psychology, 2005: http://www.academiccolab.org/resources/documents/MacArthur.pdf

[8] Imran A & Yusoff R. Empirical Validation of Qualitative Data: A Mixed Method Approach. 2nd AFAP International conference on entrepreneurship and business management, University Technology Malaysia, Kuala Lumpur, Malaysia, 2015 : pp. 389-396.

[9] Ernst van Aken, J. & Romme, G. Reinventing the future: adding design science to the repertoire of organization and management studies. Eindhoven University of Technology, Eindhoven, the Netherlands, Organization Management Journal 6, 2009: pp. 5-12.

[10] Temple B & Young. A. Qualitative research and translation dilemmas. Qualitative Research, SAGE Publications London, Thousand Oaks, CA and New Delhi. vol. 4, 2004: pp. 161-178.

[11] Rai D. Math Learning Environment with Game-Like Elements and Causal Modeling of User Data. Submitted to the Faculty of the Worcester Polytechnic Institute in partial fulfillment of the requirements for the Degree of Master of Science in Computer Science; 2010.

[12] Van Eck, R. Digital Game-Based Learning: It’s not just the digital natives who are restless. University of North Dakota; 2006. http://er.educause.edu/~/media/files/article-downloads/erm0620.pdf

[13] Hevner, A.R., March, S.T., Park, J. & Ram, S. Design science in information systems research. MIS Quarterly Vol. 28, No. 1, 2004: pp. 75-105.

[14] Creswell J.W. 2009. Research design: Qualitative, quantitative, and mixed methods approaches -3rd ed. SAGE Publications. Los Angeles; 2009.

[15] Van Nes, F., Abma, T. & Jonsson, H. Language differences in qualitative research: is meaning lost in translation? Eur J Ageing (2010): pp. 313–316.

[16] Hunicke R, LeBlanc M & Zubek R. 2004. MDA: A Formal Approach to Game Design and Game Research. Game Design and Tuning Workshop at the Game Developers Conference, San Jose. http://www.cs.northwestern.edu/~hunicke/MDA.pdf

[17] Driscoll D.L, Appiah-Yeboah A, Salib P & Rupert D.J. Merging Qualitative and Quantitative Data in Mixed Methods Research: How To and Why Not. Ecological and Environmental Anthropology, University of Georgia. Paper 18. 2007 http://digitalcommons.unl.edu/icwdmeea/18

[18] Bazeley P. Issues in Mixing Qualitative and Quantitative Approaches to Research. Published in 2002.

[19] Kreimeier B. The Case for Game Design Patterns [Internet]. Gamasutra.com. 2002 [cited 20 March 2016]. Available from: http://www.gamasutra.com/view/feature/132649/the_case_for_game_design_patterns.php

[20] James Paul Gee homepage. Arizona State University. https://webapp4.asu.edu/directory/person/1054842

[21] Guba, E. G. The alternative paradigm dialog. Newbury Park: CA: Sage; 1990.

[22] Guba, E. G., & Lincoln, Y. S. Paradigmatic controversies, contradictions, and emerging confluences. In N. K. Denzin & Y. S. Lincoln, The Sage handbook of qualitative research, 3rd ed. Thousand Oaks: CA: Sage; 2005.

[23] Neuman, W. L. Social research methods: Qualitative and quantitative approaches, 3rd ed. Boston: Allyn & Bacon; 2000.


[24] Tashakkori, A., & Teddlie, C. Mixed methodology: Combining qualitative and quantitative approaches. Thousand Oaks: CA: Sage; 1998.

[25] Peffers, K., Tuunanen, T., Rothenberger, M. A., and Chatterjee, S. A. Design Science Research Methodology for Information Systems Research. Journal of Management Information Systems; 2007.

[26] Offermann, P., Levina, O., Schönherr, M., and Bub, U. Outline of a Design Science Research Process. In Proceedings of the 4th International Conference on Design Science Research in Information Systems and Technology. New York, NY, USA: ACM; 2009, ISBN: 978-1-60558-408-9.

[27] Official Portal 2 Website. Official Portal 2 Website. http://www.thinkwithportals.com/ (accessed 15 January 2016).

[28] Antichamber - A Mind-Bending Psychological Exploration Game. Antichamber-game.com. http://www.antichamber-game.com/ (accessed 15 January 2016).

[29] Amnesia: The Dark Descent. Amnesiagame.com. http://www.amnesiagame.com/#main (accessed 15 January 2016).

[30] Unity 3D. https://unity3d.com/ (accessed 15 January 2016).

[31] What is Unreal Engine 4? https://www.unrealengine.com/ (accessed 15 January 2016).

[32] Autodesk. Autodesk | 3D Design, Engineering & Entertainment Software. http://www.autodesk.com (accessed 15 January 2016).

[33] Blender. blender.org - Home of the Blender project - Free and Open 3D Creation Software. blender.org (accessed 15 January 2016).

[34] Quixel. Quixel. http://quixel.se/ (accessed 15 January 2016).

[35] Dropbox. Dropbox. https://www.dropbox.com/ (accessed 15 January 2016).

[36] MAXQDA: Qualitative Data Analysis Software | Windows & Mac. MAXQDA - The Art of Data Analysis. http://www.maxqda.com/ (accessed 15 January 2016).

[37] Creswell, John W. Educational research: planning, conducting, and evaluating quantitative and qualitative research, 4th ed.: Pearson; 2012.

[38] IBM - SPSS software - United Kingdom. Www-01.ibm.com. http://www-01.ibm.com/software/uk/analytics/spss/ (accessed 15 January 2016).

[39] Counter-Strike: Global Offensive. Counter-Strike: Global Offensive. http://blog.counter-strike.net/ (accessed 16 January 2015).


[40] R. Buber, J. Gadner, & L. Richards. Applying qualitative methods to marketing management research. UK: Palgrave Macmillan 2004, pp. 141-156.

[41] Sommerville, Ian. Software engineering, 9 ed.: Pearson; 2011, pp. 56-77.

[42] Wollack, E. J. WMAP’s Introduction to Cosmology [Internet]. Map.gsfc.nasa.gov. 2012 [cited 23 March 2016]. Available from: http://map.gsfc.nasa.gov/universe/

[43] Newton's Three Laws of Motion [Internet]. Csep10.phys.utk.edu. 2016 [cited 23 March 2016]. Available from: http://csep10.phys.utk.edu/astr161/lect/history/newton3laws.html

[44] Stephen Clarke-Willson. Applying Game Design to Virtual Environments (Digital Illusion, ACM Press, Vol. 2, Issue 1, January 1, 1998.)

[45] The Internet Classics Archive | Phaedrus by Plato [Internet]. Classics.mit.edu. [cited 28 March 2016]. Available from: http://classics.mit.edu/Plato/phaedrus.html

[46] Amnesia: The Dark Descent [Internet]. Metacritic. 2011 [cited 28 March 2016]. Available from: http://www.metacritic.com/game/pc/amnesia-the-dark-descent

[47] Antichamber [Internet]. Metacritic. 2013 [cited 28 March 2016]. Available from: http://www.metacritic.com/game/pc/antichamber

[48] Portal 2 [Internet]. Metacritic. 2011 [cited 28 March 2016]. Available from: http://www.metacritic.com/game/pc/portal-2

[49] Portal [Internet]. Metacritic. 2008 [cited 28 March 2016]. Available from: http://www.metacritic.com/game/pc/portal

[50] Bian H. Mixed Methods Research [Internet]. 1st ed. Greenville: East Carolina University; [cited 30 March 2016]. Available from: http://www.ecu.edu/ofe/research-statistics_consultant.cfm. http://core.ecu.edu/ofe/statisticsresearch/mixed%20methods%20new.pdf

[51] Wolfenstein 3D - 3D Realms - Firepower Matters [Internet]. 3D Realms. [cited 1 April 2016]. Available from: https://3drealms.com/catalog/wolfenstein-3d_25/

[52] DOOM for 3DO (1996) - MobyGames [Internet]. MobyGames. [cited 1 April 2016]. Available from: http://www.mobygames.com/game/doom/


Appendix: A

Test case nr:

Completion time:

Personal questions:

Male, female or other?:

Age?:

Background questions:

Have you played video games before?

-If yes

How many hours on average do you spend playing video games per week (approximation):

Which game genres do you usually play (name three)?

Have you played any educational games before?

-If yes

How did you find these educational games to be (Boring, Childish, Fun, Interesting, Challenging)?

-If no

Are you interested in playing an educational game?


Game questions.

What genre did you think the tested game is?

What did you think about the tested game (Boring, Childish, Fun, Interesting, Challenging)?

How did the educational part of the game feel? Good or bad (Please explain)?

On a scale of 0-10: How clear was the educational part of the game? (0: unclear – 10: clear):

On a scale of 0-10: How interesting was the game? (0: not interesting – 10 interesting):

On a scale of 0-10: How unique was the game? (0: not unique -10 unique):

Would you use a similar game in the future as a helping tool for your: studies, knowledge renewal or as a fun thing to play?

Thank you for your answers.


Appendix B

NPC - Non-Player Character: refers to characters in the game that cannot be directly controlled by the player during a game session; instead they use an AI that gives the impression of being controlled by a player.

AI - Artificial Intelligence: mainly used in computer science for mimicking human behavior or creating realistic courses of action.

Content - A collective name for multiple objects belonging to a specific area of a game.

Bug - In computer science, a “bug” refers to an unintended error in a piece of software.

Algorithm - A well-defined sequence of steps for performing a computation or solving a problem.

FPS - First Person Shooter.

RPG - Role-Playing game.

RTS - Real Time Strategy.

MOBA - Multiplayer Online Battle Arena.

MMORPG - Massive Multiplayer Online Role-Playing game.

Sim - Simulation

Platform (Genre) - Game that includes a lot of jumping around on platforms.
