UNIVERSITY OF HERTFORDSHIRE Faculty of Science Technology and Creative Arts

Modular BSc Honours in Computer Science

6COM0282 – Computer Science Project

Final Report April 2014

Artificially Intelligent Pac-Man

A M Waterhouse 10219812

Supervised By: Neil Davey

1 Andrew Waterhouse 10219812

Table of Contents

Attached below is a table of contents to aid navigation through this project. The project has been sectioned according to the steps of the System Life Cycle, starting with an introduction and abstract for the project.

ABSTRACT ...... 4
ACKNOWLEDGEMENTS ...... 4
1 INTRODUCTION TO THE PROJECT ...... 5
1.1 BACKGROUND OF NETLOGO AND INTELLIGENT AGENTS ...... 5
2. WHY DO WE PLAY GAMES? ...... 9
2.1 WHAT IS A GAME? ...... 9
2.2 WHY DO WE PLAY? ...... 10
2.3 MASLOW’S HIERARCHY OF NEEDS ...... 10
2.4 SELF-DETERMINATION THEORY ...... 11
2.4.1 Competence ...... 11
2.4.2 Autonomy ...... 11
2.4.3 Relatedness ...... 11
3. ANALYSIS ...... 12
3.1 INITIAL ANALYSIS ...... 12
3.2 HOW THE CURRENT GAME OPERATES ...... 12
3.2.1 User Interface ...... 12
3.2.2 Controls ...... 12
3.2.3 Gameplay ...... 13
3.3 DETAILED BREAKDOWN OF EXISTING SYSTEM ...... 15
4. REQUIREMENTS SPECIFICATION ...... 20
4.1 FUNCTIONAL REQUIREMENTS ...... 20
4.2 NON-FUNCTIONAL REQUIREMENTS ...... 20
4.3 WHAT OLD, EXISTING METHODS WILL BE TAKEN FROM OLD SYSTEM INTO THE NEW? ...... 21
4.4 OUT OF SCOPE ...... 22
5. FEASIBILITY STUDY ...... 23
5.1 ISSUES TO ADDRESS ...... 23
5.1.1 Tools ...... 23
5.1.2 Programming Language ...... 23
5.1.3 Hardware ...... 23
5.1.4 Time ...... 23
5.2 RECOMMENDATIONS ...... 23
5.2.1 Tools and Programming Language ...... 23
5.2.2 Hardware ...... 24
5.2.3 Time ...... 24
6. DESIGN ...... 25
6.1 OUTLINE OF WHAT IS EXPECTED ...... 25
6.2 HOW THIS WILL BE IMPLEMENTED ...... 25
7. IMPLEMENTATION ...... 26
7.1 RANDOM PAC-MAN ...... 26
7.1.1 Code Analysis ...... 26
7.1.2 Problems Encountered ...... 27
7.2 CORRIDOR PAC-MAN ...... 28
7.2.1 Code Analysis ...... 28
7.2.2 Problems Encountered ...... 28


7.3 LEFT PAC-MAN ...... 29
7.3.1 Code Analysis ...... 29
7.3.2 Problems Encountered ...... 29
7.4 MORE INTELLIGENT PAC-MAN ...... 30
7.4.1 Code Analysis ...... 30
7.5 VERY INTELLIGENT PAC-MAN ...... 33
7.5.1 Code Analysis ...... 33
7.5.2 Problems Encountered ...... 37
8. TESTING ...... 38
8.1 PURPOSE OF TESTING ...... 38
8.2 HYPOTHESIS ...... 38
8.3 TEST RESULTS ...... 39
8.4 OBSERVATIONS AND TESTING CONCLUSION ...... 41
9. EVALUATION ...... 43
9.1 DESIGN EVALUATION ...... 43
9.2 IMPLEMENTATION EVALUATION ...... 43
9.3 TESTING EVALUATION ...... 43
10. CONCLUSION ...... 44
10.1 TECHNICAL CONCLUSION ...... 44
10.2 PROJECT CONCLUSION ...... 44
APPENDIX ...... 45
11. INTRODUCTION TO PAC-MAN ...... 45
11.1 HISTORY OF THE GAME ...... 45
11.2 RULES ...... 49
11.3 SCORING ...... 50
11.4 CHARACTERS ...... 52
11.5 VARIATIONS ...... 53
11.5.1 Arcade ...... 54
11.6 PROJECT TIME MANAGEMENT GANTT CHART ...... 58
BIBLIOGRAPHY ...... 60
CREDITS ...... 60


Abstract

The aim of this project is to program an artificial agent that is capable of completing a Pac-Man game as though it were being played by a human. A Turing-style test will be at the core of evaluating whether the final product is capable of producing human-like results.

Within this report, a history of the classic 1980s game will be collated to provide an in-depth insight into what made Pac-Man so popular. An investigation into why the gaming industry is worth $93 billion [1] and why video games are popular in a broader context will also provide a backdrop to the start of the Pac-Man project.

Honest and constructive criticism of the project work will conclude the assignment once the final product is complete. The achievements will be judged against what I set out to do when I started the Pac-Man artificial intelligence project. A discussion of the findings and the learning experiences taken away from the module will also be included in a reflective conclusion.

Acknowledgements

Neil Davey was a great supervisor whilst this project was being undertaken. He provided his professional opinion on where the project was heading whilst always making himself available for advice.


1 Introduction to the project

The basic principle of this project is to take the classic 1980s-style Pac-Man game and put AI in control. No longer will the player have input into the system; instead, a variety of artificial intelligence techniques will perform the role the player would normally take. The intention is to create the program in a style that allows easy customization and parameter changes to suit the goals of the controller. For instance, one experiment could be to achieve the highest score possible (meaning eating as many ghosts as possible with the power pellet), while another could be to clear the level of Pac-Dots as quickly as possible.

If time allows, I would like to add two extensions to the project. The first would be different maps that could be tested and experimented upon, to see how the Pac-Man agent performs when the maze changes. The second is to reintroduce the player to the game, but playing as the ghosts instead: the AI would be left in total control of Pac-Man whilst it is the player's job to use the four ghosts to defeat the algorithm.

The intention is to include many of the artificial intelligence techniques learned over the years at the University of Hertfordshire. The project incorporates problem solving, where best-first search could make a judgment at each tick to determine the next position of Pac-Man. Conflicting goals will also play a vital role, as there has to be a balance between eating the Pac-Dots in the maze and keeping an eye on the enemy positions to figure out whether they are threatening. The perfect information available to a human player will also be available to the artificial agent running Pac-Man. This provides the best possible basis for the algorithm, where nothing in the world is left to chance.

The code will be written in NetLogo, an agent-based programming language for controlled simulations. In the analysis part of the project, I explore what the existing Pac-Man model can offer, as well as the expectations of the new system. I have created a Gantt chart that I will use to help with timekeeping and maintain a professional approach to the variety of work required for the project. An in-depth analysis of the Pac-Man game can be found in the appendix, where its history is meticulously reviewed and collated in one very interesting read.

1.1 Background of NetLogo and Intelligent Agents

NetLogo is a programmable modeling environment with a high-level language for controlling artificial agents. Created by Uri Wilensky in 1999, NetLogo has become ideal for simulating natural and social collective behavior. The language allows instructions to be given to hundreds of individual agents at once, whilst allowing each virtual agent to act independently. Many NetLogo simulations concern local interactions between agents and their world: the small, local behavior of agents often leads to a collective global behavior, which can be observed in the graphical user interface.

Teachers and students can use NetLogo, as its simple programming language translates into easy-to-view results on screen. HubNet caters for networking within the classroom, allowing students to connect to each other and each control one agent within a simulation. The interactive learning program is simple at its core, yet advanced enough to be a powerful tool for researchers of emergent behavior. Running on a Java Virtual Machine, NetLogo is cross-platform, meaning it runs on all major desktop operating systems. The free software is bundled with a collection of existing models whose parameters can be changed while a program runs, to view the global results of small changes to local interactions. The Models Library has a great collection of code from a variety of areas such as biology, social science and games. There is a vast community behind NetLogo, from students and teachers right up to scientists running the latest tests on cells and viruses. Users can upload their own models to the official website for the benefit of others, who can download them for free.

Unique to NetLogo is how it can represent on screen virtual simulations that would otherwise be impossible in the real world. One of the most popular models is Wolf Sheep Predation, where a variety of variables have profound effects on the results. For example, enabling grass (which the sheep must eat to gain energy) produces a stable system. Restarting the system with grass switched off (so the sheep effectively always have food available) eventually results in the sheep winning the fight against the wolves and surging in numbers.

Figure 1 – Wolf Sheep Predation Stable Simulation

What is most peculiar about NetLogo is how the results are never predictable, even with identical settings. Re-run with the same settings, there is also the possibility of the wolves enjoying too much food from the sheep population and eventually bringing the simulation to its end when no sheep are left: the wolves then die out from lack of food and the simulation ends. The stochastic approach to many of NetLogo's variables means local interactions can have a profound effect, and no two runs of a simulation are identical. The tweaks and options produce a vast array of outcomes in this model, something that would be impossible to test in the real world.

Figure 2 – Alternative Sheep Dominance Ending
Figure 3 – Empty World With Same Parameters

Shakey was the first robot to lay the groundwork for artificial intelligence: a general-purpose robot that could complete actions such as travelling through a series of corridors, turning light switches on and off, and moving objects in a closed, predictable environment. Programmed in LISP, Shakey could take a command typed in by a user and complete the task. It used its sonar sensors to perceive the environment and performed actions with its actuators and effectors to change its world.

The first intelligent agent was ELIZA, a program that showed how natural language processing could be computed. Written by Joseph Weizenbaum in 1964, the first chatterbot in existence used pattern-matching techniques to provide appropriate responses to questions asked by the user, engaging in conversation with human-like interaction. ELIZA remains to date a milestone in simulating human-to-human interaction and its ideas have been used in many programs, such as Ask.com and Apple's Siri service.

Definition of an intelligent agent: "An intelligent agent perceives its environment via sensors and acts rationally upon that environment with its effectors." [1]

Russell & Norvig are computer scientists well known for their contributions to artificial intelligence. They have defined five different types of agents against which artificial models can be classified.

Definition – Example

Simple Reflex Agents – use a simple 'if' condition to determine what action to complete. Example: a thermostat.

Model-Based Reflex Agents – host an internal model of the world and choose actions much as simple reflex agents do. Example: a robot vacuum cleaner.

Goal-Based Agents – add to model-based agents an understanding of how the robot's actions will affect its environment; use search to find the goal state and determine how close the agent is to the goal. Example: a robot waitress.

Utility-Based Agents – use probabilities and values to keep track of how 'happy' the agent is; rely heavily on input from sensors, representation and reasoning to determine how close to the goal the agent is; can consider more than one goal at a time. Example: an AI-driven car.

Learning Agents – have the advantage of operating in unknown environments, performing actions based on previous history and the outcomes the agent has learned; input values are critically analysed and knowledge is then used to make a decision; a problem generator function helps discover new experiences and find new goals. Example: handwriting recognition software.
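As a minimal illustration of the first category above, here is a sketch of a simple reflex agent for the thermostat example, written in Python for illustration (the project itself is written in NetLogo). The action depends only on the current percept through a single 'if' rule; the function name and the 20-degree threshold are purely hypothetical.

```python
def thermostat_agent(temperature, target=20.0):
    """A simple reflex agent: the action is determined solely by the
    current percept (the temperature reading) via one 'if' condition,
    with no internal model, memory or goal."""
    if temperature < target:
        return "heat-on"
    return "heat-off"

print(thermostat_agent(15.0))  # a cold room switches the heating on
```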


I intend to create a Simple Reflex Agent to begin with, as a baseline to test against. Model-Based Reflex Agents will then be created, with each program giving the Pac-Man agent a better model of the world and more complex rules about its movement. Goal-Based Agents would give Pac-Man an idea of where the ghosts are within the maze so it can react accordingly. A possible extension of the project could be a utility-based approach, where Pac-Man understands how 'happy' he is as he clears the maze of Pac-Dots.


2. Why do we play games?

2.1 What is a game?

There have been many theories about why people play games, but first a definition of what a game truly is has to be constructed. Chris Crawford, a famous game designer, proposed the following definition of what makes a game.

Figure 1 – Definition of a game

Interactive play → 1. Is there a goal? No: toy. Yes: challenge → 2. Are there any other players? No: puzzle. Yes: conflict → 3. Can you interfere with others? No: competition. Yes: game.

We start Figure 1's flowchart with an interactive plaything. The first question is whether there is a goal. If there is, the object in question is a challenge; if not, the plaything can be defined as a toy. If the object is a challenge, we next ask whether there are other players. If not, the challenge is a puzzle (like a Rubik's Cube). If there are other players, the plaything is a conflict. The final question is whether you can interfere with the other players. If you cannot, it is a competition (much like a 100 m race). To summarize, a game consists of a goal, other players and interaction with those players. This broadly encompasses all games, though there are anomalies such as Minecraft. Nevertheless, this flowchart will remain the generic definition of what is classified as a game in this project.

2.2 Why do we play?

Humans are not the only species that play games. In the wild, for example, tiger cubs have play fights with each other and monkeys play through the high forests of the jungle. This shows that nature develops our taste for fun and for playing games. The more interesting point, though, is that play is not there to prepare us for real-life challenges. The games creatures play cause an unnecessary expenditure of energy whilst risking injury, and do not really simulate real-life events or predators. As one example, cats' play fighting would not involve the same level of adrenalin and crucial decision making as a real fight against a predator. Instead of training creatures' bodies, theories suggest that games are about preparing the brain for real-life situations.

2.3 Maslow’s Hierarchy of Needs

Abraham Maslow's famous hierarchy of needs demonstrates how humans have set levels of needs to be fulfilled before they can be completely satisfied. The lower the level of need, the easier it is to fulfil, due to its clearer-cut nature. As needs are fulfilled at lower levels, the rules for fulfilling the higher ones become more complex. As an example, fulfilling physiological needs is simple: ensure that food, water and shelter are available. However, once you reach the Esteem level of the hierarchy, fulfilling achievement and the respect of others is much more complex.

Life can be thought of as one big game. In the upper levels of the hierarchy, feedback in real life comes long after you have completed an action or made a decision. 'Is my job good for what I want to do?' is one example where the feedback on such a question is long-term: our big life game fails to provide quick satisfaction. The rewards for completing daily challenges and tasks in life are often slow to come by, or are non-existent, and this is where games step in.

Figure 2 – Maslow's Hierarchy of Needs


2.4 Self-Determination Theory

The Self-Determination Theory (SDT) is an overarching theory of psychology used to model human motivation and our input into wider society. SDT focuses on individual behaviour and how people become self-motivated and determined. It is made up of three factors, all of which are satisfied by games.

2.4.1 Competence
Competence is the feeling of completeness, mastery and learning that people love. Within games, trophies and achievements on home consoles are an especially strong example of how this core need is fulfilled. Platform titles tend to be most popular with people who have a strong desire to satisfy this part of SDT, as the more you play them, the better you tend to become.

Figure 3 – Self-Determination Theory

2.4.2 Autonomy
Autonomy reflects players' love of having choice and control over their actions, making their own adventure of and adding their own customization to the game they play. Open-world titles like Skyrim provide great depth of customization, as each player has their own story to tell from playing the game. Having choice and affecting the game could be as simple as dialogue trees, where the player feels they have gone through a much more personal journey than if it had been the same for all players.

2.4.3 Relatedness
Multiplayer games have surged in popularity because of the way they satisfy this core need of SDT. Relatedness makes a player feel they have contributed to a community, and playing MMORPG titles like World of Warcraft can bring players together. Games are a social structure where players are brought together both in-game and in broader society.

Outside video games, sports do a great job of satisfying the three core needs defined by SDT. Being able to make an individual impact on the pitch, becoming ever better at a sport and joining teammates to work together to defeat opponents show the Self-Determination Theory standing up well. What is more interesting is how even watching and supporting others play satisfies the theory. Basking in Reflected Glory (BIRG) describes how supporters of a match can have their Maslow-style needs fulfilled simply by watching; Cutting Off Reflected Failure (CORF) is where fans distance themselves when their team loses, protecting themselves in terms of the two models of motivation defined earlier in the project.


3. Analysis

3.1 Initial Analysis

The initial analysis of how Pac-Man currently operates will be conducted in an artificial intelligence programming environment so I can look directly at the code being used. This will help me understand how each agent works within the existing code. NetLogo is the program I will use to examine how Pac-Man is currently programmed to run. The model features all of the functionality that players loved in the 1980s and is in fact bundled with every download of NetLogo in the Models Library.

3.2 How the current game operates
To analyse what is required of the project, I will first look in detail at the existing game in the NetLogo Models Library. The core components of Pac-Man will be broken down into easier-to-manage subsections to analyse the specifics of the game.

3.2.1 User Interface

There are a variety of tools that the user interface utilizes in the NetLogo version of Pac-Man. Monitors show the user the score, the level being played and the number of lives remaining. The score is updated on every tick, whilst the level number and lives remaining only change once the user either clears a level of its Pac-Dots or loses a life to a ghost. Buttons control the movement of Pac-Man and also allow the player to start a new game or pause the existing one. The buttons have also been linked to keyboard shortcuts to give the player of Pac-Man a more natural way of playing the game.

Figure 1 – NetLogo Pac-Man User Interface

Perhaps the most interesting part of the user interface comes from the only slider – Difficulty. The slider ranges from 0 to 7 in 1.0 increments and details of how it affects gameplay will be covered in the ‘Gameplay’ section of the analysis.

3.2.2 Controls
The controls for the game are four on-screen buttons. Up, down, left and right are assigned to I, K, J and L on the keyboard to enable more natural control of the Pac-Man agent. From playing the game, I found that these assignments would be more effective if changed to the arrow keys of a standard keyboard instead.


3.2.3 Gameplay
With the ability to start a new game or pause an existing one at the click of a button, the NetLogo version of Pac-Man opens itself up to show what it can really do. Once a game begins, it is clear that this is not a direct replica of the original game researched in the earlier part of this project. The ghosts appear to have been programmed differently: in the first rendition of the game, ghosts only move in one direction (until frightened when a Power Pellet is eaten). In NetLogo, the ghosts are able to change direction on the fly, which makes gameplay less exciting, as the player must also change direction depending on whether they are trying to eat or run away from the enemy. The ghost movement styles all appeared to be the same, and unlike how the ghosts were originally designed to move. This was most evident when the player left the controls and Pac-Man was left against a wall: the ghosts would take an extraordinarily long time to find Pac-Man and cause the player to lose a life. In the original game, it would not be long before the ghosts captured Pac-Man, especially given Blinky's targeting technique of aiming for the tile Pac-Man occupies.

Another small, niggling problem was that ghosts would come out of the ghost house still scared if they had just been eaten. This was never the case in the original game, as it would let the player simply wait outside the spawn point to rack up a high ghost-kill score. What I found most bizarre was that this exact critique is addressed in the 'Info' tab of the model as though the programmer had fixed it – "Once a ghost is eaten it will return to its base, where it is born again, immune to the Power-Pellet until Pac-Man can find a new one to consume".

The most evident problem with the gameplay was how laggy and slow the simulation felt. This could perhaps be a restraint on the amount of calculations NetLogo has to perform, or may hopefully be resolved with extra programming and tweaks.

Another off-putting factor I personally found was the inclusion of stars as a collectible. These star graphics appear randomly in areas of the maze for Pac-Man to collect for bonus points. Multiple stars can appear on the maze at once, which somewhat distracts the player from the basic goal of clearing the map of Pac-Dots. Stars have no effect on gameplay other than rewarding the player with extra points for their score; different coloured stars reward different points from 100 to 1000. If this game were to keep to the true original, it would be better to have the stars replaced by fruits, which would start at one point on the map, be able to move around the maze and eventually disappear if not collected.

Figure 2 – Bonus Stars That Reward Up To 1000

Difficulty explanation found in the Info tab: "DIFFICULTY controls the speed of the game. Lower numbers make both the ghosts and Pac-Man move slowly, giving you more time to react as you play."

I tested the difficulty slider whilst playing Pac-Man in the NetLogo environment. I found that it only made the game run faster, as it controls the speed of all agents. I would have preferred to see actual decision making influenced by this slider, rather than the player merely having to react more quickly to movement. What I loved about the game was the levelling system: as a maze is wiped of its Pac-Dots, the next level provides a different spin on things with a fresh design. This provides a great benchmark to use when I program an AI agent to control Pac-Man and observe how it performs in different level designs.

Figure 3 - Unique Maze Designs

Disappointingly, I found one programming error that makes the fourth level of Pac-Man impossible to complete. The runtime error appears at the start of the level and does not disappear even if the player tries to un-pause the current game. Figure 4 shows how the programming error appears.

Figure 4 – Runtime Error on Level 4

On first glance, the error appears patch-related, based upon the colour of the patch the ghosts are on.


3.3 Detailed Breakdown of Existing System

Creates the agents and assigns variables for each type of breed added to the world.

Global variables that determine many features of gameplay, such as whether the level is over or how many lives the player has at their disposal.

An observer button that creates a brand new game. Amongst many variables, the score is reset to zero and the first level is loaded, with the ghosts no longer scared and ready to begin the game.

An observer procedure to which the user has no direct access. Called by 'new', load-map reads CSV files containing the map specifics. Temporary variables declared with 'let' allow the current state of the game to be presented in the user interface. The if statement safeguards against instances where the fifth level is complete and a sixth, unavailable level would be selected to load: if the last level of the array is completed, this statement loads the first level of the collection.
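The wrap-around safeguard just described can be sketched in Python (the real procedure is NetLogo, and the file names here are invented for illustration): the level number simply indexes into the list of maps modulo its length.

```python
def next_map(level, maps):
    """Return the map file for the given level number, wrapping back to
    the first map once the last one in the collection is completed."""
    return maps[level % len(maps)]

# Hypothetical five-map collection, mirroring the five levels described.
maps = ["level-1.csv", "level-2.csv", "level-3.csv",
        "level-4.csv", "level-5.csv"]
```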


The runtime procedures are addressed next in the program. The first check in 'play' is whether the player is out of lives (which stops the simulation). Difficulty is set next, where it is clear the slider only affects how quickly both Pac-Man and the ghosts move. Every 0.25 seconds the update-bonuses procedure is called, which has a chance to create a new star-shaped bonus on the map.

A check is made on whether to reward the player with an extra life; the requirements for a bonus life increase as they are earned. If the Pac-Man agent is in a dead state, checks are made here on whether to end the game or present the user with an on-screen message about the state of the current game.

Home-pos is a field in the CSV pacmap file, used here to set the x and y co-ordinates of agents. The level-over method is called within play to continuously check whether all Pac-Dots have been collected; if so, the next level is loaded by adding 1 to the level parameter, which selects the next item in the list of levels. If all levels are completed, the speed is increased through a multiplication that makes the difficulty 1.6 times harder.

The move-pacman method will be replaced to make movement automated. In the current system, two variables are used for the heading/orientation: old-heading and new-heading. Old-heading retains the direction Pac-Man was facing in case the patch ahead is not black; new-heading is set by the directional interface procedures detailed at the end of the code. If the patch ahead of the heading is black, the agent moves forward one. Any pac-pellets on the patch moved to are consumed, and another call is made here to end the game if all pellets are collected. Move-pacman also animates the character by alternating its shape.
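As a rough Python analogue of the heading logic just described (the grid representation, function names and direction labels are my own, not the model's): the requested heading is tried first, the old heading is kept as a fallback, and the agent only steps forward when the patch ahead is open.

```python
def step(pos, heading):
    """One patch forward from pos in the given heading."""
    dx, dy = {"up": (0, 1), "right": (1, 0),
              "down": (0, -1), "left": (-1, 0)}[heading]
    return (pos[0] + dx, pos[1] + dy)

def move_pacman(pos, old_heading, new_heading, is_open):
    """Try the player's requested heading first; if the patch ahead of it
    is a wall, fall back to the old heading; only move when clear."""
    for heading in (new_heading, old_heading):
        if is_open(step(pos, heading)):
            return step(pos, heading), heading
    return pos, old_heading  # blocked both ways: stay put
```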


The first part of consume adds the score of any bonus on Pac-Man's patch to the current score; the bonus is then removed from the world through the die method. The pellets are of the same class but can have the power-up attribute. An if statement determines how to react if the pellet is a power-pellet: a score of 500 is added to the existing score, the scare level of the ghosts is set to 40, and the shape of the ghosts is changed to a 'scared' type. If the pellet is normal, only 100 points are added. At the end of consuming a pellet, it is also asked to die, removing it from the maze. For ghosts, a simple check is made for a ghost (that has not been eaten) on the same patch. If the ghosts are in their normal, aggressive state, Pac-Man loses a life through the 'dead?' function; otherwise, that ghost has its eaten variable set to true, its shape changed to the eyes and the score is increased by 500.
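The scoring rules described can be sketched in Python as follows. The constants (100 and 500 points, a 40-tick scare) come from the text above, while the function names and return shapes are illustrative only.

```python
def consume_pellet(score, scare, power_pellet):
    """Power pellets add 500 and scare the ghosts for 40 ticks;
    normal pellets add 100 and leave the scare timer unchanged."""
    if power_pellet:
        return score + 500, 40
    return score + 100, scare

def meet_ghost(score, lives, scared, eaten):
    """Meeting a ghost on the same patch: an already-eaten ghost (eyes)
    is harmless; a scared ghost is eaten for 500 points; an aggressive
    ghost costs a life. Returns (score, lives, ghost_now_eaten)."""
    if eaten:
        return score, lives, True
    if scared:
        return score + 500, lives, True
    return score, lives - 1, False
```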

Bonuses have a countdown variable which decreases by one every 0.25 ticks; once the countdown reaches 0, the bonus is removed from the maze.

If a ghost is in the eaten state, a check is made to see whether the patch it is on is grey (which indicates its spawn). If it is back at its base, the shape of the agent changes back to a ghost and its eaten state changes back to false, allowing it to leave the home by calling choose-heading and fd 1. If the ghost is in a scared state (because a power pellet has been consumed), the scared value is gradually reduced by 1 as the simulation progresses. Once this value is under 10, an animation of the scared ghost returning to a regular ghost appears. This is achieved with a simple if statement that alternates the sprite using 'mod 2', testing whether the scared value is divisible by 2 with no remainder. Once scared is reduced back to zero, the normal ghost shape is set and the if statement breaks.
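A Python sketch of the flicker logic just described. Which sprite appears on which parity of the 'mod 2' test is a guess on my part; the model only states that mod 2 drives the alternation below a counter of 10.

```python
def ghost_shape(scared):
    """Sprite for a ghost given its remaining scared counter: normal at
    zero, solidly scared above 10, and flickering (via scared mod 2)
    in between, as the countdown animation described above."""
    if scared == 0:
        return "ghost"
    if scared < 10:
        return "scared" if scared % 2 == 0 else "ghost"
    return "scared"
```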

First a call is made to clear-headings, which returns the patches the agent can move to. The ghost is not able to move back on itself, so the opposite of its current heading is removed. The algorithm then targets a grey patch and angles itself towards it. If there is only one direction the agent can go, it moves in that direction. If there is more than one possibility, either home-path or one of the items in the list is chosen.


Choose-heading removes all invalid moves for the ghost. If there is only one direction the ghost can go, it moves there. Otherwise, if there are two or more possibilities, it checks whether see-pacman returns true for each potential move; if so, that direction is set and the ghost moves there. If Pac-Man cannot be seen in any of the possible moves, a random direction is taken from those filtered into 'new-dirs'.
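The decision rule in choose-heading might be sketched in Python like this, with headings as compass degrees (as NetLogo uses) and a helper sees_pacman standing in for the see-pacman reporter; all names are illustrative.

```python
import random

def choose_heading(dirs, sees_pacman):
    """dirs: valid headings in degrees, reverse direction already removed.
    sees_pacman(d): True if Pac-Man is visible along heading d.
    One option: take it. Otherwise chase a heading where Pac-Man is
    seen; failing that, pick randomly from the filtered list."""
    if len(dirs) == 1:
        return dirs[0]
    for d in dirs:
        if sees_pacman(d):
            return d
    return random.choice(dirs)
```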

If the ghost is tracking Pac-Man and is not in a scared state, the ghost follows. If the ghost is heading in the same direction as Pac-Man but is scared, tracking is removed and a new direction is taken.

Clear-headings is a reporter method that returns a list of directions the ghost is clear to move towards. Each of the four patches surrounding the ghost is checked for a blue tile; patches that are not blue are recorded along with the heading in degrees required to reach them.

Opposite removes the opposite heading for ghosts, as they are programmed to move in one direction with no ability to reverse on demand.
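A Python sketch of the two helpers just described, with walls as blue patches and headings in NetLogo-style degrees (0, 90, 180, 270). The dictionary representation of neighbouring patch colours is my own invention for illustration.

```python
def opposite(heading):
    """The 180-degree reverse of a heading; ghosts may not turn back."""
    return (heading + 180) % 360

def clear_headings(neighbour_colour, current_heading):
    """neighbour_colour maps each heading to the colour of the adjacent
    patch. Blue patches are walls and are filtered out, and the reverse
    of the current heading is removed, as described above."""
    return [h for h in (0, 90, 180, 270)
            if neighbour_colour[h] != "blue"
            and h != opposite(current_heading)]
```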


A temporary variable 'saw-pacman?' is initially set to false. Whenever this method is run by a ghost, the ghost has to be on a black patch (to avoid pointless checks inside the ghost home). The method then simply checks for 'pacmans-here': if Pac-Man occupies the same patch, the variable is set to true. The value of saw-pacman? is returned by this reporter method.

Update-bonuses uses the next-bonus-in parameter as a countdown. Once next-bonus-in reaches 0, a new bonus is generated on a patch that has neither a bonus nor a pac-dot already on it. A separate check ensures that no agent is standing on the spawn location chosen for the bonus. If the patch is free, the bonus is generated with a random colour, and a random number between 50 and 250 is set for the next countdown.
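The countdown logic above can be sketched briefly. This is an illustrative Python condensation, not the Netlogo procedure: the state dictionary, free_patches list and spawn representation are assumptions; only next-bonus-in, the 50–250 countdown and the 100–1000 bonus values (mentioned later in the report) come from the text.

```python
# Sketch of the update-bonuses countdown: tick next-bonus-in down and, when
# it reaches zero, try to place a bonus on a patch with no bonus, no pac-dot
# and no agent standing on it.
import random

def update_bonuses(state, free_patches):
    """Returns the spawned bonus, or None if nothing spawned this tick."""
    state["next-bonus-in"] -= 1
    if state["next-bonus-in"] > 0:
        return None
    if not free_patches:              # chosen spot occupied: retry next tick
        state["next-bonus-in"] = 1
        return None
    patch = random.choice(free_patches)
    bonus = {"patch": patch,
             "value": random.randrange(100, 1001, 100)}  # 100..1000 step 100
    state["next-bonus-in"] = random.randint(50, 250)     # fresh countdown
    return bonus

state = {"next-bonus-in": 1}
bonus = update_bonuses(state, [(3, 4), (5, 6)])
```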

Basic functions behind the movement, each influencing the heading of Pac-Man.


4. Requirements Specification

In the game Pac-Man the player faces several different challenges, and the parameter setup should cope with this multitude of goals. To achieve the highest possible score, the AI would eat as many ghosts as it could whilst also collecting the fruit/bonuses that appear in the maze. Some players, however, prefer to focus on the primary goal of eating the Pac-Dots, and only eat ghosts if they are in the way or would be a threat if left alive. The artificial system should therefore have parameters that govern both how the AI treats the scoring aspect and how quickly it tries to complete the map.

I expect the system to cope with different maze styles, allowing the Pac-Man agent to clear each level of its Pac-Dots. This is fundamental to the most recent releases of Pac-Man, where mazes change as the player progresses. Supporting more than one maze style will also offer better flexibility and allow for more in-depth analysis of the results and experimentation on the new system.

There should be different Pac-Man programs created, each using a different technique to complete the game. This will provide good testing opportunities and the ability to compare the efficiency of each technique. During the production of the final intelligent Pac-Man system, I expect to produce multiple prototype models in increments, with the intelligence of the system gradually increasing.

4.1 Functional Requirements
Functional requirements are the foundation of the game: they define the basic functions of the deliverable.

• Variety of AI models – Multiple systems must be created to provide benchmarks and testing to determine which produces the best results.
• Customisation of game options – In models where an input can be customised (such as how to respond to bonuses), the user interface should offer options that can be changed to affect the gameplay.

4.2 Non-Functional Requirements
Non-functional requirements specify how well the system performs rather than whether it can perform at all. They are often used to test the overall operation of the system and how user-friendly it is.

• Documentation – Detailing each model and which pieces of code have been changed will help ensure maintainability for the future. Each method that has been created or changed should have appropriate documentation for future work by others.
• Reliability & performance – The AI models should not hinder the current performance of the game. Quality coding should guard against crashes to provide a reliable system.
• Testability – Each system should produce consistent results and feedback. This will help test different AI techniques against each other whilst ensuring everything else remains identical.


4.3 What existing methods will be taken from the old system into the new?
The existing Pac-Man system in Netlogo offers a great foundation on which to program the intelligent Pac-Man agent. The table below lists the methods from the old system that the new system will make full use of.

| Method | Use in new? (Y/N) | Reason |
| new | Y | Creates a new game and sets up a framework that can be used by the AI |
| load-map | Y | Sets up the map via an import of a csv file and the global variable values |
| play | Y | Regulates lives and when the game is over or the level is complete |
| move-pacman | N | The AI will control the movement and heading of Pac-Man |
| consume | Y | Allows pellets to be consumed, and handles bonus values, the power pellet power-up and ghost behaviour when eaten |
| update-bonuses | Y | Creates bonus pickups for Pac-Man and removes them from screen when eaten or timed out |
| move-ghosts | Y | The ghosts will not be interfered with, as they are out of scope |
| return-home | Y | A ghost procedure to direct eaten ghosts back to base |
| choose-heading | Y | Another ghost procedure, choosing the direction to head in when Pac-Man is detected |
| clear-headings | Y | Stops ghosts from entering the maze walls |
| opposite | Y | Used in the ghost procedures to determine the next heading |
| see-pacman | Y | Ghost procedure that detects whether Pac-Man is in the vicinity |
| make-bonus | Y | Bonus procedure that creates the equivalent of fruit from the original game; could be customized to change shape and how often they appear per level |
| move-up, move-right, move-down, move-left | Y | These will provide the basis of movement for Pac-Man; they will be called individually by the changes made in the move-pacman method |


4.4 Out of Scope
Billy Mitchell of the United States need not worry about his high score of 3,333,360 being under threat. Whilst achieving this would be fantastic, it is out of the scope of the project in the time available. The AI should not be expected to complete every single level and achieve world-record scores up to the 256th and final level. The program being designed should replicate behaviour patterns similar to a human player rather than be the best player in the world.

Also out of scope is changing the ghost behaviour. The goal is to produce an intelligent Pac-Man rather than an intelligent gaming system. Whilst this is outside the scope of this project, it could be an area of further work once the project is finished.


5. Feasibility Study The purpose of the feasibility study is to determine how to complete the project within the allotted time frame. It covers a wide range of issues that I will face when programming the game such as the programming language to choose and hardware required.

5.1 Issues to Address

5.1.1 Tools
In order to create an intelligent agent to control Pac-Man, I will need an Integrated Development Environment to program and customize the different models. Ideally, an Integrated Modeling Environment will be used, to better visualize the results and the emergent behavior of the agents within the maze. This will also allow better control of the agents and the rules they follow within their world (patches).

5.1.2 Programming Language
The programming language should be flexible and able to run easily on a range of operating systems, with a good collection of supporting documents and help pages to introduce me to new methods and calculations. The choice will be restricted to languages I am already familiar with, due to the time constraints on the project: with only fourteen weeks to complete it, there is not enough time to get caught up in complex programming problems with a new language.

5.1.3 Hardware
The hardware requirements for the project are relatively low, as the software should run on a standard PC setup. I will require a PC that can connect to the internet (to provide support during programming) and that has a monitor, keyboard and mouse.

5.1.4 Time I expect to need at least 4 weeks to complete the programming part of the project. I have gone into detail about the exact time requirements for every part of the project in the supplied Gantt chart in the appendix of the report.

5.2 Recommendations

5.2.1 Tools and Programming Language
I have had a vast amount of experience with an Integrated Modeling Environment, Netlogo. It allows great management of the world through calls to patches whilst being able to control different types of turtles easily. Emergent behavior is much more easily achieved in Netlogo than it would be in Java or a similar structured, object-orientated programming language.

5.2.2 Hardware
There is a vast range of computers throughout the University of Hertfordshire that I will use to program and develop the project. Combined with my personal computer at home, these offer fast processing and come with Netlogo pre-installed, providing seamless development.

5.2.3 Time
The fourteen weeks allotted should provide enough time to finish an intelligent Pac-Man system. Time-management tools like the Gantt chart will be key to organizing the project's progression through the system life cycle. Time is one of the external factors I have little control over, so I will manage mine to cater for any unexpected events or delays.


6. Design

6.1 Outline of what is expected
I expect to create multiple Pac-Man simulation models that rely upon local interactions with the world. The major reason for keeping interactions local is to avoid giving Pac-Man a global view or global control: a global view of the game would be too unfair on the ghosts in the original model, which are themselves based on local interactions. It will be more interesting to see how different algorithms perform with local interactions than to create one all-seeing model that never dies.

6.2 How this will be implemented
A variety of models will be created separately, based upon the existing code found in the Netlogo Models Library. The Netlogo programming environment is ideal for modeling different algorithms and comparing their performance: as a high-level language, it makes managing local interactions easy where it would otherwise be hard in lower-level languages (like Java or C++).

As a baseline, a Random Walking Pac-Man model will be created first for all future simulations to test against. It will provide a foundation for the other models to beat, and the measurables throughout the project will be the Pac-Dots cleared alongside the score achieved. Two measures are needed because of the way the Pac-Man game itself works: a player has primarily two goals, to achieve the highest score for the leaderboards and to progress as far as possible into the game to get the most for their money. Completing more maze designs in this context will mean that the program is adaptable and understands, to an extent, the primary goal of the game.

Achieving the highest score, however, is only partly related to the score achieved at the end of the level. Bonuses spawn into the game only where no Pac-Dot currently resides, and have a countdown timer before being removed from the map. Collecting these adds a bonus value to the score of between 100 and 1000 (in increments of 100). The remaining way Pac-Man can add to its score is by consuming ghosts while they are in a vulnerable state. Comparing the scores obtained by the models alongside the number of Pac-Dots eaten will therefore be the best measure of how well the algorithms perform in both aspects of the Pac-Man game.


7. Implementation
I intend to create different levels of intelligence across multiple versions of Pac-Man, allowing a progression of complexity. Each system will be tested later in the project to see how the different intelligence techniques perform in the game.

7.1 Random Pac-Man
The first artificially controlled Pac-Man game will provide a backbone against which all other systems can have their results tested. Pac-Man in this installment makes moves entirely at random. I expect the results to be unimpressive and, due to its unpredictability, perhaps to vary wildly compared to the other models.

7.1.1 Code Analysis

The move function was changed in a similar style to the ghosts' return-home function. First, two calls are made to 'consume' to fix a problem discussed in the next part of the project. Move-pacman utilizes the clear-pacman-headings method, similar to the one used by ghosts, to identify patches that are not walls. The list of patches returned by the reporter method is put into the temporary variable 'pac-dirs' and one is selected as the heading of Pac-Man. Once the heading is selected, the move method operates as normal from the original model.

A more specific version of clear-headings has been created for Pac-Man, because ghosts are allowed to move onto grey patches. This method reports only black patches, as these are the only ones Pac-Man can move onto. 'clear-pacman-headings' will be used in all future developments of the Pac-Man game too.
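The random-walk step built from these two pieces can be sketched as below. This is an illustrative Python condensation (the model is Netlogo); the patch dictionary is an assumption, while 'pac-dirs' and the black-patches-only rule come straight from the text.

```python
# Sketch of Random Pac-Man: clear_pacman_headings reports only black patches
# (Pac-Man, unlike the ghosts, may not enter the grey ghost home), and one of
# the resulting 'pac-dirs' is chosen at random as the new heading.
import random

OFFSETS = {0: (0, 1), 90: (1, 0), 180: (0, -1), 270: (-1, 0)}

def clear_pacman_headings(patches, x, y):
    """Only black neighbouring patches are valid moves for Pac-Man."""
    return [h for h, (dx, dy) in OFFSETS.items()
            if patches.get((x + dx, y + dy)) == "black"]

def random_move(patches, x, y):
    """Pick a heading at random from the valid ('pac-dirs') list."""
    pac_dirs = clear_pacman_headings(patches, x, y)
    return random.choice(pac_dirs)

# junction: black corridors north and east, grey ghost home west, wall south
patches = {(0, 1): "black", (1, 0): "black",
           (-1, 0): "grey", (0, -1): "blue"}
heading = random_move(patches, 0, 0)
```

Note how the grey ghost-home patch is excluded for Pac-Man even though a ghost's clear-headings would have allowed it.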


7.1.2 Problems Encountered

I found that the random walking Pac-Man would occasionally find itself on the same patch as a ghost but not lose a life. This was strange, as I had not changed any characteristics of this part of the game. Investigating the original, unchanged model showed the same problem was evident there: a "Pac-Man won't die" glitch. Pac-Man would not lose a life 100% of the time, especially when a ghost was moving in one direction and Pac-Man the other (colliding north vs south, for instance). The cause is that Pac-Man's speed differs from the ghosts' speed, determined by:

every 1.6 * (1 - difficulty / 10) [ move-ghosts ]
every (1 - difficulty / 10) [ move-pacman ]

Looking closely at the simulation, I realized the problem occurred with the 'consume' method: there was an opportunity for Pac-Man to walk through a ghost before a consume command was called.

The problem was that Pac-Man could move at a different rate to the ghosts and not be detected, because of when the consume command was called. To remedy this game-breaking problem with the original model, I made two separate calls to consume, both before and after the Pac-Man agent moved (Figure 1 - Pac-Man and Ghost Occupy Same Patch But No Life Lost). The consume method already covered both the case where Pac-Man was on a patch with a Pac-Dot and the case where it shared a patch with a ghost. This coupled, integrated method made it sensible to add more calls to consume from the move procedure, as it checks for a ghost more regularly whilst still consuming Pac-Dots normally.
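The shape of the fix can be sketched as follows. This is a minimal Python illustration of the before-and-after consume calls, not the Netlogo code; the dictionary representation of Pac-Man and the ghosts is an assumption for the sketch.

```python
# Sketch of the double-consume fix: consume is called both before and after
# Pac-Man's step, so sharing a patch with a ghost on either side of the move
# is caught rather than slipping between checks.

def consume(pacman, ghosts):
    """Lose a life if Pac-Man shares a patch with a ghost."""
    if pacman["patch"] in [g["patch"] for g in ghosts]:
        pacman["lives"] -= 1

def move_pacman(pacman, ghosts, new_patch):
    consume(pacman, ghosts)        # check before moving ...
    pacman["patch"] = new_patch    # ... then take the step ...
    consume(pacman, ghosts)        # ... and check again afterwards

pacman = {"patch": (0, 0), "lives": 3}
ghosts = [{"patch": (0, 1)}]       # a ghost on the patch Pac-Man steps onto
move_pacman(pacman, ghosts, (0, 1))
```

With only a single consume call after movement, a ghost that had moved onto Pac-Man's old patch during the same window could go undetected; the paired calls close that gap.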


7.2 Corridor Pac-Man
The second iteration of the Pac-Man model has the agent move along a corridor until he hits a wall, at which point a random direction is chosen from the valid patches he can move to. This can have the effect of Pac-Man reversing his direction, since the choice is made at random.

7.2.1 Code Analysis

If the patch colour ahead of Pac-Man is black, the agent moves forward whilst consuming and animating as usual. If another colour is detected one patch ahead of Pac-Man, a new direction is taken randomly. This has the effect of changing direction whenever Pac-Man meets a wall or ghost house entrance.
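The two-state rule just described can be sketched briefly. Python is used for illustration only (the model is Netlogo); the patch dictionary is an assumption of the sketch.

```python
# Sketch of Corridor Pac-Man: keep going while the patch ahead is black;
# otherwise pick a fresh random direction among the open (black) patches,
# which may reverse the agent.
import random

OFFSETS = {0: (0, 1), 90: (1, 0), 180: (0, -1), 270: (-1, 0)}

def corridor_step(patches, x, y, heading):
    dx, dy = OFFSETS[heading]
    if patches.get((x + dx, y + dy)) == "black":
        return heading                     # corridor continues: go straight
    # wall or ghost-house entrance ahead: choose randomly among open patches
    open_dirs = [h for h, (ox, oy) in OFFSETS.items()
                 if patches.get((x + ox, y + oy)) == "black"]
    return random.choice(open_dirs)

# dead-end to the east; open corridors north and west
patches = {(1, 0): "blue", (0, 1): "black",
           (-1, 0): "black", (0, -1): "blue"}
new_heading = corridor_step(patches, 0, 0, 90)
```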

7.2.2 Problems Encountered There were few problems with creating the Corridor Pac-Man. It was created so that there are primarily two states: to go forward when the patch ahead is black or to change direction otherwise. This resulted in the Pac-Man agent moving around the map as expected, only making choices to change direction when there was a blocking blue patch ahead.


7.3 Left Pac-Man
This iteration sees Pac-Man always take a left turn whenever there is a choice of direction. I'd expect this model to perform well up to the point where the agent gets stuck in an infinite loop around one particular section of the maze.

7.3.1 Code Analysis

I chose to program the movement of Left Pac-Man in a style that ‘filters’ different situations. First, if there is an option to go straight or left – pick left. If there is a wall ahead but an option to go left, take left. If there is no turn available, but there is a black patch ahead, continue forward. If none of these options are available a random heading is chosen based upon a slider in the user interface.

7.3.2 Problems Encountered
Once I had a functioning left-turn Pac-Man system, I had to decide what the agent should do when confronted with a wall and no left turn (Figure 2 - Trapped Left Turn Pac-Man). Should the agent always turn to the left, take a random heading, or always turn toward the patch that allows it to escape the wall? This conundrum was debated with the project supervisor to decide how to deal with situations where Pac-Man is trapped in a part of the maze. Originally, I had created a Left Pac-Man system that would turn 90 degrees to the left when there was no option to continue straight or take a left turn. This resulted in a quickly trapped agent whose results were always the same. To help Pac-Man break out of trapped areas of the map, a random heading was inserted into the code: a random heading is only taken when the system produces a random float less than the value of the 'randomness' slider on the interface, and when the random float is higher than the randomness value, no move is taken. I chose to incorporate this code to make the system more efficient without breaking away from its roots of preferring a left turn. Later in the project I will test which randomness value most efficiently produces a system that prefers a left turn without becoming too random when stuck.
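The left-preference filter with the randomness escape can be condensed into a sketch. Python stands in for the Netlogo code here; the patch dictionary and function names are assumptions, while the left-first ordering and the slider comparison follow the description above.

```python
# Sketch of Left Pac-Man: prefer a left turn, then straight ahead; when boxed
# in, take a random escape heading only if a random float falls below the
# 'randomness' slider value, otherwise make no move.
import random

OFFSETS = {0: (0, 1), 90: (1, 0), 180: (0, -1), 270: (-1, 0)}

def is_open(patches, x, y, h):
    dx, dy = OFFSETS[h]
    return patches.get((x + dx, y + dy)) == "black"

def left_step(patches, x, y, heading, randomness):
    left = (heading - 90) % 360            # Netlogo headings are clockwise
    if is_open(patches, x, y, left):
        return left                        # always prefer the left turn
    if is_open(patches, x, y, heading):
        return heading                     # no left turn: continue forward
    if random.random() < randomness:       # boxed in: maybe break out
        return random.choice([h for h in OFFSETS
                              if is_open(patches, x, y, h)])
    return heading                         # otherwise stay put this tick

# heading east at a junction where a left turn (north) is available
patches = {(0, 1): "black", (1, 0): "black",
           (-1, 0): "blue", (0, -1): "blue"}
chosen = left_step(patches, 0, 0, 90, randomness=0.3)
```

With randomness at 0 a boxed-in agent never escapes (the always-trapped behaviour described above), while at 1 it always breaks out immediately.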


7.4 More Intelligent Pac-Man
Unlike the previous iterations of Pac-Man, this model takes into account the position of enemies. If an enemy is detected, Pac-Man makes moves to avoid the oncoming ghost. Alongside the ghost avoidance, this system also contains the functionality to target ghosts when they are in a vulnerable state, optimizing the model's chances of achieving a higher score.

7.4.1 Code Analysis

The majority of the code for the More Intelligent Pac-Man is stored in separate methods. Within the move-pacman procedure, a simple call to choose-pacman-heading is made.


First, three temporary variables are created: one to store the clear heading directions for Pac-Man, one to store the ghost position, and a direction variable with the opposite of the heading Pac-Man just came from removed.

If statements then determine what Pac-Man should do depending upon the number of available heading choices. For example, if there are three possible directions Pac-Man can take, each direction is checked for a ghost using the 'see-ghost' reporter method. If a ghost is detected, that direction is stored in the 'ghost-dir' variable; if no ghost is detected, the heading is set to one of the three directions.

If a ghost has been detected, this is where the nested if statements above end up. I have created this method so that Pac-Man will target a ghost if it is scared (for points); otherwise it removes the direction in which the ghost lies and chooses a new heading from the remaining choices.


The clear-pacman-headings method used in this model returns a list of possible directions Pac-Man can move in. It checks whether each of the four surrounding patches is black (a valid map colour for the agent to move onto, unlike the ghosts, which can also move onto grey patches).

See-ghost takes a direction as a parameter and uses local variables to check for a ghost. First, invalid patches are eliminated through the while loop (sometimes the patches provided by choose-pacman-heading can be blue if near a wall). Next, a check is made directly on the patch in question for a ghost using 'ghosts-here'. If there are any on that patch, saw-ghost is set to true and reported back. If there are no ghosts directly next to Pac-Man, the patch being checked is advanced to the next patch in that direction. This loops until the method returns to patch-here, and the outcome is then reported back to choose-pacman-heading.
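The corridor scan behind see-ghost can be sketched as a simple line-of-sight walk. This Python version is illustrative only and simplifies the Netlogo reporter: it steps outward until a wall blocks the corridor rather than mirroring the exact loop structure described above.

```python
# Sketch of the see-ghost scan: step patch by patch in the given direction
# while the corridor stays open (black), reporting True as soon as a ghost
# occupies a scanned patch, and False once a wall ends the line of sight.

OFFSETS = {0: (0, 1), 90: (1, 0), 180: (0, -1), 270: (-1, 0)}

def see_ghost(patches, ghost_patches, x, y, direction):
    dx, dy = OFFSETS[direction]
    px, py = x + dx, y + dy
    while patches.get((px, py)) == "black":   # corridor still open
        if (px, py) in ghost_patches:
            return True                       # ghost in line of sight
        px, py = px + dx, py + dy
    return False                              # wall reached, nothing seen

# straight corridor heading east, with a ghost three patches away
patches = {(i, 0): "black" for i in range(1, 5)}
patches[(5, 0)] = "blue"                      # wall ends the corridor
ghost_patches = {(3, 0)}
```

The same scan pattern also underpins the see-pacdot check introduced in the next model.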


7.5 Very Intelligent Pac-Man
The Very Intelligent Pac-Man model takes into account the positions of both ghosts and Pac-Dots. This system was created to optimize the model's chances of clearing mazes of their Pac-Dots, alongside the life-saving techniques incorporated in the earlier ghost-detecting simulation.

7.5.1 Code Analysis

The main move-pacman procedure remains the same as that used for the previous 'More Intelligent Pac-Man' model; the majority of the decision-making for Pac-Man's movement happens in the choose-pacman-heading method. The move-pacman command manages the consuming of Pac-Dots and ghosts on screen, moving forward, a simple check to determine whether the level has been completed, and the famous Pac-Man animation.


Variables are set up to store the directions of any ghosts and pacdots Pac-Man detects. Two further variables hold Pac-Man's clear headings, and the clear headings with Pac-Man's opposite heading removed.

If there is only one direction Pac-Man can go, he takes it. This happens to be rare in the real model and entirely relies upon the map design.

If there are two or more possible headings that Pac-Man could take, the following is performed. Pac-Man uses the see-ghost command to check for ghosts in each position of the clear-headings. If a ghost is detected, the direction of the ghost is stored in the ghost-dir variable, otherwise it stays in the false state.

If no ghosts are detected in the free headings of Pac-Man, a similar check is made for pacdots. If a pacdot is detected with the see-pacdot command, pacdot-dir changes from its false state to the location of the dot.

If anything is detected by the see-ghost or see-pacdot commands, dirs changes. However, if nothing is detected from the current position of Pac-Man, he continues to progress through the map by choosing a clear heading using new-pacman-dirs (which removes the opposite heading). This helps Pac-Man progress through the map without doubling back on himself when there is no threat.

Andrew Waterhouse 10219812 34

If a ghost has been detected, Pac-Man checks whether the ghosts are in a scared state. If the ghosts are vulnerable, Pac-Man has been programmed to target their direction and boost his score by consuming them. If the ghosts are invulnerable and could threaten the player, the position of the ghost is removed from Pac-Man's clear headings.

The next set of if statements checks for more than one ghost: if Pac-Man has detected a ghost in one of the surrounding clear headings, he should also check whether other ghosts surround him before making a move.

If the dirs variable at this point still holds a patch Pac-Man can move into, one of these is selected as the heading. However, if there are ghosts in all of the possible directions, a check is made to see whether it would be best to reverse; otherwise a clear-pacman-heading is selected for Pac-Man, as this is the situation in which a life is most likely to be lost.


If no ghost has been detected, the else branch of the ghost-dir if statement activates. A check is made for a detected pacdot; if there is one, the heading is set toward it. If no pacdot has been detected either, a simple check is made to see whether Pac-Man can move without reversing. If he can't, a clear-pacman-heading is selected as the new heading – as required for the level 3 map design.
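The overall decision order for this model can be condensed into one sketch: scared ghosts are chased, dangerous ghosts are avoided, then pacdots are targeted, and only then does Pac-Man wander without reversing. This is an illustrative Python condensation of the branching described above, not the Netlogo code, and it flattens the multi-ghost checks into a single filter.

```python
# Priority sketch for the Very Intelligent model. clear_dirs: all open
# headings; new_dirs: clear_dirs with the reverse heading removed;
# ghost_dir / pacdot_dir: a heading, or False when nothing was seen.
import random

def choose_heading(clear_dirs, new_dirs, ghost_dir, pacdot_dir,
                   ghosts_scared):
    if ghost_dir is not False:
        if ghosts_scared:
            return ghost_dir                   # chase the ghost for points
        safe = [d for d in new_dirs if d != ghost_dir]
        if safe:
            return random.choice(safe)         # sidestep the threat
        return random.choice(clear_dirs)       # cornered: reversing allowed
    if pacdot_dir is not False:
        return pacdot_dir                      # no threat: go eat dots
    if new_dirs:
        return random.choice(new_dirs)         # wander without reversing
    return random.choice(clear_dirs)           # dead end: reverse out

# a scared ghost seen to the north outranks a pacdot seen to the east
heading = choose_heading([0, 90, 270], [0, 90], ghost_dir=0,
                         pacdot_dir=90, ghosts_scared=True)
```

The final fall-through branch is the fix described later in this section for the trapped patches in maze 5, where reversing is the only way out.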

The same code as used for the More Intelligent model's ghost detection has been implemented here. It checks each patch along the corridors for an occupying ghost; if one is detected, saw-ghost? returns true.

The see-pacdot method operates in a similar fashion to the see-ghost method defined above: it checks each patch along a corridor for an occupying pellet. If there is a pacdot, saw-pacdot is returned as true.


7.5.2 Problems Encountered
The complexity of this model lay in how to program a Pac-Man that could prioritize the ghost detection over the Pac-Dots. Using the ghost detection system as a basis helped greatly, and I implemented the Pac-Dot detection system in a similar manner; it is only called after all clear patches have been checked for ghosts.

Another problem with this model was that Pac-Man would take a random heading when confronted with a ghost. I implemented four if statements for when a ghost has been detected, removing the directions of the other ghosts if Pac-Man is surrounded. This resulted in a program that is somewhat aware of being ganged up on (Figure 3 - Early Reversing Problem).

A further problem early on was that Pac-Man would find itself stuck in some areas of the later pac-maps. In maze 5, Pac-Man would get stuck on a patch where the only way out was to reverse the direction it had come in, causing a runtime error. The problematic code was in how Pac-Man treated situations where there was no Pac-Dot or ghost: I had programmed him never to reverse in these situations. To resolve this, I changed the following code:

[ifelse pacdot-dir != false
  [ set heading pacdot-dir ]
  [ set heading one-of new-pacman-dirs ]]

to:

[ifelse pacdot-dir != false
  [ set heading pacdot-dir ]
  [ifelse length new-pacman-dirs > 0
    [ set heading one-of new-pacman-dirs ]
    [ set heading one-of clear-pacman-headings ]]
]


8. Testing

8.1 Purpose of Testing
With five different systems created using various artificial intelligence techniques, it is essential to test and compare how each performs in the game. There are two main methods for measuring how effective each system is, based upon the parameters programmed into the game. The first is to count how many Pac-Dots Pac-Man eats in each simulation; this helps determine how effective the movement patterns were and also indicates the number of levels Pac-Man completed. The second measure of performance is the score Pac-Man ends with, which indicates how effective Pac-Man was at eating the ghosts and the bonuses that spawn in the maze.

The goal of the project from the beginning was to create a system that performs in a similar manner to a human. This essentially means that the best Pac-Man system would not only have to achieve the highest score, but also clear the most Pac-Dots and levels. Each of the five Pac-Man algorithms will be run three times, and the results collated into a line graph to compare how the systems perform against each other.

8.2 Hypothesis
The Random Pac-Man was created to set a baseline for all others to compete against. Its score may be relatively high for the low number of Pac-Dots it consumes, because bonuses spawn into the map only where there is no Pac-Dot. Corridor Pac-Man should clear more Pac-Dots than its Random counterpart, but I believe its score will be much lower: after studying the code of the ghosts' movement, I'd expect Corridor Pac-Man to die quickly whenever a ghost detects it with the 'saw-pacman' command. The ghosts head toward Pac-Man whilst Pac-Man makes no direction change until faced with a maze wall, so they should collide head-on. Left Pac-Man should perform slightly better than the previous models at clearing Pac-Dots, though I'd expect its progress to be repetitive and its score unremarkable, as much of its movement is deterministic. The More Intelligent Pac-Man should be vastly superior to the scores achieved by the previous models, with better consistency over the course of the three attempts, which will show in the standard deviation of the figures. By avoiding ghosts, I'd expect this fourth artificially controlled model to be the first to complete more than one level of Pac-Dots. Its score should be high, but the high scores from the Random model may be higher in some instances.


8.3 Test Results

The results of the tests showed different levels of performance for each AI technique at completing the Pac-Man game. I will first look at how each program cleared each maze of its Pac-Dots, and then at how each program performed against the score measure.

[Chart: Pac-Dots Cleared For Each AI Program – three attempts each for Random, Corridor, and Left Pac-Man with randomness values 0, 0.3, 0.5, 0.8 and 1]

The worst-performing program at clearing Pac-Dots was the Left Pac-Man given no chance to make a random move to break out of repetitive situations, averaging 7 Pac-Dots each time. Of the programs with no knowledge of ghosts or Pac-Dots, the Left Pac-Man with a 100% random-movement chance when stuck performed the best.


The score for each program proves interesting when comparing how they performed against each other. Setting aside the models that detect other agents, the best performer on score was the Left simulation in which a random move was always made whenever Pac-Man was stuck in a loop. As expected, the random-moving Pac-Man performed the worst, averaging 5966 across the three attempts.

[Chart: Score For Each AI Program – three attempts each for Random, Corridor, and Left Pac-Man with randomness values 0, 0.3, 0.5, 0.8 and 1]

The advanced systems performed much better than the earlier systems. Being able to detect ghosts and make attempts to avoid losing lives meant the scores and Pac-Dots cleared were far greater than the results earned by the earlier models. No system prior to these advanced turtle-detecting models was able to complete the first maze of Pac-Man. The worst result across the three runs of the two systems was attempt three of the sawGhost model, which was unable to complete the first map as it struggled to find the last two remaining Pac-Dots.

[Chart: Pac-Dots Cleared for Advanced Systems – three attempts each for sawGhost and sawGhost + PacDot]


The scores for the advanced systems correlated somewhat with the number of Pac-Dots each model was able to clear in its attempts. The first run of sawGhost did outperform the same attempt by the model that detects Pac-Dots as well as ghosts, despite the ghost + Pac-Dot detecting model clearing more Pac-Dots. The reason lies in how bonuses operate and spawn in the map: they had a direct influence over the score achieved by all of the models, and their spawning code means they only appear in areas already visited by Pac-Man. The sawGhost game had trouble finding the last few Pac-Dots in the maze, so it would often wander around the map, occasionally collecting bonuses by accident.

[Chart: Scores for Advanced Systems – three attempts each for sawGhost and sawGhost + PacDot]

8.4 Observations and Testing Conclusion

As expected, the random-moving Pac-Man model provided the baseline for scoring and pellets consumed across the simulation runs. The Left Pac-Man model showed clearly how the randomness value affected the final results. Where the randomness value was zero, it always took the same route and got stuck in one area of the maze; it did, however, score highly relative to the number of Pac-Dots consumed (1161.9), as bonuses only spawn on empty patches. A balance of randomness had to be found, and I believe the most efficient value was 0.3 on the randomness slider: it averaged 62 Pac-Dots per attempt and its score deviation was low, providing more predictable results than those found at higher parameter values. The Left model with a 100% random move when stuck made random moves more quickly and so could cover the map better; it did not clear more Pac-Dots, though, as the algorithm often got the agent stuck in one area.


The models that are able to detect other agents performed a lot better than those without the ability. On average, the sawGhost program cleared 224 Pac-Dots per attempt, over three times better than what was achieved by the third best performing model (Left Pac-Man Random 1). sawGhost was very efficient in its movement, scoring 170 points per Pac-Dot when dividing the average score by the average number of Pac-Dots completed by the program. Its score was high because it cleared the maze of its Pac-Dots and was able to attack vulnerable ghosts to score highly too. It was the first model able to complete the maze and to complete more than one level. sawGhost performed very consistently with its first life but became problematic as it struggled to clear up the map with a few Pac-Dots remaining.

Pellets per maze – Level 1: 150 pellets; Level 2: 171 pellets; Level 3: 173 pellets; Level 4: N/A; Level 5: 117 pellets.

The sawGhost + Pac-Dot model was by far the best performing model, as expected in the hypothesis. It was the only model able to complete all of the maps available and 'lap around' to the beginning level again. The model was created to prioritize ghosts, both to save Pac-Man's lives and to score highly when the ghosts are vulnerable. The targeting of Pac-Dots as a backup proved successful: on average, 467 Pac-Dots were eaten across the three attempts, double the record set by the previous program. The sawGhost + Pac-Dot program set the new benchmark, breaking all Pac-Dot and score records set by the previous AI-controlled Pac-Men. The problem for this model, though, was that lives were often lost when a ghost and Pac-Man appeared at the corner of a turn at the same time. I also occasionally observed stutter behaviour when Pac-Man would attempt to avoid a ghost.
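The 170-points-per-Pac-Dot figure quoted above is simply the average score divided by the average number of Pac-Dots cleared. As a quick illustration of that calculation (the per-attempt numbers below are hypothetical placeholders chosen to match the quoted averages, not the report's raw data):

```python
# Sketch: the score-per-Pac-Dot efficiency metric used to compare
# models. Only the averages quoted in the text (about 224 dots and
# 170 points per dot for sawGhost) are taken from the report; the
# per-attempt figures here are invented for illustration.

def efficiency(scores, dots):
    """Average score divided by average Pac-Dots cleared."""
    avg_score = sum(scores) / len(scores)
    avg_dots = sum(dots) / len(dots)
    return avg_score / avg_dots

saw_ghost_scores = [40000, 38240, 36000]   # hypothetical attempts 1-3
saw_ghost_dots = [230, 240, 202]           # hypothetical attempts 1-3

print(efficiency(saw_ghost_scores, saw_ghost_dots))  # 170.0
```

A higher ratio means the model earned more of its score per pellet, typically by eating vulnerable ghosts and bonuses rather than Pac-Dots alone.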

I believe that, to create the ultimate Pac-Man, it would be hard to resist building a model with a global view of its world. I created each Pac-Man simulation so that the agent makes only local decisions to complete each maze, which makes it hard to improve further on avoiding the loss of lives when Pac-Man and a ghost meet at a corner.

Note: The level 4 programming error from section 3.2.3 earlier in this report meant the maze had to be removed from circulation for the models. After a detailed analysis, the maze code itself appears to be fine, but the cause may be related to the 'tool' and 'which-ghost' variables, which are needed to properly load levels 4 and above. This error was in place prior to the development of the artificially controlled Pac-Man, meaning any player would be unable to complete the fourth level.


9. Evaluation

9.1 Design Evaluation

It was the correct decision to use NetLogo as the main programming language for this project because of the need for local interactions of agents. The specialist agent-based environment allowed the results from multiple different versions of an artificially controlled Pac-Man to be compared easily.

9.2 Implementation Evaluation

Creating five separate systems that utilize different AI algorithm patterns provided a wide range of testing opportunities later in the project. One of the major decisions made within the implementation stage was to stand by local interactions between Pac-Man and its world. Only allowing Pac-Man to act on and have knowledge of its local surrounding patches meant a more honest system could be produced. Globalizing variables and allowing Pac-Man a top-down view of its maze would have created an over-powered system, which is not relatable to the real world. During the implementation, I would have liked to create a fix for the level 4 bug within the original system, which would have allowed Pac-Man to complete all of the levels available. I believe that no Pac-Man model would have had problems completing the level 4 maze design, based upon the otherwise bug-free performances from each system. I would have expected only the latter two systems that are able to avoid ghosts to have reached the omitted level, because they would have been able to preserve lives until that point.

9.3 Testing Evaluation

The testing of each automated Pac-Man model proved interesting in comparing the results from the simple models to those that are advanced and can react to their environment. The scores and Pac-Dots completed by the advanced models were so great that they would not fit appropriately into the graphs without scaling. Based upon the statistics alone, the Left Pac-Man with a randomness of 1.0 was the best performing simple model. It achieved the highest number of Pac-Dots overall at 72 and averaged a high score of 13566. The results for the Random Pac-Man proved to be as expected and set the lower boundary for all other automated Pac-Men to perform against. Its average Pac-Dots cleared (37) was only superior to one model in all of its three attempts, the Left Pac-Man with a randomness rating of zero.
Consistently, the Left Pac-Man with zero randomness achieved just 7 Pac-Dots cleared in each of its three attempts. The reason for this was that the model was too deterministic and predictable. With no randomness, the agent would consistently make the same moves in its world and would wait for the enemy ghosts to take a life. Interestingly, the Left Pac-Man with zero randomness did much better in the score it achieved. Averaging 8133, the model with no random movement had a far higher ratio of score to Pac-Dots cleared than any other model. Through observations in the testing phase, it was clear that the primary reason for this was the manner in which bonuses were being spawned.


10. Conclusion

10.1 Technical Conclusion

Completing this project has made me more confident in how to program and customize games in NetLogo. I believe that the language was perfect for what I had set out to complete at the start of the project due to its flexibility and its suitability for managing local interactions. In the future, I'd like to try using NetLogo to customize other classic games like Frogger, Bomberman and possibly even Tetris. Frogger and Bomberman would be the ideal candidates for the next game to start creating AI controls for, as they operate in a similar manner to Pac-Man, with a pre-determined agent to control. Tetris, on the other hand, would be more interesting as there is no player character to control; instead the AI would have to recognize the rules and objective of the game.

10.2 Project Conclusion

The progression made throughout the project on an artificially controlled Pac-Man feels rewarding after the numerous systems created proved to have different levels of performance. I am pleased with the creativity used to create different algorithms for Pac-Man's movement techniques and the advanced programs that are able to react to Pac-Man's environment. One of the major limiting factors of the project was the time constraint. The fourteen weeks available for the project meant that only a certain amount of time could be spent actually programming and creating multiple iterations of an artificially intelligent Pac-Man. If more time were available, I would perhaps have created a Pac-Man system that did use global control, in order to compare its performance to the best that could be achieved through local interactions. I also would have liked to create my own mazes for the Pac-Men to be placed into, to test certain features (like how each performs when ghosts are surrounding it but the right choice will still save a life). It would also have been beneficial to fix the coding problem from the original system for maze four, so as to have the complete original code as it was designed.

I am pleased with the results obtained from the experiment into which algorithm would perform best in the 1980s classic game Pac-Man. I found that, in order to produce an intelligent agent, it would have to have input from and knowledge of the world in which it resides. Being able to adapt to the ghosts' movement patterns and Pac-Dot locations meant that the final 'Very Intelligent Pac-Man' could perform in a manner which would be hard to distinguish directly from a human-controlled game. A Turing test for the final Artificially Intelligent Pac-Man would be an area I would like to explore if I had the appropriate approval from the University's Ethics Board.


Appendix

11. Introduction to Pac-Man

Pac-Man is a classic arcade game released in the 1980s and has since become an instantly recognizable icon within the video game industry. Typically one player, Pac-Man is a maze game where the player has to control the Pac-Man character to clear the level of its collectibles. As the player clears the level of Pac-Dots, they face the threat of losing lives to ghosts. The combination of clearing the maze of its Pac-Dots as well as consuming enemies with a Power pellet is the key to getting the highest score on the leaderboard.

11.1 History Of The Game

The foundation for the game Pac-Man starts in the old-fashioned arcades. Around 1972, the first video game to reach mainstream popularity, Pong, had arrived and set the scene for the industry. Over the course of the eight years since its release, Pong had changed the scene of the recreational centres. These institutions for teenagers had previously been the go-to place for Pinball, Pool and Air Hockey, but were about to be revitalized by an emerging video game market. Soon after Pong had released in 1972, many more games started to enter the arcade scene. They made video games exciting and part of popular culture in places like Japan and the United States. Asteroids was the next big title, released in 1979 by Atari to a booming target market. Asteroids provided the competitive, social desire to beat friends and was the first game to have leaderboards, one of the key ingredients that made arcades so attractive.

Figure 4 - Asteroids, released 1979


So the scene was perfectly set for when Pac-Man was due to release in May 1980. The arcade market was just beginning to reach growth in its product life cycle, a fantastic opportunity for a new game to ride the wave. A Japanese developer, Toru Iwatani, was enthusiastic about a game he had an idea for. He saw that many of the games in arcades were shooters, primarily aimed at the male market. He planned to design a game that would appeal to women and be non-violent, something very few games had really done until then. Over the course of months of research, Iwatani wanted to use the Japanese word 'taberu', meaning 'to eat', within the game. Working at Namco, Toru Iwatani brought the idea to the attention of the CEO and the feedback was positive. He was allocated nine employees to develop the game, and the supposed story about how Pac-Man's shape was created emerges here.

Pac-Man's shape is somewhat enshrined in video game legend. The team of nine working on the project supposedly went out for pizza on a lunch break. One member took a slice, and the rest, they say, is history.

Video game shows and conferences didn't really put Pac-Man in the limelight. The arcade market at these events was mainly interested in games that were designed for it, much like Rally-X. Developed by Namco, Rally-X is a driving game that utilizes a similar maze-like design to Pac-Man. At the gaming events, Rally-X overshadowed Pac-Man, and so expectations for the game were minimal on its release. Rally-X had the same release date as Pac-Man, but the sales results were in stark contrast to the feedback at the gaming events.

Figure 2 - Rally-X Arcade Game

An immediate success in Japan, Pac-Man took the arcades by storm. Offering a fun maze game to a unisex target market, Pac-Man had struck gold. The game revolved around learning trial-and-error techniques to complete it level by level, and the hunger for replayability with leaderboard scoring meant players would socially compete with friends. When released in Japan on May 22nd 1980, the immediate success brought thousands of new players into arcades. The success of the release was so great that, for only the second time in Japan's history, more 100 Yen pieces had to be minted due to the vast quantity stored in Pac-Man arcade machines!

Pac-Man was going to be named Puck-Man when released in the USA, due to the shape of the character. However, due to concern about vandalism and graffiti, the Pac-Man name took its place.


The immense success was brought to the attention of the American distributor, Bally-Midway. Pac-Man was released in October 1980 in the United States to an audience that craved the international phenomenon. The success made Pac-Man the biggest arcade game of all time. Manufacturers struggled to keep up with demand, despite over 350 arcade units being created each day. The demand for arcades soared and they finally entered the public mainstream. From grocery stores to retirement homes, arcades were starting to decentralize away from arcade stores alone. Marketing for Pac-Man was perfect, with merchandise being the key to what makes Pac-Man so recognizable even to this date. Pac-Man was the first game to branch out from being 'just a game'. Games today follow the path that Pac-Man set in the 1980s to be a success. By mid-1981, the heyday of arcades was starting to decline, and the reason can be pinpointed to a number of contributing factors. The overwhelming popularity of arcades meant that the wider community

Figure 3 - Pac-Man Snapback Cap

started having cause for concern regarding the number of hours being spent there by young teenagers. In 1981 alone, players spent 20 billion quarters to play Pac-Man. This accumulated to over 75,000 years collectively spent by players playing Pac-Man in that one year. Court cases opened and started to regulate the opening times of arcades to better manage the time spent in them, when it could better be spent on homework, for example. From mid-1982, it was clear that arcades were starting to phase out of the community, with figures highlighting a fall from 20,000 arcades at the boom in 1980 to 4,000 by mid-1982 [2].

Figure 4 - Classic Pac-Man Arcade Machine [2]

Player skill was a double-edged sword for the arcade industry. On one hand, tapping into a player's desire to become better at a game meant consumers would come back to play. This was effectively a slow death pill, though, without any real blockbuster games being released to whet their appetite. As players got better at Pac-Man and other games like Pong and Space Invaders, running an arcade was starting to become expensive. Players who would before have been putting money in after five minutes of gameplay would now be able to last a lot longer as their skill had increased. The players who used to be funding the arcade establishments would now only be paying every hour or so because of how long they could last with just one payment.

As arcades were slowly fading back out of communities, home versions of arcades were hitting the market in spring 1982. Atari took the Pac-Man game from the arcades and developed it into a fully-fledged home game. The Atari 2600 was the first home console that Pac-Man was ported to. Pac-Man was such a well-recognized brand at the time that Atari believed that, once the game released, it would be a console seller, so it made more Pac-Man cartridges than there were consoles. The quality of the final Pac-Man game for the Atari 2600 was widely criticized for its large number of glitches and how it only vaguely represented what the fans had loved about the arcade version of the title. Although Pac-Man was the biggest selling game for the Atari 2600, this turned out to be one of the nails in the coffin for Atari.

Figure 5 - Pac-Man on Atari 2600

There were numerous sequels to the original Pac-Man still being released by Namco throughout the early 1980s for arcades, but this didn't stop arcades becoming something of a short-lived craze. The Pac-Man franchise had a huge following, and the fans of the yellow man mean he still lives on consoles to this date. Over the past 20 years, Pac-Man has featured on virtually every system in some form or another. He is one of very few gaming characters from the 1980s who is still around today. On July 3rd 1999, Billy Mitchell of the United States achieved the first perfect score on Pac-Man. The perfect score of 3,333,360 was achieved by not losing a single life across the 256 levels that Pac-Man has to offer. Whilst Pac-Man was designed to be infinite, Billy found that the 256th level in fact had a split-screen glitch that made the level impossible to complete fully. His high score was achieved by eating ghosts as he completed levels, eating all Pac-Dots and as much fruit as he could for the maximum score.
Only six players to date have fully completed this feat of the perfect Pac-Man score on the original arcade machine. Billy Mitchell is something of a celebrity in the gaming community, having been rewarded with numerous Guinness World Record titles.

Figure 6 - Billy Mitchell, Pac-Man Player


11.2 Rules

The basic control system of Pac-Man involves just one four-way joystick. There are four directions in which the Pac-Man character can move: up, down, left or right. There is an original map design for which Pac-Man is renowned, where two exits along the horizontal axis allow the character to move from one side of the map to the other. The map consists of a maze-like design with the center used to house the ghosts, Pac-Man's enemies. The Ghost House in the center of the map is the spawn point for the four ghosts that Pac-Man faces each level. Inaccessible to the player-controlled Pac-Man, the Ghost House also regenerates ghosts once they are eaten by Pac-Man. Eaten ghosts leave two eyes behind as they move back to respawn. Contact with any ghost whilst Pac-Man has not eaten a Power pellet results in a lost life for the player. Pac-Dots cover the maze, spread out with one Pac-Dot occupying each patch. In total, there are 240 Pac-Dots as well as four extra Power pellets. Power pellets are the key to defeating the on-screen enemies (ghosts) because, once one is consumed, the ghosts are vulnerable to attack and so scatter away from the player's position. Once levels are stripped of their Pac-Dots and Power pellets, the player can progress to the next level. Levels increase in difficulty as the ghosts become faster and eventually become invincible at the later stages. Whilst the player is in full control of Pac-Man's direction, the ghosts don't have the same level of control. They are unable to reverse direction of their own accord; they only reverse once a Power pellet has been eaten and they enter a vulnerable mode. This rule never changes and proves invaluable to all players when caught in tricky situations.

Figure 7 - Original Pac-Man Map


11.3 Scoring

To better represent the scoring system of Pac-Man, the following table has been set up.

Item Picked Up      Score
Pac-Dot             10
Power Pellet        50
Cherry              100
Strawberry          300
Orange              500
Apple               700
Melon               1000
Galaxian Boss       2000
Bell                3000
Key                 5000
Ghosts              #1 in succession – 200 points
                    #2 in succession – 400 points
                    #3 in succession – 800 points
                    #4 in succession – 1600 points

After the collection of 10,000 points, a new life is given to the player (up to a limit of four lives at the player's disposal). Rules can differ depending upon the platform and the preferences of the arcade owner.


11.4 Characters

To the untrained eye, the four ghosts are identical (minus the colour). However, under each of the four differently coloured ghosts lie four distinctly different personalities.

The character and nickname columns below come from the original Japanese Puck-Man release and the American Pac-Man release.

Ghost Colour | Japanese Name        | Translation | Character | Nickname | Movement Pattern
Red          | Oikake (追いかけ)     | Chaser      | Shadow    | Blinky   | Targets Pac-Man's tile
Pink         | Machibuse (待ち伏せ)  | Ambusher    | Speedy    | Pinky    | Targets 4 tiles ahead of where Pac-Man is going
Cyan         | Kimagure (気まぐれ)   | Fickle      | Bashful   | Inky     | Uses a combination of Blinky's and Pac-Man's positions
Orange       | Otoboke (お惚け)      | Stupid      | Pokey     | Clyde    | If further than 8 tiles from Pac-Man, uses Blinky's targeting; if closer than 8 tiles, targets the bottom-left corner of the maze

Looking at the four different ghosts highlights four very different targeting techniques that are not obvious when playing the game. Each of the ghosts targets a tile of the maze to identify where it wishes to move toward, rather than Pac-Man himself. Blinky has perhaps the foundation of targeting methods: the tile on which Pac-Man sits. The pink ghost, Pinky, targets four tiles ahead of where Pac-Man is, meaning the enemy moves to where Pac-Man is going rather than directly to where he is. This movement pattern is best summarized by the translation of its Japanese name, 'Ambusher'.

Inky's targeting pattern is somewhat more complex than the previously mentioned ghosts'. It uses the position of not only Pac-Man, but of Blinky too. It is the only ghost that uses a combination of positions to judge where it should target. Difficult to predict, Inky uses the position and direction of Pac-Man as well as Blinky in its calculation. To start the calculation, we look two tiles ahead of where Pac-Man is facing. A vector is then drawn from the position of Blinky (the red ghost) to the tile two ahead of Pac-Man. The distance between Blinky and Pac-Man has a direct effect on the target tile of Inky (the cyan ghost). The vector drawn from Blinky to the tile two ahead of Pac-Man is doubled in length to identify the tile targeted by Inky.

Figure 5 - Inky's Targeting Vector

The orange-coloured ghost's Japanese name translates as 'Stupid', and this can be understood when looking at the movement pattern it uses. If the ghost is more than eight tiles away from Pac-Man, a simple targeting system similar to Blinky's is invoked. Once Clyde is closer than eight tiles to Pac-Man's position, bizarrely, it then targets a tile outside of the map. Clyde will stay around the bottom-left corner of the maze, performing a circuit of sorts, until it retargets Pac-Man. All the while, ghosts always move in one direction and cannot reverse until a Power Pellet has been eaten. It's pretty complex!

Figure 6 - Clyde's Bizarre Targeting
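The four targeting rules described above can be sketched as tile arithmetic. The Python below is an illustrative translation on an (x, y) grid, not the original arcade code or the report's NetLogo model; the function names and the straight-line distance check for Clyde are my own simplifications.

```python
# Sketch of the four ghost targeting rules. Tiles are (x, y) pairs;
# a direction is a unit vector such as (0, 1) for "up".

def blinky_target(pacman):
    # Blinky chases the tile Pac-Man currently occupies.
    return pacman

def pinky_target(pacman, direction):
    # Pinky aims four tiles ahead of Pac-Man's facing direction.
    px, py = pacman
    dx, dy = direction
    return (px + 4 * dx, py + 4 * dy)

def inky_target(pacman, direction, blinky):
    # Take the tile two ahead of Pac-Man, then double the vector
    # drawn from Blinky to that tile.
    px, py = pacman
    dx, dy = direction
    ax, ay = px + 2 * dx, py + 2 * dy      # anchor tile
    bx, by = blinky
    return (bx + 2 * (ax - bx), by + 2 * (ay - by))

def clyde_target(pacman, clyde, home_corner=(0, 0)):
    # Chase like Blinky when far away; retreat toward the bottom-left
    # corner when within eight tiles of Pac-Man.
    cx, cy = clyde
    px, py = pacman
    if (cx - px) ** 2 + (cy - py) ** 2 >= 64:   # 8 tiles or more
        return blinky_target(pacman)
    return home_corner
```

For example, with Pac-Man at (10, 10) facing up, Pinky targets (10, 14); if Blinky sits at (6, 10), the anchor tile is (10, 12) and doubling the Blinky-to-anchor vector puts Inky's target at (14, 14).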

11.5 Variations

Ever since the first installment of the original Pac-Man in 1980, there have been many spin-off versions of the game. A focus will be made upon the variants of Pac-Man that were released into arcades, where the Pac-Man craze all began. What is perhaps most interesting is how many of these were never actually authorized by Namco, despite becoming so popular! A bitter feud and a slightly unprofessional attitude to copyright by Bally-Midway (US manufacturer and distributor of arcade machines) caused the influx of unlicensed Pac-Man games to come to an abrupt stop when Namco cancelled their partnership. The early-1980s attitude to copyright and partnerships between businesses was lackluster, whereas today they form the foundation of the most successful franchises.


11.5.1 Arcade

Pac-Man (1980)
 Original game
 Grossed $2.5 billion till the 90's
 Instant gaming classic
 Basic premise remains to date

Ms Pac-Man (1981)
 Female character
 Sold 115,000 arcade cabinets
 First unofficial title by Midway, eventually taken by Namco

Super Pac-Man (1982)
 Gated areas of map
 Collectibles are fruit, no Pac-Dots
 Super pill makes a huge, inedible Pac-Man
 Unofficial title

Pac-Man Plus (1982)
 Fan made
 Fast paced
 Fruit causes invisible ghosts
 Power pellet has unpredictable effects on gameplay

Baby Pac-Man (1982)
 Incorporates pinball as well as the arcade game
 Score in pinball has effects in the arcade game

Professor Pac-Man (1983)
 Quiz/memory game
 Nothing really to do with Pac-Man
 Scoring relies on quick thinking

Pac & Pal (1983)
 Introduces Pac-Man's friend Miru
 Cards on the map for the player to find; pairs give abilities like the smokescreen when Rally-X cards are collected
 Made by Namco

Jr Pac-Man (1983)
 The final straw for Namco, making it the last unofficial game
 Much larger map with several cosmetic changes to the original

Pac-Land (1987)
 First Pac-Man 2D platformer
 Ghosts are enemies, vulnerable to the core Power Pellet mechanic

Pac-Mania (1996)
 3D view
 Themed levels
 Limited number of levels
 Distributed by new partners: Atari

Pac-Man VR (1996)
 First-person view of Pac-Man
 Ability to unlock extra time as you play
 Can see over walls for Pac-Dots and ghosts

Pac-Man Battle Royale (2011)
 Round based, focused on multiplayer
 Eat other Pac-Men and ghosts
 Trophies at end of game


What is really interesting is how the games developed and adapted over the course of the 31 years from Pac-Man's first installment to its latest arcade release, Pac-Man Battle Royale. The development of Pac-Man games had one large obstacle that no other game would normally have to contend with these days: plagiarism, the practice of taking someone else's work or ideas and passing them off as one's own. Bally-Midway was a company in the USA that acted as Namco's partner to reach one of the main markets for arcade games. They were a distributor and licensor of Pac-Man but released many titles that were not authorized by the publisher, Namco.

From the popularity of Pac-Man, many fans created their own versions and add-on packs for the arcade game. Ms. Pac-Man was the first unofficial sequel to Pac-Man, created by an MIT student for a project. The game was developed as an add-on board for the existing Pac-Man machines. Looking for a new sequel to the hottest game on the market, Bally Midway bought the add-on pack from the student and started work on new Ms. Pac-Man arcade consoles. Craving a sequel, arcades all around the globe wanted to get their hands on the customized Pac-Man game. Selling 115,000 arcade cabinets up until 1988, it became one of the most popular arcade games of all time.

Top Five Arcade Games (Hardware units sold) [4]
1. Space Invaders – 360,000 units
2. Pac-Man – 350,000 units
3. Street Fighter II Champion Edition – 140,000 units
4. Ms Pac-Man – 115,000 units
5. Asteroids – 70,000 units

In the 1980s, there were only three official Pac-Man titles made by the publisher Namco: Pac-Man, Pac & Pal and Pac-Land. All of the other titles in the 1980s were made and distributed without permission by Bally Midway (now called Midway Games). Many of the Pac-Man titles released in the 1980s were simple hacks or tweaks of the original game. The basic premise was at the foundation of games like Super Pac-Man and Jr Pac-Man, where the player would have to collect items to advance to the next level. The Baby Pac-Man arcade game is the rarest of all Pac-Man cabinets, with only 10,000 being manufactured. It started to push the boundaries of what could be expected in arcade games by incorporating two games into one. The player would be able to play pinball, just as arcades had offered prior to the video game market revolutionizing everything. How the player performed in the pinball game would have a direct effect on what would happen when the player re-entered the maze on screen, like determining how many Power Pellets were on the map.

Professor Pac-Man was the first real push away from the roots of where Pac-Man had begun. It took Pac-Man into a new type of video game market: the quiz game. Although this wasn't too popular with players, Namco used this breakthrough as a premise for a game to be released in 1987. Pac & Pal was a game made by Namco which took the idea of gated areas of a maze (from Super Pac-Man) and included a new character to put a spin on gameplay. Miru was an AI character on Pac-Man's team who helped collect fruit in the map. A new mechanic was introduced to the franchise too: a card-based system

where the objective was to uncover matching pairs to unlock special abilities. One such ability was taken from Rally-X, where the player could stun ghosts with a smokescreen. Players liked the incorporation of other Namco titles, but this game was only released in Japan whilst the lawsuit against Bally-Midway was gathering pace. The last unofficial game to be released by Bally-Midway was Jr Pac-Man, which was not too different from the original Pac-Man (besides the much larger maze).

Pac-Land was one of Namco's riskier moves with the Pac-Man brand. The 2D platform title was something Pac-Man had never ventured into previously. This title set the ground for the home consoles as it moved away from the monotonous gameplay of the original game. Featuring themed levels and a storyline featuring a fairy, players liked the twist offered by Namco to freshen up a franchise cluttered with knock-off titles. As technology improved, so did the graphics, as found in the much clearer 1996 title Pac-Mania. The 3D view gave avid fans of Pac-Man a new game to play since arcades had become stale after the 1980s. Also released in the same year, Pac-Man VR had a clearer focus on generating more cash out of players no matter how good they were. Players were given a limited time to do as well as they could and could unlock extra time as they played. The first-person view of Pac-Man was seen as groundbreaking by players as it provided a view they had only dreamt of when talking to friends back in 1980.

The latest Pac-Man game to be released into arcades was Pac-Man Battle Royale. A core focus was made on making the game social, encouraging multiplayer gameplay. The premise was to be the overall winner of round-based gameplay against friends. The competitive nature allowed you to eat other players on screen as well as the traditional ghosts. Achievements are given to each player at the end of the game to provide feedback on how they performed, like 'Player 2 ate the most ghosts' or 'Player 3 got the highest score'. This reward at the end of playing with friends encourages players to play more than one round and helps to generate more money for the arcade machine. Pac-Man had cleverly moved away from the potentially unlimited amount of time a player could spend at one cabinet.

Key Points: How Pac-Man Has Adapted Over Time
 Moving into new video game markets like platformers
 Basic premise from the original game remains to date (no matter the type of game)
 Timed gameplay or limited levels helps generate more capital
 Focus on multiplayer to encourage social gaming, keeping modern with the latest home console offerings


11.6 Project Time Management Gantt Chart

[Gantt chart: project time plan across the main stages of the system life cycle]

Each of the main stages of the system life cycle has been considered and included in the Gantt chart above. Each appears as a main task category, coloured on the chart with diagonal lines to represent how I intend to spread my time between the stages. These main stages have been broken down into sub-tasks to better manage and complete them within the dates shown.


Bibliography

[1] Intelligent Agents, Russell & Norvig [Online], Available: http://cs.brynmawr.edu/cs372/slides/02_IntelligentAgents.pdf [Feb 2014]

[1] Worldwide Video Game Market to Total $93 Billion (2012), Gartner [Online], Available: http://www.gartner.com/newsroom/id/2614915 [Feb 2014]

[2] The History of Pac-Man (2012), G4 Show [Online], Available: http://www.youtube.com/watch?v=_13wt0p1XeE [Jan 2014]

[3] Pac-Man Rules (2009), eHow [Online], Available: http://www.ehow.com/facts_5163464_pac-man-rules.html [Jan 2014]

[4] The 10 Best Selling Video Arcade Games Ever (2011), SEOJoe [Online], Available: http://seojoe.hubpages.com/hub/The-10-Best-Selling-Video-Arcade-Games-Ever [Feb 2014]

Credits Original Pac-Man Model Code (Found within the Netlogo Models Library): http://ccl.northwestern.edu/netlogo/models/Pac-Man
