Aalto University School of Arts Master’s Programme in Game Design and Production

Yuanqi Shan

A procedural character generation system

Master’s Thesis

26/04/2021

Supervisor and Advisor: Perttu Hämäläinen, Professor

Aalto University, P.O. BOX 31000, 00076 AALTO www.aalto.fi
Master of Arts thesis abstract

Author Yuanqi Shan
Title of thesis A procedural character generation system
Department Department of Media
Degree programme New Media: Game Design & Production
Year 2021
Number of pages 55
Language English

Procedural content generation refers to using algorithms and code to create content like text or audiovisual assets. In the context of video games, the term is usually associated with generated levels and environments, which are often used to increase replayability and lower the development costs of a game. In a world where more games are being made every year, some with high development costs, it would be natural to see some developers shifting towards using procedural content generation for many high-cost game development tasks such as character creation. Procedural character generation is not a new thing, as games such as Spore and No Man's Sky have used it to great effect. However, the topic in general appears to remain largely underexplored. This thesis presents a design for a procedural character generation system called Unity Procedural Character System. The system was designed for the Unity game engine with the goal of being an accessible, out-of-the-box solution that could simplify the character creation part of the game development process by allowing developers to easily generate 3D characters. In addition, the system is capable of mutating and fusing characters together, allowing developers to generate large amounts of different characters in a short period of time. The design of the Unity Procedural Character System was based on existing procedural tools and video games that used procedural character generation. In order to evaluate the capabilities of the Unity Procedural Character System, an implementation of the system was made and used to procedurally generate a variety of 3D characters. Although there were still improvements to be made on the quality of the generated characters, the system was well capable of generating large amounts of different characters. Therefore, the author recommends the system to be used for game projects with short development cycles or lower standards for visual quality, such as mobile games, game prototypes or game jam projects.

Keywords procedural generation, character generation, game development

Aalto University, P.O. Box 11000, 00076 AALTO www.aalto.fi
Abstract of the Master of Arts thesis

Author Yuanqi Shan
Title of thesis A procedural character generation system
Department Department of Media
Degree programme New Media: Game Design & Production
Year 2021
Number of pages 55
Language English

Procedural generation refers to producing content by means of algorithms. In game development it is often used to improve the replayability of games and to reduce development costs, for example by generating levels and game environments. The number of published games and their development costs have grown year by year, and as a consequence it would be natural for game development to gradually begin using procedural generation to produce other game content as well, such as characters. Spore and No Man's Sky, among others, are examples of games in which procedural character generation has been used successfully. Despite this, procedural character generation remains a rare topic of academic research. This thesis presents a procedural character generation system called Unity Procedural Character System, designed to work in the Unity game engine. The goal of the work is to offer game developers a simple way to generate a large number of 3D characters in a short period of time. This is made possible by the system's ability to mutate and fuse characters with each other. The operating principle of the system is based on existing procedural tools and on games that have used procedural character generation. The functionality of the system was evaluated by examining the generated characters. The evaluation shows that the system is capable of generating many different characters, but the quality of the characters still leaves room for improvement. For this reason, the system is recommended mainly for game projects with a short schedule or low requirements for the quality of game graphics, such as mobile games, game prototypes or game jam games.

Keywords procedural generation, character generation, game development

Table of Contents

1. Introduction
2. Background
2.1 Examples of procedural character generation in video games
2.1.1 Impossible Creatures
2.1.2 Spore
2.1.3 No Man's Sky
2.2 Procedural tools for video game development
2.2.1 Houdini
2.2.2 Autodesk Character Generator / Character Creator 3
2.3 Other tools
3. Design
3.1 Terminology
3.1.1 Unity Procedural Character System
3.1.2 Node
3.1.3 Modifier
3.1.4 Blueprint
3.2 Creature generation
3.2.1 Creating a blueprint from scratch
3.2.2 Mutating an existing blueprint
3.2.3 Fusing blueprints
4. Implementation
4.1 Important Unity features
4.1.1 Physics
4.1.2 Mesh manipulation
4.1.3 Scriptable objects
4.2 UPCS
4.2.1 Graphical user interface
4.2.2 Generation logic
4.3 Results
5. Conclusion
5.1 Acknowledgements
6. Appendices
6.1 List of body part node properties
6.2 Mutation descriptions for each modifier/node property
6.3 Generation performance
6.4 Web demo link

References

1. Introduction

Procedural content generation (PCG) refers to the automatic, algorithmic creation of content. In the context of video games, PCG is often used to generate graphical assets, such as levels and environments. Instead of an artist creating an object or environment from scratch, a computer program generates them using predetermined sets of rules. [1]

Some games, such as the roguelike Dead Cells (Motion Twin, 2018) and the Sid Meier's Civilization (Firaxis Games) strategy franchise, use procedural generation for increased replayability by generating their levels during runtime. In other words, the levels are generated on the fly when the user starts playing the game rather than being constructed manually during development by a designer. Most of the time, the levels are generated by combining certain building blocks, such as tiles or hexes, following a series of rules set by the developers. For many players, procedurally generated levels bring increased replay value, since each level is different from the previous one. This means that players can experience different challenges even if they start the same game from the beginning. For developers, procedural level generation saves development time and costs since it removes the need for manual level design and creation, which was also the case for Dead Cells according to the game's Lead Designer, Sebastien Benard [2]. However, a common critique of procedurally generated levels is that they can often feel bland and lack the details that would normally be carefully added by a designer.

The amount of visual detail and the overall quality of the worlds in video games have risen considerably in recent years. With new technologies and more powerful hardware emerging in short timespans, many game developers have been trying to use the resulting larger graphics performance budgets by adding larger, higher-quality worlds and assets to their games. The increasing quality of games often results in larger install sizes. This can be seen, for example, in the Call of Duty video game franchise by Activision. In 2003, the first Call of Duty game was released for PC with an install size of 1.4 gigabytes. Since then, Activision has released a new Call of Duty game annually, with nearly every new game larger than its predecessors [3]. The install size of the newest game in the franchise, Call of Duty: Modern Warfare, with all of its content is roughly 175 gigabytes, more than a hundred times the size of the original [4]. Some developers, however, have used procedural generation to achieve smaller install sizes. A recent example can be found in No Man's Sky (Hello Games, 2016), a space exploration game that takes place in a procedurally generated universe which, according to its developers, consists of over 18 quintillion explorable planets [5]. Despite this enormous number, the actual install size of the game is roughly 10 gigabytes [6]. Compared to many recent AAA video games, and considering the sheer size of its world, the install size of No Man's Sky is extremely small.

In addition to large install sizes, procedural generation offers possible relief to another major problem in the game industry: development costs. Game development is often an iterative process, meaning many assets and features are constantly changed and remade until they reach satisfactory levels of completion [7]. Large numbers of changes to assets result in longer development times, which in turn result in higher development costs. The increasing costs were also a topic in the Blender Conference 2018 talk, The Next Leap: How A.I. will change the 3D industry, by professional 3D artist Andrew Price. In his talk, Price explains that expensive assets are by-products of a static workflow, and suggests that in the future procedural workflows and machine learning will become commonplace as developers strive for lower costs. [8] This talk by Price and the possibilities of procedural generation became some of the main motivations for this thesis.

The goal of this thesis is to design and implement a system for procedurally generating different types of 3D characters for video games. This focus was chosen because procedural characters play a crucial role in only a handful of games and the topic appears underexplored. Most importantly, there was no freely available, plug-and-play character generation tool that could be added to the development process of any kind of game, as described by Togelius et al. [9]

This thesis begins with an introduction to video games where procedurally generated characters and creatures play a major part in the worlds or mechanics. Following that, we describe the design and implementation of the procedural character generation system in detail. Finally, we present the results of this thesis by analyzing the advantages and disadvantages of the system and discussing the future of the system and of procedural character generation in general.

2. Background

In this chapter, we provide some background on how procedural content has been used in the video game industry. In the first part, we introduce video games where PCG has been used to generate characters and plays a crucial role in their core features. In the second part, we introduce some industry-standard software that can be used to make procedural assets. Finally, we analyze some academic research related to procedural characters.

2.1 Examples of procedural character generation in video games

2.1.1 Impossible Creatures

Impossible Creatures is a real-time strategy game developed by Relic Entertainment. The game was released in 2003 for the PC platform. At the beginning of the game, the player is introduced to the mysterious Sigma Technology that makes it possible to combine the body parts of different creatures. This feature of combining creatures becomes one of the main gameplay elements in Impossible Creatures, alongside the real-time strategy part where the player controls creatures to fight and defeat their enemies.

Combining monsters in Impossible Creatures is done inside a creature editor (Figure 1) where the player chooses two animals and selects 5 body parts from either of them. The game then procedurally combines the selected body parts to form the desired creature. Furthermore, the new creature will have different abilities and combat parameters depending on which creatures and body parts were used for its creation. In other words, the procedural generation in Impossible Creatures is not only used for visuals but for gameplay elements as well.

Figure 1: The creature editor in Impossible Creatures1

At its release in 2003, Impossible Creatures had 51 creatures that could be used to create new ones. Using basic combinatorics, we can calculate that there are 1275 possible creature pairings and 31 body part combinations for each pairing, which means 39525 combinations in total. Therefore, it is very likely that Relic Entertainment managed to save a considerable amount of time and money by creating a system where a relatively small pool of assets was hugely expanded through procedural algorithms.
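As a quick check of these figures, the pair count is a binomial coefficient and the total follows by multiplication; the per-pair count of 31 quoted above is consistent with five body part slots that can each be taken from either animal (2^5 = 32 assignments, of which the thesis counts 31):

\[
\binom{51}{2} = \frac{51 \cdot 50}{2} = 1275, \qquad 1275 \times 31 = 39525.
\]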

In an interview by the video game website IGN, the lead technical programmer of Impossible Creatures, Drew Dunlop, compared the creatures to LEGO models that could be broken down into individual pieces and rearranged to form a new creature. In order to achieve visually consistent combinations, the programmers defined a set of points that each body part could be attached to. Then, the 3D models and textures of the creatures were created in a way that they could easily be blended together. [10] The procedural character generation system presented in this thesis uses a simplified version of this approach to generate different characters.

1 Image source: https://store.steampowered.com/app/324680/Impossible_Creatures_Steam_Edition/

2.1.2 Spore

Spore is a real-time strategy game developed by Maxis. The game was released in 2008 for PC platforms. The gameplay of Spore is split into 5 different stages, starting from the Cell Stage where the player controls a small organism whose goal is to eat other organisms and grow. Once the organism has grown into a full-fledged creature, the game enters the Creature Stage where the player is greeted by a creature editor (Figure 2). The editor allows the player to create and customize their own creature by attaching premade, deformable body parts together. The editor also provides different color palettes and skin patterns to further customize the creatures. The game also randomly generates creatures as non-playable characters for the player to interact with.

Figure 2: The creature editor in Spore2

2 Image source: https://store.steampowered.com/app/17390/SPORE/

2.1.3 No Man's Sky

No Man’s Sky is a space exploration game developed by Hello Games. The game was originally released in 2016 for PC and Playstation 4. One of the key features of the game is the procedural generation of an entire galaxy which includes planets, flora, fauna and much more. According to Hello Games the entire galaxy of No Man’s Sky consists of over 18 quintillion procedurally generated planets. [5,11] Many planets contain a variety of procedurally generated creatures, which helps the planets feel more alive and worth exploring.

In 2014, one of the founders of Hello Games, Sean Murray, showcased their procedural generation tool (Figure 3) during a video interview by the video game magazine Game Informer. In the video, Murray is seen selecting a creature inside the tool and then generating dozens of similar but visually distinct creature variants in an instant. [12] The simplicity of the generation process highlighted the power of procedural generation and one of its main benefits: saving development time.

Figure 3: Hello Games’ procedural generation tool for No Man’s Sky3

3 Image source: https://www.youtube.com/watch?v=h-kifCYToAU

2.2 Procedural tools for video game development

2.2.1 Houdini

Houdini is procedural 3D software developed by Side Effects Software (SideFX) that is used in many graphics-related tasks, such as modeling and rigging. When creating a new 3D model in Houdini, every action related to the modeling process is stored inside a node. The final 3D model is the result of several nodes connected together. Each node contains parameters used to perform some action, which means that by making small modifications, such as changing node parameter values, removing a node or adding a node, it is possible to create similar but slightly different 3D models in a short period of time.

Figure 4: A simple 3D mesh in SideFX Houdini interface before (left) and after (right) removing a node.4

2.2.2 Autodesk Character Generator / Character Creator 3

Autodesk Character Generator and Character Creator 3 are 3D character creation tools developed by Autodesk and Reallusion, respectively. Both contain large selections of premade body parts, clothing and textures that can be combined to create customized characters. The characters can be further customized by changing various parameters, such as body part width, using sliders and buttons in the graphical user interface. The generated characters can then be exported into game engines or other modeling software for finishing touches.

4 Image source: https://www.sidefx.com/learn/getting_started/

Figure 5: Autodesk Character Generator interface

Figure 6: Character Creator 3 interface

2.3 Other tools

Hudson [13] presented genetic algorithms as one possible way to generate creatures procedurally in their creature generation tool for SideFX Houdini. The tool takes user input regarding the number and sizes of different body parts, such as legs, arms and wings, and then applies these input values to part-specific genetic algorithms. The algorithms are then performed inside Houdini to generate the creature and its animation rig. The tool is also capable of applying two genetic algorithm operations, crossover and mutation, which are used to make variations of previously generated creatures.

Another solution demonstrated by Bach and Madsen [14] uses principal component analysis to find the important components of an example 3D model and then applies reference fitting to these components to generate low-detail models similar to the example model. The method seemed to have been successful at generating low-detail humanoid characters, although the solution did not analyze whether the texture data of the example model could be used on generated models, which could be a target for further research in the future according to Bach and Madsen.

In addition to example 3D models, it has also been proven to be possible to generate characters based on a 2D image, as shown by Dvorožňák et al. [15] and their 3D modeling and animation tool “Monster Mash”. The tool takes a 2D sketch of a character drawn by a user, detects regions that represent a body part, inflates these regions into a 3D mesh and finally performs mesh deformation in order to separate overlapping body parts. Additionally, the tool lets the user animate the generated mesh by adding control points anywhere on the character’s surface. Each control point affects the generated mesh like a joint, which means that the user can record different poses for the generated mesh by dragging the control points around. Finally, an animation is created by looping through the defined poses.

Chelliah et al. [16] used machine learning to teach a neural network to generate 3D human face models from an image database. The results seem promising, and the possibility to expand the solution into generating full body models and creating animations is something to look forward to.

Although animations can easily be added to game characters during development, in most cases it would not be possible to add animations to a character that was generated during runtime. This was an important problem to solve for the developers of the game Spore, where players could create their own creatures with any number of limbs and other body parts. The solution that the developers came up with was real-time motion retargeting supported by an animation tool called Spasm, as described by Hecker et al. [17] The tool allows animators to create animations for characters and body parts the traditional way, by setting keys and editing animation curves, but it also transforms them into generalized, character-independent animations. These generalized animations would then be applied to creatures during the runtime of Spore, which would allow all creatures in the game to have believable animations regardless of their morphologies.

Altogether, compared to procedural generation solutions for buildings and vegetation, there is a lack of academic literature about procedurally generated humans or characters for video games, as stated by Gaisbauer and Hlavacs [18]. However, the growing interest around the topic in recent years shows there is room for new, innovative research approaches, such as character creation frameworks similar to Autodesk Character Generator, as mentioned by Freiknecht and Effelsberg [19], and deep learning, as stated by Liu et al. [20].

3. Design

This chapter goes through the terminology and design of a procedural character generation system which is the main focus of the thesis. The core principles behind the design of the system are as follows:

1. The system and its parts should be easily configurable and reusable, which should be achieved using a node-based design, similar to the workflow in Houdini.
2. The system should generate characters by combining different body parts, as seen in Impossible Creatures and Spore.
3. The system should support combining multiple characters into new characters, similar to the creature editor in Impossible Creatures and the crossover function of the creature generation tool by Hudson [13].
4. The procedural generation of the system should be deterministic, as seen in No Man's Sky. In other words, the system should always generate the same creatures when given the same input parameters.

3.1 Terminology

3.1.1 Unity Procedural Character System

Many features of the procedural character generation system presented in this thesis are designed with certain core features of Unity, a game engine developed by Unity Technologies, in mind. Such features include runtime mesh manipulation and raycast physics. As such, we decided to refer to our procedural system as Unity Procedural Character System (UPCS).

3.1.2 Node

According to the National Institute of Standards and Technology, a node is a “unit of reference in a data structure” [21]. Similarly, nodes in UPCS are used to store the data that are used to generate each creature. Most of the time, a node represents a type of body part, such as head or tail, of a character. The exceptions to this rule are nodes that are used to configure the default color palette and material properties of the character.

In UPCS, we define the individual parameters inside the nodes as properties. Each property affects some aspect of a body part, such as its scale or color. After all properties of a node have been set, UPCS uses them to generate the 3D mesh of the body part. All node properties are listed in the table in Appendices section 6.1, and their functionality inside UPCS is described in more detail in section 3.2.

Figure 7: Visualization of a node in UPCS
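To make the structure concrete, the following sketch shows roughly how a body part node could be represented as a data container; the class and field names are hypothetical illustrations rather than the actual UPCS source, and only a subset of the properties from Appendices section 6.1 is shown.

```csharp
using UnityEngine;

// Hypothetical sketch of a UPCS body part node: a plain data container whose
// properties drive the generation of one body part (see Appendices section 6.1).
public enum PartType { Body, Head, Eye, Arm, Foot, Tail }
public enum MirrorAxis { None, X, Y, Z }

public class BodyPartNode
{
    public PartType partType = PartType.Body;        // which type of body part this node produces
    public string meshType = "Sphere";               // base mesh; a sphere by default
    public string textureType = "Red";               // base UV texture; a red texture by default
    public Vector3 scale = Vector3.one;              // scale of the part, (1, 1, 1) by default
    public MirrorAxis mirrorAxis = MirrorAxis.None;  // axis for creating a mirrored duplicate node
    public Color[] colors = new Color[3];            // three-color palette used by the part's material

    // Raycast Settings, Visual Settings, Mesh Changers and Texture Changers
    // would be additional fields; they are omitted here for brevity.
}
```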

3.1.3 Modifier

For each type of property we also define at least one type of modifier. A modifier is used to modify the values of node properties. For example, in order to create a large head we first create a node with its Part Type value set to "Head". Then, we add a scale modifier that sets the Scale value of the node to a large value.

Figure 8: Visualization of a modifier affecting a property of a node in UPCS
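Continuing the node sketch above (again with hypothetical names rather than the actual UPCS code), a modifier can be thought of as an object that overwrites one property of its target node; the large-head example would then look roughly like this:

```csharp
using UnityEngine;

// Hypothetical sketch: each modifier rewrites one property of a target node.
public abstract class Modifier
{
    public abstract void Apply(BodyPartNode node);
}

public class ScaleModifier : Modifier
{
    public Vector3 scale = Vector3.one;

    public override void Apply(BodyPartNode node)
    {
        node.scale = scale;   // overwrite the node's Scale property
    }
}

public static class LargeHeadExample
{
    public static BodyPartNode Build()
    {
        // Create a node with its Part Type value set to Head...
        var head = new BodyPartNode { partType = PartType.Head };
        // ...then apply a scale modifier that sets the Scale value to a large value.
        new ScaleModifier { scale = new Vector3(3f, 3f, 3f) }.Apply(head);
        return head;
    }
}
```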

3.1.4 Blueprint

Lastly, we define a blueprint as a collection of nodes and modifiers. When UPCS receives a blueprint, it applies all modifiers to their target nodes. Each modifier may affect one or multiple nodes. Once the modifiers have been applied to the nodes, UPCS creates body parts one by one using the node properties, which then form a single character when combined. In addition, the blueprint stores default values for the Colors and Visual Settings node properties, which are used for nodes that have undefined values for these properties.

In Figure 9, we’ve visualized a complete blueprint for a simple slime creature with a dome-shaped body and two eyes (Figure 10). This example blueprint consists of a body node and an eye node. For the body node we’ve added a Colors modifier that sets the body color to pink and a Mesh modifier that flattens the bottom of the default body part mesh which is a sphere. Similarly, we’ve added a Colors modifier to the eye node for setting the light green color. In addition, a Scale modifier and a Raycast Settings modifier have been used to define the size and position of the eye relative to the body. Finally, we’ve applied a Mirror Axis modifier to set the Mirror Axis property value to Y, which instructs UPCS to dynamically create another eye node with the position mirrored against Y-axis.

Figure 9: A blueprint visualization of a slime creature

Figure 10: A simple 3D slime randomly generated with UPCS

In the previous example, we used modifiers to set node property values. UPCS also supports setting these values directly in the nodes, as we did for the Part Type properties shown in Figure 9. The reason for using modifiers to set the other property values is to make the blueprint more susceptible to changes from blueprint mutations and blueprint fusions, which are features in UPCS used to create new blueprints. These features are explained in more detail in sections 3.2.2 and 3.2.3, respectively. Another reason behind designing UPCS around blueprints, nodes and modifiers is reusability. A node can be reused between different blueprints, and modifiers can be reused between different nodes. Therefore, we minimize the amount of duplicate data when creating multiple blueprints. This approach was partly inspired by the 3D modeling workflow in SideFX Houdini.
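Building on the node and modifier sketches above, a blueprint can be represented as a list of nodes with the modifiers attached to them, plus the default Colors and Visual Settings values; generation then starts by applying every modifier to its target node. The names below are hypothetical, not the actual UPCS source:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch of a UPCS blueprint: nodes, their modifiers, and default values.
public class BlueprintEntry
{
    public BodyPartNode node;
    public List<Modifier> modifiers = new List<Modifier>();
}

public class Blueprint
{
    public int seed;                                         // drives deterministic mutation and fusion
    public Color[] defaultColors = new Color[3];             // used by nodes with an undefined Colors value
    public List<BlueprintEntry> entries = new List<BlueprintEntry>();

    // First step of generation: apply all modifiers to their target nodes.
    public void ApplyModifiers()
    {
        foreach (var entry in entries)
            foreach (var modifier in entry.modifiers)
                modifier.Apply(entry.node);
    }
}
```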

3.2 Creature generation

In order to generate creatures using UPCS, it is necessary to create blueprints that determine how each creature should be generated. In the following sections, we present three ways for a developer to create new blueprints in UPCS.

3.2.1 Creating a blueprint from scratch

The first way to create a blueprint for UPCS is to construct one from scratch. This means either creating new nodes and modifiers, reusing old nodes and modifiers or a mix between the two.

Using the slime blueprint described in section 3.1.4 as an example, if we wanted to create the same blueprint from scratch, we would need to create the nodes and modifiers shown in Figure 9. Now, if we wanted to create a new blueprint for a creature with a pink dome-shaped body or green eyes, we could reuse the same nodes created for the slime blueprint. In case some property, such as the eye color, was not suitable for this creature, we would create a new eye node with the same modifiers except the Colors modifier, which would be replaced by a new Colors modifier with a different value. Additionally, this new modifier, as well as the pink and green Colors modifiers, can be reused for any new node that should have the same color.

Sometimes there can be cases where the existing modifiers in UPCS are unable to achieve the look that we desire. In this situation, we would need new implementations that handle these cases. For example, we could want to add wrinkles to the body of a character. The goal here is to modify the mesh of the character, but none of the existing Mesh Changer implementations are able to do that, so we need to create a new type of Mesh Changer that takes in certain parameters, such as wrinkle size and shape. Finally, we create a modifier that modifies these parameters, after which we have a new UPCS modifier that can be used in all future blueprints.

3.2.2 Mutating an existing blueprint

The second way to create new blueprints is to mutate an existing blueprint. In the context of UPCS, mutation means programmatically changing the values of modifiers that are connected to the nodes and creating a new blueprint using changed nodes and modifiers.

The way each modifier mutates differs depending on which node property the modifier is affecting. For basic properties, such as Scale and Colors, we slightly change each numerical value within predetermined minimum and maximum values, similar to the mutation algorithm used by Hudson [13]. A table with descriptions on how each modifier mutates can be found in Appendices section 6.2. As seen from the table, Part Type and Mirror Axis modifiers do not mutate at all as a precaution against bizarre results. As an example, if an eye node mutated into a hand node there would be cases where UPCS would generate fingers on top of eyes on certain blueprints. Such behavior could result in interesting characters, but hurts the overall quality of the generated characters. Each node and modifier can also be flagged so that UPCS does not mutate them at all.

Looking at the example blueprint presented in section 3.1.4, we see from Figure 9 that the blueprint contains Colors and Mesh Changer modifiers for the body and Colors, Scale, Raycast Settings and Mirror Axis modifiers for the eyes. Applying the mutation rules from section 6.2 to the blueprint results in a new blueprint where:

- The values of the Colors modifiers for both the body and the eyes have changed randomly.
- The values of the bottom-flattening Mesh Changer modifier have changed slightly.
- The values of the position-controlling Raycast Settings modifier have changed slightly.

In conclusion, the slime creature generated from the mutated blueprint will have different body and eye colors, different body height and differently positioned eyes. Examples of slime creatures generated from mutated blueprints can be seen in the Results section 4.3 (Figure 13).

Although each value resulting from a mutation is generated randomly, UPCS uses a pseudo-random number generator to produce the new values for mutated modifiers, using a seed number that the user assigns to the blueprint. In other words, a user can assign a number to a blueprint, and until that number is changed the blueprint will always mutate in the same way, upholding the fourth UPCS design principle, deterministic generation.
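A minimal sketch of this behaviour (hypothetical helper names, not the actual UPCS implementation) is to draw every random number from a single System.Random instance seeded with the blueprint's seed, so the same seed always reproduces the same mutations; the scale rule from Appendices section 6.2 (up to 50% smaller or larger) could then look like this:

```csharp
using UnityEngine;

// Hypothetical sketch of deterministic mutation: all randomness comes from one
// pseudo-random generator seeded with the blueprint's seed number.
public static class MutationSketch
{
    // Scale mutates to a value up to 50% smaller or larger (Appendices section 6.2).
    public static Vector3 MutateScale(Vector3 scale, System.Random rng)
    {
        float Factor() => 0.5f + (float)rng.NextDouble(); // random factor in [0.5, 1.5)
        return new Vector3(scale.x * Factor(), scale.y * Factor(), scale.z * Factor());
    }

    public static void Example()
    {
        var rng = new System.Random(42);                 // seed number assigned to the blueprint
        Vector3 mutated = MutateScale(Vector3.one, rng); // the same seed always yields the same result
        Debug.Log(mutated);
    }
}
```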

The goal behind the design of the mutation feature is to enable UPCS to generate similar variants of already designed characters. Similar to palette swaps, character mutations can help make game worlds feel more alive by making the characters visually distinct from each other without using much space or memory. As an additional motivation, the mutation feature would go well together with certain types of role-playing games where the playable characters grow while progressing through the game, usually in the form of increasing combat parameters. Adding small changes to characters through mutations would be simple using UPCS, since its implementation is made in the same game engine that would be used to create all the logic inside a game.

3.2.3 Fusing blueprints

The third way to create new blueprints is by fusing two existing blueprints together. In the context of UPCS, fusion means combining the nodes and modifiers of two blueprints into a new blueprint.

When UPCS receives the blueprints for fusion, it first creates an empty blueprint. Then, it tries to find nodes that fulfill the same purpose in both blueprints, such as generating the same type of body part. These matching nodes are fused together, while the other nodes have a random chance to be added to the newly created blueprint. Node fusion works the same way as blueprint fusion, except with modifiers. The logic behind modifier fusion differs between modifiers, but in most cases we randomly interpolate between the values of the modifiers. Finally, the fused nodes are added into the new blueprint, finishing the fusion process. Similar to mutation, blueprint fusion is a deterministic process where the values of the resulting blueprint are determined by a pseudo-random number generator.
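The sketch below illustrates the two kinds of randomness described above, interpolating matching numeric values at a random point and keeping unmatched nodes with a random chance, again driven by a seeded generator; the names and the 50% chance are hypothetical illustrations rather than the actual UPCS values:

```csharp
using UnityEngine;

// Hypothetical sketch of fusion decisions: interpolate matching modifier values
// at a random point and keep unmatched nodes with a random chance. Using a
// seeded System.Random keeps the fusion deterministic.
public static class FusionSketch
{
    public static Vector3 FuseScales(Vector3 a, Vector3 b, System.Random rng)
    {
        float t = (float)rng.NextDouble();   // random interpolation point in [0, 1)
        return Vector3.Lerp(a, b, t);
    }

    public static bool KeepUnmatchedNode(System.Random rng)
    {
        return rng.NextDouble() < 0.5;       // assumed 50% chance to carry the node over
    }
}
```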

Currently, there is no way to determine if Mesh Changer modifiers from two different nodes fulfill the same purpose. Due to this, UPCS will prioritize the Mesh Changer modifiers of the first blueprint when fusing blueprints. In other words, fusing blueprint A into blueprint B will yield different results than fusing blueprint B into blueprint A.

The goal behind the design of the fusion feature is to enable UPCS to generate characters that are strongly distinct from already designed characters, which could bring inspiration and new ideas for character designers. In theory, adding a new blueprint to the system will greatly increase the number of possible character variations, since the new blueprint can be fused into every previous blueprint. The fusion feature would go well with certain types of monster-raising games where players would be able to fuse monsters together or breed new monsters from two parent monsters. For example, in the Pokémon (Game Freak) games the player can breed new Pokémon from two parents. However, the resulting Pokémon is always from the same evolution line as one of the parents, because the concept of fusion does not exist in the franchise. Similar to mutation, the UPCS implementation of the fusion feature makes it easy to combine character generation and fusion logic with gameplay logic.

4. Implementation

Unity is a cross-platform game engine developed by Unity Technologies that was originally launched in 2005. Currently, the engine supports publishing games on over 20 different platforms [22]. Based on data available from popular game hosting services such as itch.io, Unity has been one of the most popular engines used for game development in recent years [23]. Some of the popular video games made with the engine include Hearthstone (Blizzard Entertainment, 2014), Cities: Skylines (Colossal Order, 2015), Cuphead (Studio MDHR, 2017) and Fall Guys (Mediatonic, 2020) [22,24]. In addition, all individual developers who make less than $100,000 in annual revenue can use Unity for free. The popularity and accessibility of the engine were the main reasons why Unity was chosen as the base of UPCS.

In this chapter, we describe the implementation of UPCS inside Unity. First, we introduce certain features in Unity that play a crucial role in the implementation of UPCS. Afterwards, we describe the implementation of the system in more detail. Lastly, we present the graphical user interface of UPCS and how a developer would use it to generate characters.

4.1 Important Unity features

4.1.1 Physics

Unity has a built-in physics system for detecting collisions. One of the features of the physics system, called raycasting, shoots an invisible ray from a point towards a target direction to detect if there is a collider, a user-defined invisible area, in its path. This feature is used by UPCS to determine how different body parts should be positioned relative to each other using the values in the Raycast Settings node property.
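As a rough illustration of how raycasting can be used for this purpose (a sketch under assumed names, not the actual UPCS code), the child part can be placed at the point where a ray hits the parent part's collider:

```csharp
using UnityEngine;

// Hypothetical sketch: position a child body part by raycasting towards the
// parent body part and placing the child at the hit point on its collider.
public static class RaycastPlacementSketch
{
    public static void AttachToParent(Transform childPart, Vector3 rayOrigin, Vector3 rayDirection)
    {
        RaycastHit hit;
        // The parent body part is assumed to have a Collider component attached.
        if (Physics.Raycast(rayOrigin, rayDirection, out hit, 10f))
        {
            childPart.position = hit.point;                           // place the part on the parent's surface
            childPart.rotation = Quaternion.LookRotation(hit.normal); // orient it along the surface normal
        }
    }
}
```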

4.1.2 Mesh manipulation

Unity makes it possible to modify a 3D mesh at runtime. This includes adding and removing vertices, normals and triangles. Mesh manipulation is used by UPCS to apply changes to the meshes of generated characters using the values in Mesh Changer node properties.
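A minimal sketch of such a runtime mesh edit (not the actual Mesh Changer implementation) reads the vertex array, moves the vertices and writes the array back; flattening the bottom of a sphere, as done for the slime body in section 3.1.4, could look like this:

```csharp
using UnityEngine;

// Hypothetical sketch of a runtime mesh edit: flatten all vertices below a given
// height, similar to the bottom-flattening Mesh Changer used for the slime body.
public static class MeshEditSketch
{
    public static void FlattenBottom(MeshFilter meshFilter, float floorY)
    {
        Mesh mesh = meshFilter.mesh;          // per-instance copy of the shared mesh
        Vector3[] vertices = mesh.vertices;   // reading .vertices returns a copy of the array

        for (int i = 0; i < vertices.Length; i++)
            if (vertices[i].y < floorY)
                vertices[i].y = floorY;       // clamp vertices below the floor level

        mesh.vertices = vertices;             // write the modified vertices back
        mesh.RecalculateNormals();            // keep lighting correct after the edit
        mesh.RecalculateBounds();
    }
}
```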

4.1.3 Scriptable objects

Scriptable objects are built-in objects in Unity mostly used for storing data. In most cases, each scriptable object is an asset file that can store references to other assets. In UPCS, scriptable objects are used to store blueprint, node and modifier data as well as configuration settings. Each of these objects can be easily created from the menu toolbar in the Unity editor.
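For example, a modifier asset could be defined roughly as follows (a hypothetical sketch, not the actual UPCS source); the CreateAssetMenu attribute is what makes the object creatable from the menu toolbar in the Unity editor:

```csharp
using UnityEngine;

// Hypothetical sketch of a UPCS modifier stored as a scriptable object asset.
// The attribute adds an entry to the Assets > Create menu in the Unity editor.
[CreateAssetMenu(fileName = "ScaleModifier", menuName = "UPCS/Scale Modifier")]
public class ScaleModifierAsset : ScriptableObject
{
    public Vector3 scale = Vector3.one;   // the Scale value this modifier writes into its target nodes
}
```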

4.2 UPCS

To ensure that UPCS would be easily configurable, we made sure that blueprints, nodes and modifiers could be represented as Unity scriptable objects. Because scriptable objects are able to reference each other, creating a new blueprint in Unity can be described in the following steps:

1. Create a blueprint object.
2. Create new node objects if needed.
3. Create new modifier objects and set their values if needed.
4. Drag references of required modifier objects into required node objects.
5. Drag references of required node objects into the blueprint object.

Node or modifier objects that were created for previous blueprints can be reused for new blueprints, which means that we can skip the creation step for these objects when creating new blueprint objects.

Figure 11: Part of the Unity editor interface with windows for a blueprint and node object open

Figure 11 shows the relevant windows inside the Unity editor interface when creating a blueprint object. The top-left section of the figure shows the window for a blueprint with three node references added. The right section shows the window for one of the nodes with three modifier references added. The bottom-left section shows the project explorer with a folder containing one blueprint and two node objects open. One of the ways to add a node object reference to a blueprint object is to drag and drop it from the project explorer into the blueprint object window.

4.2.1 Graphical user interface

Once a blueprint is ready it can be used to generate a character from the graphical user interface (GUI) of the UPCS (Figure 12). The left side panel of the GUI contains selectable buttons, each representing a blueprint. The right side panel contains a slider for controlling the strength of mutations for generated characters and buttons for generating new characters, removing generated characters, grouping generated characters together and ungrouping previously grouped characters for viewing purposes. Additionally, there is a statistics box at the bottom of the right side panel which shows how many character generations were successful, partially successful or failures. If a generation was partially successful, it means that UPCS was unable to apply one or more node modifiers. Similarly, if a generation was a failure, it means that UPCS was unable to find or hit a parent node using the Raycast Settings values of a node.

Figure 12: GUI of UPCS

Each successfully generated character can be saved as a new blueprint by clicking on the save button at the bottom of the GUI. Another way to add new blueprints is to fuse two blueprints together which can be done in the GUI by shift-clicking an unselected blueprint button.

4.2.2 Generation logic

When the generate button is clicked, UPCS first generates a character based on the selected blueprint. Then, it creates new blueprints by mutating the selected blueprint and generates a mutated character for each new blueprint. The generation process of each blueprint begins by applying all modifiers to their respective nodes. When finished, UPCS goes through each node one by one. Then, for each body part node UPCS creates a body part based on the values of the node properties listed in Appendices section 6.1. The creation process goes as follows:

1. Assign Colors values to the material of the body part.
2. Assign Visual Settings values to the material of the body part.
3. Get the 3D mesh asset that matches the Mesh Type value from a configuration file. If there are values in Mesh Changers, create a copy of the mesh asset and apply the changers to the new asset.
4. Assign the mesh to the body part.
5. Get the texture asset that matches the Texture Type value from a configuration file. If there are values in Texture Changers, create a copy of the texture asset and apply the changers to the new asset.
6. Assign the texture to the material of the body part.
7. Find the position of the body part by raycasting to the parent body part according to Raycast Settings. (This step is skipped for the first body part.)
8. Set the position of the body part.
9. Set the size of the body part according to the Scale value.
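A heavily condensed sketch of these steps is shown below; the asset lookups, changers and raycast logic are replaced by plain parameters, and the shader property name is an assumption, so this is an illustration of the order of operations rather than the actual UPCS code:

```csharp
using UnityEngine;

// Hypothetical, condensed sketch of steps 1-9: build one body part GameObject.
public static class BodyPartCreationSketch
{
    public static GameObject CreateBodyPart(string partName, Material baseMaterial,
                                            Color firstColor, Mesh mesh, Texture2D texture,
                                            Vector3 position, Vector3 scale)
    {
        var part = new GameObject(partName);
        var filter = part.AddComponent<MeshFilter>();
        var renderer = part.AddComponent<MeshRenderer>();

        // Steps 1-2: assign Colors and Visual Settings values to the part's material.
        var material = new Material(baseMaterial);
        material.SetColor("_Color1", firstColor);   // "_Color1" is an assumed shader property name
        renderer.material = material;

        // Steps 3-6: assign the (possibly changer-modified) mesh and texture.
        filter.mesh = mesh;
        material.mainTexture = texture;

        // Steps 7-9: position found by raycasting against the parent part, then apply the scale.
        part.transform.position = position;
        part.transform.localScale = scale;
        return part;
    }
}
```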

The body parts use a custom material that combines three colors and a texture to control the visual appearance of the body parts. Each color is matched with the red, green and blue channels of the texture. For example, if the material receives a red texture, the body part will be covered completely in the first color. The material also has parameters for lighting and shadows which are modifiable by the Visual Settings node property.
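The thesis does not spell out the exact shader, but one reading consistent with the red-texture example above is a linear blend of the three colors weighted by the texture's channel values:

\[
\text{output} = t_R\, c_1 + t_G\, c_2 + t_B\, c_3,
\]

where \(t_R\), \(t_G\) and \(t_B\) are the red, green and blue channel values of the texture at the sampled point and \(c_1\), \(c_2\) and \(c_3\) are the three material colors.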

UPCS uses configuration files to store references to textures and meshes used for body part generation. Each Texture Type and Mesh Type value is mapped to a texture or mesh respectively. By design, body part nodes use mainly simple textures (single color and horizontally or vertically split two-color textures) and meshes (spheres, cylinders and cubes) to define the base appearance of the body part. However, it is possible to add custom textures and meshes to achieve better quality or reduce the number of modifiers required to achieve similar results. An example of a custom texture being used can be seen in the Results section 4.3 (Example #4). In order to optimize the size of UPCS, all non-custom textures are generated dynamically when they’re needed by the system for the first time.

The Texture Changers and Mesh Changers node properties are used to add finer details to the appearance of the generated characters, similar to what UV painting and sculpting would do in a 3D modeling software. Just like body part nodes, each Texture and Mesh Changer has its own Raycast Settings property which is used to determine the location and direction of the changes that we want to apply. As an example, instead of using eye nodes for the slime creature presented in section 3.1.4, we could create a Texture Changer that uses the same Raycast Settings values to paint a circle in the same location. At the moment, UPCS is able to use Texture Changers to paint ellipses or custom shapes defined by a set of points onto body parts. On the other hand, Mesh Changers can be used to stretch, shrink and flatten a mesh.

When a blueprint is selected to be mutated or fused with another blueprint, UPCS applies changes to the nodes and modifiers as described in sections 3.2.2 and 3.2.3. Sometimes, fusing Mesh Changer modifiers from different nodes together can cause unnatural mesh artifacts, which can be seen in Examples #9 and #13 in the Results section 4.3. Currently, UPCS is unable to detect these artifacts, which is something that should be fixed in future iterations.

4.3 Results

In order to evaluate the capabilities of UPCS, we created example blueprints based on existing animals, video game creatures and characters. For normal examples, we generated a character based on a blueprint without mutations and then eight characters based on randomly mutated blueprints. For examples using fused blueprints, we fused the selected blueprints nine times using a different random seed and then generated a character for each blueprint. The generations were performed on a 2.5GHz i5-7300HQ machine inside the Unity Editor. The resulting generation times can be seen in Appendices section 6.3.

- Example #1: Simple slime characters created using spheres and dynamic textures. The shape of the body was modified using a Mesh Changer.

Figure 13: Slime creatures generated using UPCS

- Example #2: Simple panda characters based on real-life pandas. Once again, only a combination of spheres, dynamic textures and Mesh Changers was used.

Figure 14: Panda characters generated using UPCS

- Example #3: Mouse characters based on Pikachu from the Pokémon (Game Freak) franchise. In addition to the spheres and dynamic textures from the previous examples, Texture Changers were used to add the details on the characters' faces and eyes. The lighting and shadow related values of the blueprint were configured to make the unmutated character match the original Pikachu as closely as possible. This resulted in some of the mutated characters with lighter colors having visual clarity issues due to lack of contrast. Although these issues can be fixed manually, a more flexible solution would be to enable users to limit the values of mutated modifiers, as mentioned in section 4.2.2 in relation to possible issues with mutations.

Figure 15: Pikachu-based characters generated using UPCS

- Example #4: Butterfly characters based on Butterfree from the Pokémon franchise. Unlike in previous examples, we used a cylinder mesh for the wings in addition to the sphere mesh for the other parts. We also made a custom texture for the pattern on the characters' wings, because the current implementation of UPCS lacks the Texture Changers needed to create a pattern that would look similar enough to the pattern on the wings of the referenced character.

Figure 16: Butterfree-inspired character and its mutations generated using UPCS

- Example #5: Humanoid rock-based characters, inspired by Gotsumon from the Digimon (Bandai) franchise. Created with spheres combined with Mesh and Texture Changers.

Figure 17: Gotsumon-inspired character and its mutations generated using UPCS

- Example #6: Bear characters based on Monzaemon from the Digimon franchise. Created with spheres combined with Mesh and Texture Changers.

Figure 18: Monzaemon-inspired character and its mutations generated using UPCS

- Example #7: Characters generated by fusing the Example #3 blueprint into the Example #5 blueprint.

Figure 19: Characters generated from fused Gotsumon-Pikachu blueprints

- Example #8: Characters generated from the mutated blueprint of one of the characters generated in Example #7.

Figure 20: Characters generated from a mutated blueprint of a Gotsumon-Pikachu fusion character

- Example #9: Characters generated by fusing the Example #5 blueprint into the Example #3 blueprint. In other words, this is the same as Example #7 except that the order of the blueprints was reversed.

Figure 21: Characters generated from fused Pikachu-Gotsumon blueprints

- Example #10: Characters generated by fusing the Example #5 blueprint into the Example #4 blueprint.

Figure 22: Characters generated from a mutated blueprint of a Pikachu-Gotsumon fusion character

- Example #11: Characters generated from a blueprint that was created through the following process:
1. Fusing the Example #4 blueprint into the Example #5 blueprint.
2. Fusing the Example #6 blueprint into the Example #1 blueprint.
3. Fusing the blueprints from steps 1 and 2 together.

Figure 23: Characters generated from blueprints that underwent multiple fusions.

- Example #12: Characters based on Moogles from the Final Fantasy (Square Enix) franchise. Created with spheres combined with Mesh Changers.

Figure 24: Moogle-inspired character and its mutations generated using UPCS

- Example #13: Characters generated from a blueprint that was created through the following process:
1. Fusing the Example #3 blueprint into the Example #12 blueprint.
2. Fusing the Example #5 blueprint into the Example #1 blueprint.
3. Fusing the blueprints from steps 1 and 2 together.

Figure 25: Characters generated from blueprints that underwent multiple fusions.

5. Conclusion

We have presented the design for a procedural character generation system capable of generating large amounts of creature-like 3D characters by enabling users to define the character generation process using procedural rules that are represented by nodes. The node-based design makes the system flexible and easily configurable. For example, nodes can be reused between different characters, and simply replacing one node with another can modify the appearance of a character significantly. The design of the system also makes it easy to mutate and fuse previously generated characters together, meaning that every newly designed character increases the character variety of the system by a large amount. The character generation process is fast compared to the traditional way of 3D modeling and uses mostly simple assets, such as sphere meshes and dynamically generated textures, which keeps the memory and disk space usage low. Compared to previous procedural generation solutions, our system was directly implemented inside a popular game engine, making it more accessible to developers and more flexible for the creation of game mechanics related to the generation process.

One of the main trade-offs of our procedural character generation system is the lowered quality of generated characters. Traditionally, 3D characters are created through the meticulous work of an artist. Emulating such detailed work requires a wide variety of ways to modify meshes and textures, something that our implementation of the system lacks in its current state. In addition to generation, the character mutation and fusion logic would also need improvements in order to reduce the number of generated characters with unnatural body proportions. Other possible improvements include optimizing the character generation speed, simplifying the creation process of blueprints and adding animation support for the generated characters. Currently, it is possible to create animations for generated characters directly inside Unity, which we did for our web demo. However, due to the lack of proper animation rigging the results end up looking rigid and unnatural. Exploring procedural options such as automatic rigging, real-time motion retargeting and procedural animations would be desirable if we want to further optimize the character creation process for game developers.

In conclusion, our system is capable of generating a large number of simple 3D characters for video games inside the Unity game engine in a short period of time. Although the visual quality of generated characters is often low, the automation of the character design and creation tasks can ease the burden on game developers with limited budget or development time. Therefore, we recommend the usage of the system mainly for game projects with short development cycles or lower standards for visual quality, such as mobile games, game prototypes or game jam projects.

5.1 Acknowledgements

I would like to extend my thanks to Perttu Hämäläinen for providing valuable feedback and suggestions for the thesis as the supervisor and advisor. I would also like to thank Teemu Leinonen for his support with the writing and research process. Finally, I want to express my appreciation for all the support and encouragement given by my family and my friends.

6. Appendices

6.1 List of body part node properties

Property Name: Property Description

Part Type: The type of body part that is created from the node, e.g. body, head, eye, arm or foot.

Mesh Type: Defines the base mesh of the body part. The default mesh is a sphere.

Texture Type: Defines the base UV texture used by the mesh. The default UV texture is a red texture.

Scale: The scale/size of the body part in 3 dimensions. (1, 1, 1) by default.

Raycast Settings: Defines how the body part should be positioned relative to its parent body part. Consists of the following parameters:
- Raycast Direction: The direction of the raycast that determines the body part position.
- Raycast Target: The relative position inside the bounding box of the body part that the raycast should go through. For example, with the value (0, 0, 0) the raycast would go through the center of the body part and with (0.5, 0.5, 0) it would go through the top-right side of the body part.
- Mirror Direction: Should Raycast Direction be mirrored against Mirror Axis.
- Mirror Target: Should Raycast Target be mirrored against Mirror Axis.
- Target Part Type: The part type of the parent body part that the raycast should be targeted at.
- Secondary Target Part Types: A list of substitute Part Types, in case there is no matching body part for Target Part Type.

Mirror Axis: Defines the axis (X, Y or Z) that is used to create a duplicate, mirrored node if needed. For example, setting Mirror Axis to Y can create a right eye node from a left eye node. None (meaning that UPCS does not generate a mirrored node) by default.

Colors: The color palette of the body part, defined by 3 colors.

Visual Settings: Settings related to lighting/shadows used by the material of the body part.

Mesh Changers: A collection of mesh changers that are used to modify vertices of the base mesh.

Texture Changers: A collection of texture changers that are used to modify the base UV texture.

6.2 Mutation descriptions for each modifier/node property

Modifier/Property Name: Mutation Description

Part Type: Does not mutate.

Mesh Type: Mutates to a similar mesh type. List of similar mesh types defined in a configuration file in UPCS.

Texture Type: Mutates to a similar texture type. List of similar texture types defined in a configuration file in UPCS.

Scale: Mutates to a scale that is up to 50% smaller or larger.

Raycast Settings: The parameters mutate as follows:
- Raycast Direction: Does not mutate.
- Raycast Target: Values mutate up to 10%.
- Mirror Direction: Does not mutate.
- Mirror Target: Does not mutate.
- Target Part Type: Does not mutate.
- Secondary Target Part Types: Does not mutate.

Mirror Axis: Does not mutate.

Colors: All 3 colors can mutate to any color.

Visual Settings: Does not mutate.

Mesh Changers: Mutations depend on the type of mesh changer. As a general rule, mutated values should not differ from the original by over 50%.

Texture Changers: Mutations depend on the type of texture changer. As a general rule, mutated values should not differ from the original by over 50%.

6.3 Generation performance

Average generation time (9 characters) per example:

Example #1 (Slime): 100 ms
Example #2 (Panda): 380 ms
Example #3 (Pikachu): 460 ms
Example #4 (Butterfree): 600 ms
Example #5 (Gotsumon): 950 ms
Example #6 (Monzaemon): 200 ms
Example #7 (Gotsumon-Pikachu fusion): 1800 ms
Example #8 (Gotsumon-Pikachu): 800 ms
Example #9 (Pikachu-Gotsumon fusion): 2200 ms
Example #10 (Butterfree-Gotsumon fusion): 2300 ms
Example #11 (Multi-fusion character): 900 ms
Example #12 (Moogle): 520 ms
Example #13 (Multi-fusion character 2): 860 ms

6.4 Web demo link

A web demo of our UPCS implementation can be found at the following URL: https://zanagi.github.io/upcs/

References

1. Hendrikx M, Meijer S, Van Der Velden J, Iosup A. Procedural content generation for games: A survey. ACM Trans Multimedia Comput Commun Appl. 2013;9: 1–22.

2. Benard S. Building the Level Design of a procedurally generated Metroidvania: a hybrid approach. In: Gamasutra [Internet]. 29 Mar 2017 [cited 23 Mar 2021]. Available: https://www.gamasutra.com/blogs/SebastienBENARD/20170329/294642/Building_the_Level_Design_of_a_procedurally_generated_Metroidvania_a_hybrid_approach.php

3. Marks T. Every Call of Duty Install Size Compared, From 2003 to Modern Warfare. In: IGN [Internet]. 25 Oct 2019 [cited 24 Jan 2021]. Available: https://www.ign.com/articles/2019/10/25/every-call-of-duty-install-size-compared-from-2003-to-modern-warfare

4. Activision Publishing, Inc. Call of Duty®: Modern Warfare | PC. In: Call of Duty®: Modern Warfare [Internet]. [cited 24 Jan 2021]. Available: https://www.callofduty.com/modernwarfare/pc

5. Khatchadourian R. World Without End - Creating a full-scale digital cosmos. In: The New Yorker [Internet]. 11 May 2015 [cited 24 Jan 2021]. Available: https://www.newyorker.com/magazine/2015/05/18/world-without-end-raffi-khatchadourian

6. No Man’s Sky on Steam. In: Steam [Internet]. [cited 23 Mar 2021]. Available: https://store.steampowered.com/app/275850/No_Mans_Sky/

7. Kultima A. Developers’ perspectives on iteration in game development. Proceedings of the 19th International Academic Mindtrek Conference on - AcademicMindTrek ’15. 2015. doi:10.1145/2818187.2818298

8. The Next Leap: How A.I. will change the 3D industry - Andrew Price. In: YouTube [Internet]. 5 Nov 2018 [cited 24 Jan 2021]. Available: https://www.youtube.com/watch?v=FlgLxSLsYWQ

9. Togelius J, Champandard AJ, Lanzi PL, Mateas M, Paiva A, Preuss M, et al. Procedural content generation: Goals, challenges and actionable steps. Schloss Dagstuhl - Leibniz-Zentrum fuer Informatik GmbH, Wadern/Saarbruecken, Germany; 2013. p. 75.

10. Sulic I. Impossible Development Diary. In: IGN [Internet]. 20 May 2012 [cited 25 Apr 2021]. Available: https://www.ign.com/articles/2002/11/18/impossible-development-diary

11. Baines T. Maths, No Man’s Sky, and the problem with procedural generation. In: Thumbsticks [Internet]. 31 Aug 2016 [cited 24 Jan 2021]. Available: https://www.thumbsticks.com/no-mans-sky-problem-procedural-generation/

12. Game Informer. A Behind-The-Scenes Tour Of No Man's Sky's Technology. In: YouTube [Internet]. 5 Dec 2014 [cited 24 Jan 2021]. Available: https://www.youtube.com/watch?v=h-kifCYToAU

13. Hudson J. Creature Generation using Genetic Algorithms and Auto-Rigging. [cited 1 Apr 2021]. Available: https://nccastaff.bmth.ac.uk/jmacey/OldWeb/MastersProjects/MSc13/06/Jon%20Hudson%20Thesis.pdf

14. Bach E, Madsen A. Procedural Character Generation: Implementing Reference Fitting and Principal Components Analysis. Madsen CB, Bangsø O, editors. MSc, Aalborg Universitet. 2007.

15. Dvorožňák M, Sýkora D, Curtis C, Curless B, Sorkine-Hornung O, Salesin D. Monster mash: a single-view approach to casual 3D modeling and animation. ACM Trans Graph. 2020;39: 1–12.

16. Chelliah BJ, Vallabhaneni VK, Lenkala SR, Mithran J, Reddy KKM. 3D Character Generation using PCGML. International Journal of Innovative Technology and Exploring Engineering (IJITEE). 2019;8: 5.

17. Hecker C, Raabe B, Enslow RW, DeWeese J, Maynard J, van Prooijen K. Real-time motion retargeting to highly varied user-created morphologies. ACM Trans Graph. 2008;27: 1–11.

18. Gaisbauer W, Hlavacs H. Procedural attack! Procedural generation for populated virtual cities: A survey. Int J Serious Games. 2017;4. doi:10.17083/ijsg.v4i2.161

19. Freiknecht J, Effelsberg W. A Survey on the Procedural Generation of Virtual Worlds. Multimodal Technologies and Interaction. 2017;1: 27.

20. Liu J, Snodgrass S, Khalifa A, Risi S, Yannakakis GN, Togelius J. Deep learning for procedural content generation. Neural Comput Appl. 2021;33: 19–37.

21. Black PE. node. In: National Institute of Standards and Technology [Internet]. 17 Dec 2004 [cited 24 Jan 2021]. Available: https://xlinux.nist.gov/dads/HTML/node.html

22. Unity Technologies. Unity Real-Time Development Platform | 3D, 2D VR & AR Engine. In: Unity [Internet]. [cited 7 Mar 2021]. Available: https://unity.com/

23. Toftedahl M, Engström H. A Taxonomy of Game Engines and the Tools that Drive the Industry. DiGRA 2019, The 12th Digital Games Research Association Conference, Kyoto, Japan, August, 6-10, 2019. Digital Games Research Association (DiGRA); 2019. Available: https://www.diva-portal.org/smash/record.jsf?pid=diva2:1352554

24. Drake J. 10 Great Games That Use The Unity Game Engine | TheGamer. In: THEGAMER [Internet]. 27 Feb 2020 [cited 8 Mar 2021]. Available: https://www.thegamer.com/unity-game-engine-great-games/