
Clemson University TigerPrints

All Theses

5-2018
Effects Workflow and Asset Creation for Animation, Live-Action, and Virtual Reality
Stephen Thaddaeus Wassynger, Clemson University, [email protected]


Recommended Citation
Wassynger, Stephen Thaddaeus, "Effects Workflow and Asset Creation for Animation, Live-Action, and Virtual Reality" (2018). All Theses. 2889. https://tigerprints.clemson.edu/all_theses/2889

Effects Workflow and Asset Creation for Animation, Live-Action, and Virtual Reality

A Thesis Presented to the Graduate School of Clemson University

In Partial Fulfillment of the Requirements for the Degree of Master of Fine Arts in Digital Production Arts

by Stephen Thaddaeus Wassynger May 2018

Accepted by:
Dr. Eric Patterson, Committee Chair
Dr. Jerry Tessendorf
Dr. Brian Malloy

Abstract

Animation, live-action, and virtual reality all use effects to further express the artistic vision of story creators. The effects help immerse the viewer in a fantasy world that is unlike anything that exists on Earth. In this paper I explain the planning, creation, and final implementation of various effects assets for Making Friends (an animated short), Right to Bear Arms and Hot Air (live-action shorts), and Journey to Proxima Centauri: Terror of the Mnar (a virtual reality experience). I also cover some of the differences and limitations encountered when creating effects for animation, live-action, and virtual reality. Lastly, I explore whether it is possible and beneficial to take techniques used in one workflow and apply their concepts or tools to make a different workflow easier and more efficient.

Dedication

I would like to dedicate this project to my parents, Stephen and Patricia Wassynger; without them and all the support they have given me over the years, achieving this milestone would be but a dream.

Acknowledgments

I would like to thank my thesis advisor, Dr. Eric Patterson, for teaching, inspiring, and supporting me throughout my graduate career. My thesis would not have been possible without his knowledge, experience, and, most of all, his patience.

I would also like to thank my committee members, Dr. Jerry Tessendorf and Dr. Brian Malloy, for all the help, guidance, and inspiration they have given me over the years. They have always reminded me that if I am not tired and having fun, then I am doing something wrong.

Finally, I would like to thank all of my family, friends, and everyone else who has supported me throughout this endeavor and kept my spirits high when I started to tank.

Table of Contents

Title Page

Abstract

Dedication

Acknowledgments

List of Figures

1 Introduction & Background
   1.1 History of Effects
   1.2 How Effects Drive Entertainment Media
   1.3 Efficiently Creating and Managing Effects
   1.4 Conclusion

2 Planning the Effects
   2.1 Meeting with Directors
   2.2 Differences and Similarities Between the Requested Effects
   2.3 Directability of Effects
   2.4 Effects Creation Tools

3 Asset Workflow and Creation
   3.1 Making Friends
   3.2 Right to Bear Arms
   3.3 Hot Air
   3.4 Journey to Proxima Centauri

4 Implementation and Limitations
   4.1 Implementation
   4.2 Limitations

5 Results
   5.1 Making Friends
   5.2 Right to Bear Arms
   5.3 Hot Air
   5.4 Journey to Proxima Centauri

6 Postmortem

7 Conclusion
   7.1 Effects
   7.2 Limitations
   7.3 Applications

Appendices
   A Making Friends - Environment Creation
   B Right to Bear Arms - Particle Simulation
   C Journey to Proxima Centauri: Terror of the Mnar - Air Particulate
   D Journey to Proxima Centauri: Terror of the Mnar - Electricity Explosion

Bibliography

List of Figures

1.1 Water Spirit interaction in Moana [31]
1.2 Original Plate [10]
1.3 Final Shot [10]
1.4 Original Plate [10]
1.5 Final Shot [10]
1.6 Original Plate [10]
1.7 Final Shot [10]
1.8 Original Plate [10]
1.9 Final Shot [10]
1.10 Original Plate [10]
1.11 Final Shot [10]
1.12 Final Shot [28]
1.13 Final Shot [28]
1.14 Effects leading the user to a new place [34]
1.15 Effects evoke the emotion of yearning [34]
1.16 Cloth effects showing motion [4]
1.17 Subtle dust and sand effects [4]
1.18 Flocking effect that draws the player to that location [4]
1.19 General entertainment media pipeline [7]
1.20 Reality of a pipeline [9]

2.1 Low Polygonal Style [5]
2.2 Seamless Computer Generated Bear [2]
2.3 Environment of Journey to Proxima Centauri: Terror of the Mnar
2.4 Art Directable Paint
2.5 Instanced Geometry from Paint

3.1 Basic Workflow
3.2 Making Friends Workflow
3.3 Example of the low-polygonal style [5]
3.4 Live-Action Workflow
3.5 Shader sampling from texture
3.6 Custom Hair Shader
3.7 Example of water vapor [1]
3.8 Live-Action Workflow
3.9 Journey to Proxima Centauri: Terror of the Mnar Workflow
3.10 Example of Air Particulate [3]
3.11 Breaking the JPC Workflow: Air Particulate
3.12 Breaking the JPC Workflow: Water

5.1 Shot 01 without effects
5.2 Shot 01 with effects
5.3 Shot 05 without effects
5.4 Shot 05 with effects
5.5 Without Bear Fur
5.6 With Bear Fur
5.7 Without apple explosion
5.8 With apple explosion
5.9 Without Whirlwind
5.10 With Whirlwind
5.11 Without particulate
5.12 With particulate
5.13 Without Electricity Explosion
5.14 With Electricity Explosion
5.15 Without Dust
5.16 With Dust
5.17 Without Laser Blast
5.18 With Laser Blast
5.19 Without Mining Cart Particles
5.20 With Mining Cart Particles
5.21 Without water stream
5.22 With water stream
5.23 Without Charging Beam
5.24 With Charging Beam

Chapter 1

Introduction & Background

Effects have been utilized in animation, live-action, and virtual reality for a long time. They have not always been the flashiest or most noticeable parts of entertainment media; however, that does not mean they are not used or that they are not important. In fact, effects are used now more than ever and are crucial in helping artists drive their narratives to the next level. To better understand how this came to be the case, a small examination of the history of effects in animation, live-action, and virtual reality is required, as is an examination of how effects drive modern entertainment media. Finally, a short overview of how data is handled in animation, live-action, and virtual reality productions will help explain how effects are a necessity in the modern entertainment world.

Before an overview of the history of effects, the term “effects” should be more clearly defined.

There are three main effects categories in the entertainment industry: special effects, visual effects, and FX effects. Special effects are effects that are carried out live on set; these are more commonly referred to as practical effects. An example of a special effect would be detonating a gasoline explosion for a car crash. Visual effects are effects that are done on the computer during post-production. An example of a visual effect would be creating a three-dimensional world and compositing actors standing in front of a green screen into that three-dimensional world. FX effects are effects that are computer generated and then composited into a shot. An example of an FX effect would be a computer-generated collapsing building. With the exception of the history-of-effects section, when the term “effects” is used in this paper, it refers to FX effects. [29]

1.1 History of Effects

The creation and use of effects in entertainment media have changed dramatically since their inception. The first effects shot was a special effect in an 1895 film by Alfred Clarke called The Execution of Mary, Queen of Scots. [20] The shot depicts a beheading and was achieved using a stop-action substitution: the director stopped the film right before the ax hit the actor, switched the actor out for a dummy, and then continued filming the shot. Although the trick can clearly be seen now, this shot was done over a hundred years ago and is nothing short of amazing for its time.

Not long after this film was made, the “Father of Special Effects” emerged. His name was Georges Melies, and he started out by doing stage illusions, but upon witnessing one of the first Lumiere shorts he went out and bought an Edison Kinetoscope. Using the Kinetoscope, he developed his own prototype camera. He originally produced one-shot films, but he accidentally discovered “stop-action” and from there started using stop-action, double exposure, fast and slow motion, dissolves, and perspective tricks to elaborate his short films. [20]

Throughout the decades the technology and techniques for effects have improved vastly. A traveling matte system was created by Frank Williams, which was later refined into a blue-screen matting system. A matte is a term for painting out a section of the film so that it cannot be exposed when running through a camera. Fritz Lang introduced miniatures into full-size sets using mirrors, matte paintings, and rear projection. Linwood Dunn invented the optical printer, which allowed compositing two pieces of film into one. However, a big crash happened during the 1960s due to the rise in popularity of television, and because of this effects departments had to shut down as production was reduced. It was not until the 1980s that effects bounced back: effects houses formed and would bid for shots based on the scripts of films, largely due to the team assembled for the effects in Star Wars. Finally, in 1993 computers were introduced to the equation, and Jurassic Park featured some of the first photorealistic computer-generated creatures. In 1995, Pixar released the first entirely computer-generated feature film, Toy Story, which showed the world that it was possible to make a complete computer-generated film. Currently, there is a drive in the industry to create seamless effects that integrate perfectly into a film. This has yet to be fully achieved but is becoming more of a reality with each passing day. [20]

Animation effects started out with two-dimensional animation. The most commonly used effect in two-dimensional animation is known as squash and stretch. This is when the animator causes the object to squash to show that it is slowing down; conversely, the animator can cause the object to stretch to show that it is moving faster. Another effect used is called a smear frame; that is when an artist smears paint onto a frame to give the illusion of motion blur. If the artist wanted to draw a natural effect such as fire, they would typically draw the animation straight ahead, as this would allow for slight natural variations, giving a more natural feel to the effect. [25] Just like normal two-dimensional animation, creating two-dimensional effects was a laborious task until the advent of computers. With computers, an artist could copy and paste digitally drawn frames and quickly create looping two-dimensional effects. They could also scale, rotate, and transform these effect loops, which made implementing and tweaking effects easier. Computers also added the ability to create three-dimensional effects. Three-dimensional effects used for animation are not any different from any other type of computer-generated effects; the same techniques, programs, and math are still used. The effects are typically created to be more easily art directable compared to live-action. Today both two-dimensional and three-dimensional effects are still widely used and even combined, as they were in Walt Disney Animation Studio's Moana. [17]

Virtual reality effects are relatively new in comparison to the other forms of effects. The effects produced for this medium are closely related to the effects required for video games, because video games and virtual reality have a lot of similarities with respect to their constraints. For instance, both virtual reality and video games typically run in real time on one machine. Because of this, virtual reality effects are in essence born from video-game effects, so they use the same techniques. Although some small effect simulations can now run in real time, it is still more efficient for virtual reality to use the older video-game effects techniques. These comprise two main things: sequences and particles. Sequences are effects that have been rendered out as image sequences and played back on cards. Particle effects run in real time and consist of many little spheres or sprites that can be affected by external forces such as a vector field.

1.2 How Effects Drive Entertainment Media

Over time, effects have gained an increasingly important role in how entertainment media is driven and consumed. Because of this, the number of shots that require effects to help drive the narrative has increased. For example, in the animation realm, Walt Disney Animation Studio's Moana was a story centered around water and water effects. Over eighty percent of the shots in Moana contained some form of water effects to help drive the narrative. [32] Effects are so important to how the narrative was driven that there is even an entire character in the animated film, called the Water Spirit, that is made completely out of water and uses water effects to help express the character's emotion. [11]

Figure 1.1: Water Spirit interaction in Moana [31]

The realm of animation is not the only form of modern entertainment media that utilizes effects to help express and drive the narrative forward; live-action does this too. For example, the movie Mad Max: Fury Road was loved by the general audience for its perceived lack of effects; however, this is a false belief. The effects in this film were just implemented in a different manner so that they were less in-your-face and better integrated into the film. It was reported that Mad Max: Fury Road actually contains over two thousand effects shots. [10] This reinforces the fact that effects serve a vital role in driving the narrative of entertainment media even if the average viewer cannot tell that they are there. Below are examples of how Mad Max: Fury Road used effects to drive the dystopian emotion of the movie by making the scenes dark, gritty, and emotional. On the left are images of the original plate, which is a term for the original footage before it is modified, and on the right are images of the final shot as seen in the movie.

Figure 1.2: Original Plate [10] Figure 1.3: Final Shot [10]

Figure 1.4: Original Plate [10] Figure 1.5: Final Shot [10]

Figure 1.6: Original Plate [10] Figure 1.7: Final Shot [10]

Figure 1.8: Original Plate [10] Figure 1.9: Final Shot [10]

Figure 1.10: Original Plate [10] Figure 1.11: Final Shot [10]

Not only do these effects add to the emotion of the movie, they really define what the movie ultimately is and how it is perceived. If the effects that drove the narrative were stripped away from the film, it would be a boring movie about people who rarely talk chasing other people through the desert for no apparent reason.

Another live-action film that uses effects to drive the narrative without much dialogue is Blade Runner 2049. The film had a total of twelve hundred effects shots, which accounted for an hour and forty-five minutes of the film's total two hours and forty-three minutes. [15] This is an outstanding amount of run time for effects to occupy in a film. The effects throughout the movie were also not always in-your-face but were mainly used to further the emotion, tone, and narrative of the Orwellian dystopian society that future Los Angeles has to look forward to.

Figure 1.12: Final Shot [28] Figure 1.13: Final Shot [28]

Even though video games are not always based in the virtual reality realm, the technology they both typically utilize is generally the same these days. Both Flower and Journey, made by Thatgamecompany, are games that really utilize effects to help give the player an intended emotion, lead a player to the next object, and help progress the story of the game. Both game narratives are told with no dialogue, so the creators really had to utilize effects and sound to drive how the world was received by a player.

Flower is a very simple game centered around promoting green energy and green living while showing the pitfalls of non-renewable energy and unsustainable consumption habits. One way the creators achieved this was through the use of very pleasing particle effects. As shown below, each level in the game starts out gray, dead, and dull. The player controls a wisp of wind that collects flower petals as it rolls over them, and this causes the world to change from gray to vibrant colors. As the user collects more petals and hits certain checkpoints, the world bursts with color and particle effects that lead the user to a new portion of the map. Not only is the path of the user typically driven by particle effects, the user itself is a giant particle effect with flower petals instanced on each particle.

Figure 1.14: Effects leading the user to a new place [34]

The creators of Flower really wanted to show players how beautiful the world can be without pollution. Below is an example of a level in the game that mimics the look of bio-luminescent creatures at night, which can typically only be seen where there is little light pollution. In this shot, we see an exceptional amount of vibrant particle effects that really make the player feel as if they are on a beautiful alien planet, evoking a yearning to visit such a place.

Figure 1.15: Effects evoke the emotion of yearning [34]

Journey, on the other hand, is about a character who wakes up in the middle of the desert and is on a mission to reach a giant mountain. During the progression of the game, the player learns about the character's culture, history, and ultimate downfall. While the effects in this game are less in-your-face than in Flower, there are some notable ones worth mentioning. First, the character is made up of baggy robes, so cloth effects had to be added to the geometry of the robes. This helped give the player a sense of movement throughout the game. Secondly, since the user is in a sandy environment most of the time, dust cards and particle effects were used to immerse the player in a sandy place. Finally, the game implements particle effects that have pieces of robes instanced on them that organically flock to each other. This effect gives a sense of life to the flocking robes, otherwise inanimate objects, which makes the player connect with them on an emotional level and want to go investigate them.

Figure 1.16: Cloth effects showing motion [4]

Figure 1.17: Subtle dust and sand effects [4]

Figure 1.18: Flocking effect that draws the player to that location [4]

1.3 Efficiently Creating and Managing Effects

As the number of effects required to help drive a narrative grows, the way we plan, create, and manage these assets has been forced to change and become more efficient. Almost all projects, no matter how big or small, require a pipeline or at the very minimum a general workflow to follow. A pipeline is a workflow or process that explicitly states how assets should travel from one department to the next. [30] Below is a general example of how an entertainment media pipeline flows:

Figure 1.19: General entertainment media pipeline [7]

However, in practice a pipeline is rarely as neat as shown above. More often than not, the pipeline ends up working like this:

Figure 1.20: Reality of a pipeline [9]

Figure 1.20 shows that a real production pipeline is a lot more confusing and busy than the linear workflow shown in Figure 1.19. This is why having a set workflow is so important. A pipeline allows productions to cut down on confusion and avoid planning pitfalls.

Unfortunately, the productions covered in this paper had such a short time frame that building up a workable pipeline was not possible. Instead, a general workflow was created, which will be discussed at the beginning of the Asset Workflow and Creation chapter. Something to note, though, is that even though every production will have a different pipeline, the overall importance of having one cannot be stressed enough.

Most productions have the same startup workflow; however, there are some key differences to note in the downstream processes between animation, live-action, and virtual reality. The animation and virtual reality workflows have a lot of departments that can typically work in parallel, whereas the live-action workflow typically requires the live-action footage to be acquired first so that artists can properly place and scale computer-generated assets. Another difference is that the virtual reality workflow typically allows for real-time interactive design and placement through applications such as Tilt Brush. This allows directors to hop into the virtual world and draw a three-dimensional representation of what they want to be done, whereas in animation and live-action this is typically done with two-dimensional paint-overs.

Epic Games has even tried something new with their production department. They were trying to find a pipeline that was more seamless, could produce more iterations, and could still produce the same quality of cinematic trailers as they had made in the past for video games. Their solution was to require the Epic Games production department to use Unreal Engine to create the next Fortnite cinematic. While this was originally hard for the cinematic department to accomplish, the effort was well worth the payoff. They found that when game-engine technology is applied to a traditional linear animation production pipeline, it can positively alter the dynamics of content creation: it improves real-time interactivity, the iterative revision process, and flexibility during scene assembly, and some rendering overhead can be reduced or even eliminated. In short, it forced the production team to find more creative, simpler, and more elegant solutions. [23] Although this was done by the cinematic department of a video-game producer, it does not mean that lessons cannot be learned from this study and applied to another industry such as animation or live-action.

There is no reason why animation and live-action cannot use Unreal Engine for previsualization with real-time effects to help block out how the overall shot will look and to get a feel for timing, camera placement, and the size of the effects, in an effort to save production time. Recently the people at Industrial Light & Magic did this with Star Wars: Rogue One. John Knoll was an executive producer for Star Wars: Rogue One and wanted all the computer-generated shots to feel consistent with the signature hand-held style that director Gareth Edwards used on the live-action set. To achieve this, the R&D team at Industrial Light & Magic created a director-centric virtual camera system that allowed the director to explore the entire set of the computer-generated Star Wars worlds. [16] It should be noted that this had been done before in animation for the movie Surf's Up; however, it was not on this scale.

1.4 Conclusion

Effects have come a very long way in a very short amount of time. Special, visual, and FX effects have only been around for a little over 120 years of modern entertainment media, but in recent years they have started to take a more commanding role in driving the narrative instead of just sprucing up the background. Although there are still some constraints because of computing power and the inability to make effects one hundred percent realistic, that does not mean effects cannot fool the average person long enough to immerse them in a fantasy world. There are areas where the methodology for creatively creating, managing, and storing assets in virtual reality could be improved, and if these improvements were applied to animation and live-action there could be tremendous savings in production.

Chapter 2

Planning the Effects

Planning is an important step for any project. If a project is well planned, it has a higher chance of success, and the quality of the work generally improves as well. Planning works so well because, no matter the project, it develops a path to follow: the entire production team knows what is being done, how to do it, and what the expected outcomes are. During the course of these three productions some planning was done; however, better planning could certainly have been done, which would have helped clear up some confusion about the restrictions on the effects. The planning stage went as follows:

• Meet with directors to find out what effects were needed

• Figure out what tools could accomplish these effects

• Figure out the differences and similarities between the requested effects

• Figure out a way to make the effects directable

2.1 Meeting with Directors

The first thing to do for each production was to meet with the project lead or director and figure out exactly what effects they needed. This is a step that cannot be skipped because, in order to produce effects that are up to the directors' standards, understanding without question what the director is looking for is crucial for success. It is also important to be able to let a director know ahead of time if an effect is feasible for the production time frame or if it can be done at all. The effects that were requested by each project were vastly different. For instance:

• Making Friends required a procedural environment that could be easily art directed

• Right to Bear Arms required realistic bear fur and an apple exploding

• Hot Air required a whirlwind

• Journey to Proxima Centauri required mainly particle effects

However, just because the effects are very different does not mean the techniques used for one production cannot be applied to another. After the initial meeting with the directors, at least one meeting per week was held to show the progress of the effects work.

2.2 Differences and Similarities Between the Requested Effects

Finding differences and similarities might be challenging, but with only a three-month time frame to create effects for three different productions, it would help if any effects or techniques could be duplicated and used in more than one production. Any time saved could then be allocated to more time-intensive effects. Another reason would be to start building up a catalog or database of pre-cached effects that artists and directors could drag and drop into a scene to get a feel for how it would initially look. This has been done in the video-game industry for a long time now, and both animation and live-action film companies are starting to take this approach. This method was used for Moana [8][11][19] and Guardians of the Galaxy [22] and seems to be increasing in popularity.

The director of Making Friends wanted a very simplistic low-polygonal environment and effects to help support their simple narrative. An example of this type of style can be seen below:

Figure 2.1: Low Polygonal Style [5]

Unfortunately, the other projects required a higher resolution, so initially, I would not be able to use these effects elsewhere.

The director of Right to Bear Arms and Hot Air wanted the effects to be more realistic.

For Right to Bear Arms they wanted effects that could easily be composited into a scene and would not draw much attention to a shot. The effects for Right to Bear Arms would have to be realistic and high quality, much like the computer-generated bear from The Revenant.

Figure 2.2: Seamless Computer Generated Bear [2]

For Hot Air there was a little leeway on the realism of the effects because that short's goal was to be a little less realistic. However, realistic effects were needed only in this production, so initially I was not able to reuse these effects elsewhere.

The directors of Journey to Proxima Centauri: Terror of the Mnar wanted the effects to be hyper-realistic so that they would mesh well with the hyper-realistic environment of the futuristic alien planet where the virtual reality experience takes place.

Figure 2.3: Environment of Journey to Proxima Centauri: Terror of the Mnar

As shown in the three images above, the environment and effects expectations and requirements for each project are vastly different and would require some planning to get them to a point where the directors would be able to use them and be happy with them. However, there is a possibility that the techniques used for the virtual reality project could also be used in the live-action projects.

2.3 Directability of Effects

Directability of computer-generated effects is very important because it would be a huge waste of time and resources to run a 24-hour water simulation only to have the director come back and ask for completely different motion from the effects. It is much better to create the effects with art directability built in, so that when an effect needs a drastic change it can be made without rebuilding the effect from the ground up. This is where art-directable effects are born. An art-directable effect is one that a director can sit down with an artist and adjust until they are happy. This notion of art-directable effects is becoming more commonplace in the industry; for example, a lot of the effects in Moana were built to be easily directable by artists. [8][11][6] With this in mind, it was imperative to plan the effects with directability in mind. For example, as displayed below, the environment generation in Making Friends was made to be art directable and real-time interactive through paint-weights and the height field tools in Houdini.

Figure 2.4: Art Directable Paint Figure 2.5: Instanced Geometry from Paint

2.4 Effects Creation Tools

The style of effects involved with each project varied a lot, so it was best to stick with tools that were already known. The majority of the effects work was done with SideFX's Houdini. Unfortunately, after research and simple implementation tests were done, it was clear that it would not be efficient to use Houdini alone for the virtual reality production. The virtual reality production was being done in Epic Games' Unreal Engine, and even though a Houdini plug-in for Unreal Engine does exist, the workflow required to achieve the effects for Journey to Proxima Centauri: Terror of the Mnar was not efficient enough for this production to follow. Instead, a new approach was formulated: learn Unreal Engine's Cascade effects system and use Houdini in tandem with it only when absolutely needed. This proved to give better results, integrate better with the feel of the virtual world, and produce effects that were optimized for Unreal Engine.

Chapter 3

Asset Workflow and Creation

Asset creation for each project generally followed the same workflow. That workflow is:

1. Meet with a director and get an idea of what effect was needed.

2. Plan out the asset on paper.

3. Create the asset in the appropriate or chosen program.

4. Add the asset to the scene/project.

5. Get feedback on the asset.

6. Implement feedback.

7. Go to step 4 and repeat until the directors were happy with the asset.

It can be better visualized in Figure 3.1:

Figure 3.1: Basic Workflow

Although this workflow is quite simple, it was imperative that it be strictly followed to help ensure quick iteration on and improvement of the assets, thus resulting in better effects.

The majority of the effects were created using Houdini, so a short overview of how Houdini works and its terminology might help. Houdini is a node-based three-dimensional environment that allows users to easily and procedurally build computer-generated assets. Every node is a type of operator, such as a surface operator (SOP). Below is a list of common operator types and what they mean:

• SOP - Surface OPerator, these affect the surface of geometry

• DOP - Dynamic OPerator, this is where dynamic simulations happen

• SHOP - SHader OPerator, this is how materials are made

• POP - Particle OPerator, this is where particle simulations happen

3.1 Making Friends

Below is a list of the effects and the workflow that I created for Making Friends:

• Procedural Low-polygonal Environment

• Lava

• River

• Smoke

Figure 3.2: Making Friends Workflow

The first and biggest asset challenge that had to be tackled for Making Friends was creating a system to achieve a stylized low-polygonal look similar to the picture below:

Figure 3.3: Example of the low-polygonal style [5]

To accomplish this, the style had to be analyzed. What really sticks out about this style is how the geometry facets are rarely ever flush or flat with one another. The approach taken to replicate this style was to apply random noise to the vertices of the input geometry at the end of the workflow. This was implemented in Houdini using the mountain surface operator (SOP). What this node does is apply random noise to the vertices of geometry, with controls for things such as noise type, height, element size, and scale. The same result could have been coded in an attribute wrangle node with VEX; however, the mountain SOP was chosen due to time constraints, a need for quick iterations, and much easier artistic control.
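As a rough illustration of the wrangle alternative (this is a sketch of the idea, not the production node, and the channel names are placeholders), a point wrangle can displace each point along its normal by a signed noise value, which is essentially what the mountain SOP does:

    // Point wrangle (SOP context). Assumes point normals exist (add a Normal SOP upstream).
    float amp  = chf("height");                      // overall displacement amplitude
    float freq = 1.0 / max(chf("element_size"), 1e-4);
    float n    = noise(@P * freq) - 0.5;             // recenter noise() around zero
    @P += @N * n * amp * 2.0;

Exposing the amplitude and frequency as channels would preserve the quick artistic control that made the built-in mountain SOP the preferred choice in production.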

Now that there was a way of creating low-polygonal-looking geometry in any situation, the environment needed to be generated procedurally. The first environmental piece to create was a background piece consisting of a mountain range with a volcano, with lava and a river of water running through it. The first attempt at creating this was done by modeling the mountain geometry and applying a mountain SOP to it; however, the results were unsatisfactory, and the director thought it could be better. Without being able to control, shape, and layer the noise easily and efficiently, another solution had to be found. After some research, Houdini's Terrain Tools seemed like a suitable route for environment generation. These tools utilize height fields, on which it is much easier to layer noise. This produced much more pleasing results that only needed to be re-meshed to a lower polygon count. More information about this process can be found in Appendix A. The mid-ground/foreground for the remaining shots was created using a similar workflow.
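To illustrate why height fields make noise layering easier (a small sketch, not the Appendix A setup; the octave count and channel names are arbitrary), a HeightField Wrangle can stack several octaves of noise directly onto the 2D height volume:

    // HeightField Wrangle: layer a few octaves of noise onto the existing terrain height.
    float amp  = chf("amplitude");
    float freq = chf("base_frequency");
    for (int i = 0; i < 4; i++)
    {
        @height += amp * (noise(set(@P.x, 0.0, @P.z) * freq) - 0.5);
        amp  *= 0.5;   // each octave is half as tall
        freq *= 2.0;   // and twice as detailed
    }

Because the terrain stays a 2D height volume until the very end, each layer can be adjusted or masked independently before the result is converted to polygons and re-meshed.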

The next asset that needed to be created for Making Friends was lava that would collide with the environment geometry. In order for the geometry to be usable in the simulation, it needed to be closed so that volume-based collisions could be used rather than geometric collisions. Volume-based collisions were wanted because they are faster than geometry collisions, which meant less time spent waiting for a simulation to finish and therefore more iterations. Closing the geometry was a simple process done by extruding the edges of the mountain-range mesh downward and capping it with a face.

To create the lava simulation, faces on the geometry where the lava was going to emit from were used to source particles for the simulation. The area was originally over 1000 square meters, and the simulation took minutes to calculate each frame. This was infeasible, so the simulation was scaled down to just 50 square meters. Originally, an old lava solver that had been created previously was used. While this solver did work, it was extremely inefficient, so instead Houdini's lava solver, which is based on a FLIP solver, was used. FLIP stands for Fluid-Implicit Particle and is a hybrid method that uses both particle- and volume-based fluid simulation to mimic liquids. This new solver choice produced much more desirable results. After running the simulation and finalizing how the lava flowed, the result was converted to a polygon mesh and re-meshed to achieve a low polygon count.

The water simulation was originally done using the same technique as the lava. The simulation area was shrunk down; however, unlike the lava, the simulation did not really fit or flow with the environment geometry. Instead, a curve was made for the water to flow along and was projected down onto the mesh. A plane was then swept along the curve, and animated noise was added to the mesh to give the illusion of flowing water.
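A minimal sketch of the kind of animated noise used on the swept ribbon (the channel names and the assumed flow axis are placeholders, not the production values):

    // Point wrangle, evaluated every frame: ripple the water ribbon by
    // scrolling noise along the flow direction over time.
    float amp   = chf("ripple_height");
    float freq  = chf("ripple_frequency");
    float speed = chf("flow_speed");
    vector p    = @P * freq - set(0.0, 0.0, @Time * speed);  // assumed flow along -Z
    @P.y += amp * (noise(p) - 0.5);

Because the ribbon never actually simulates fluid, the flow can be retimed or reshaped instantly by editing the guide curve and these channels.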

To create the smoke simulation, faces on the geometry where the smoke was going to emit from were used to source particles for the simulation. The default wispy-smoke pyro FX tools in Houdini were used to get a base smoke simulation. When the simulation was run, it was too fast and not windy enough. To sell the illusion that the smokestack was really tall and far away, the simulation had to be run a hundred times slower. To get a windier look, the pyro solver's dissipation, disturbance, sharpening, and turbulence curves were adjusted.

3.2 Right to Bear Arms

Two assets were required for the live-action short Right to Bear Arms. They were:

• Interactive fur on a bear and shading the fur

• Making an apple explode from being hit with a bullet

The workflow created for the live-action short Right to Bear Arms was:

Figure 3.4: Live-Action Workflow

The bear used in this live-action short was originally purchased from a third-party content creator; however, the file was only compatible with Maya 2017. This was an issue, as the render queue runs Maya 2016, so the bear had to be made to work with Maya 2016. The only thing that did not transfer well to Maya 2016 was xGen, which is how the bear fur was originally implemented. This meant the fur had to be redone; however, in a previous production a Houdini hair workflow had been experimented with and was determined to be too laborious for this production. The solution was to learn how to do xGen fur in Maya 2016, which allowed for interactive collisions between the hairs and meant the bear could be animated in parallel with grooming and surfacing. Initial research was done by reading the Maya documentation and watching tutorials about xGen fur. [27] Following the tutorials, fur was produced that resembled the bear fur in reference photos. The shader was the next thing to accomplish and was originally done by sampling the bear body texture and applying that color to the fur. These results could have been better and more directable, as the surfacing would need to be redone entirely just to change the color of the fur. This was not ideal, so instead a modified aiStandardHair shader was implemented. After watching a tutorial [26] explaining how the shader worked and reading the Arnold 5 for Maya documentation, a shader was made that looked much better and offered much more artistic control.

Figure 3.6: Custom Hair Shader

Figure 3.5: Shader sampling from texture

Unlike the bear, the apple explosion was done solely in Houdini. First, the apple geometry was imported into Houdini, and since it was not solid geometry it had to be converted to a VDB and remeshed into a solid piece of geometry. This had to be done for Houdini's boolean shatter node to function. A separate VDB instance of the apple then had VDB spheres scattered over it and was converted back to geometry, with noise added via a mountain SOP for some variation. This sphere apple was then used to shatter the original apple with the boolean shatter node. The next step was the actual rigid-body simulation on the shattered apple, which required the pieces to be packed and glued.

The rigid-body simulation was done by taking the packed, shattered apple geometry and importing it into a dynamics operator network (DOP network). For realism, the geometry of a bullet was imported into the DOP network and an initial velocity was added to the mesh. When the simulation started, the bullet would shoot through the apple geometry and cause the glue on the pieces to break, shattering the apple violently.
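A hypothetical sketch of the two attribute tweaks this kind of setup relies on (the attribute names are standard for Houdini RBD work, but the parameter names and values are placeholders, not the production settings):

    // (1) Point wrangle over the bullet geometry before the DOP import:
    //     RBD reads the "v" point attribute as the object's initial velocity.
    v@v = set(0.0, 0.0, -chf("bullet_speed"));   // assumed firing direction: -Z

    // (2) Primitive wrangle over the glue constraint network:
    //     vary the glue strength per constraint so the apple breaks unevenly.
    f@strength = fit01(rand(@primnum), chf("min_strength"), chf("max_strength"));

Randomizing the glue strength keeps the fracture from reading as a uniform, mechanical break when the bullet passes through.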

The apple particulate was originally going to be accomplished by using a volume and a pyro solver sourced across the shattered geometry. The downside to this is that it would take a very long time to solve the entire simulation; another downside is that such a simulation is hard to control. Instead, millions of particles were used, as this was a much more controllable and faster simulation. The details of how the particle simulation was set up can be found in Appendix B. This simulation looked good; however, it was still missing all of the vapor that can be seen in the reference photo:

Figure 3.7: Example of water vapor [1]

This water vapor really sells the look of the explosion, but unfortunately, it could not be easily achieved with a particle simulation so it had to be done with volumes and a pyro simulation.

The pyro simulation was derived from the particle simulation because it would look better if the water vapor and apple particulate followed the same forces. It also helped keep a much more linear workflow: if something changed in the shattering simulation, I would not have to waste time trying to get the vapor and particulate simulations to match each other, because the vapor simulation would now inherently change whenever the particulate simulation changed. To put this into practice, I spawned point volumes along the particles and combined them into a single volume. This volume was then injected into a pyro solver and a simulation was run. While this simulation looked good, more spray was needed from the volume rather than it just following the particles, so three-dimensional curl noise was added to the initial state of the volume and it was re-simulated. Beyond that, the dissipation of the pyro simulation was increased to help the smoke disappear more quickly.
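A minimal sketch of adding that curl noise to the volume's initial state (assuming a volume wrangle bound to a vector velocity field named vel; the field name and channel values are placeholders):

    // Volume wrangle over a vector field "vel": push the initial vapor outward
    // with divergence-free curl noise so it sprays instead of only trailing the particles.
    float amp  = chf("spray_amount");
    float freq = chf("noise_frequency");
    v@vel += amp * curlnoise(@P * freq);

Because curl noise is divergence-free, it adds swirling spray without making the pyro simulation gain or lose volume unnaturally.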

3.3 Hot Air

The live-action short Hot Air required only one asset: a whirlwind for the balloons to be swept up into so they could fly away and avoid being popped by the protagonist. The workflow created for the live-action short Hot Air was:

Figure 3.8: Live-Action Workflow

First, since this was a live-action shot with a slight interaction with a real person, a review of the raw plate in which the effects would be placed was required. After examining the plate, the main thing to note was that the ground was made up of fine white sand, which meant that the whirlwind could be composed of a sand-like material and later adjusted in compositing to help it fit better with the scene. For this reason, the problem was tackled with a particle simulation, because sand is essentially a bunch of particles. The whirlwind needed to be easily directable and yet somewhat believable; because of this, forcing the particles along curves that could be easily changed was ideal. What was finally done was:

• A master curve was drawn.

• This curve was duplicated three times.

• Different noise was added to each curve.

• The curves were smoothed.

• The curves were combined back into one curve.

From there, inside the particle simulation operator network (POP network), a sphere emitter was used to emit particles at the start of the curves, and a POP curve node was used to drive the particles along the curve, with curl noise added to give random movement. This was good: the particles moved along the curve, and it was easily directable. With the direction and movement looking good, there was still one problem left to tackle: all the particles were the same size. This needed to change because in real life sand particles are not all the same size. To change this, a point VEX operator (point VOP) node was added to the output of the POP network so each point could be modified. Inside that node, the scale of each particle was driven by a random value multiplied by a user-defined ramp evaluated at the particle's age divided by its overall life. This allowed the particles to gradually appear, vary in size, and shrink as they die.
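A hypothetical wrangle equivalent of that point VOP (the same logic expressed as VEX; the ramp and channel names are placeholders):

    // Point wrangle after the POP network: size each sand particle from a random
    // per-particle value, shaped by a user ramp over its normalized age.
    float life_frac = (@life > 0) ? clamp(@age / @life, 0, 1) : 0;
    float size_rand = fit01(rand(@id), chf("min_size"), chf("max_size"));
    f@pscale = size_rand * chramp("size_over_life", life_frac);

With the ramp starting and ending at zero, each grain fades in, holds its random size, and shrinks away at the end of its life.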

3.4 Journey to Proxima Centauri

The effects for Journey to Proxima Centauri were meant to help immerse the viewer in the virtual world itself. These effects would come into close contact with the user or even interact with the user, so they had to fit into the world really well while still being efficient enough not to impact the user's experience by putting too much strain on the central processing unit (CPU) or graphics processing unit (GPU). Below is a list of effects that I created for Journey to Proxima Centauri:

• Air Particulate

• Robot Electricity Explosion

• Falling Dust

• Charging Laser Blast

• Mining Cart Engine particles

• Water Stream

• Battery Charging Beam

There were two main ways these effects could be implemented: learn Cascade, which is Unreal Engine's internal particle FX suite, or create Houdini digital assets (HDAs) and import those into the game engine. After trying the latter, it was decided that it would be best to learn to use Cascade for most of the effects because it allowed for much quicker iterations, it allowed the directors to make tweaks if needed, it was optimized, and it allowed for quicker and better scene interaction.

The workflow created for the virtual reality experience Journey to Proxima Centauri: Terror of the Mnar was:

Figure 3.9: Journey to Proxima Centauri: Terror of the Mnar Workflow

Below is an image provided by the directors showing what they wanted me to replicate for the air particulate in the virtual reality world:

Figure 3.10: Example of Air Particulate [3]

The only difference is that they did not want such a polygonal look, because most of the world was curved. To accomplish the movement of the air particulate, three-dimensional vector fields created in Houdini were used. These vector fields provide a three-dimensional representation of how the velocities in the room would move the particles. To create the vector field, a third-party game-dev toolset developed through a collaboration between Epic Games and SideFX was used. Specifics about this creation can be found in Appendix C. The vector field output from this process was then imported into Unreal Engine, and GPU particles were spawned throughout it.
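A hedged sketch of how such a velocity field might be filled in Houdini before export (this is not the Appendix C setup; the field name, curl noise, and drift term are assumptions for illustration):

    // Volume wrangle over a vector volume named "vel": fill the room-sized box
    // with slow, swirling, divergence-free motion plus a gentle upward drift.
    float  freq  = chf("swirl_frequency");
    float  amp   = chf("swirl_speed");
    vector drift = set(0.0, chf("updraft"), 0.0);
    v@vel = amp * curlnoise(@P * freq) + drift;

Keeping the motion slow and divergence-free is what makes GPU particles advected through the exported field read as drifting dust rather than turbulence.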

Below you will see how this workflow broke away from the original workflow:

Figure 3.11: Breaking the JPC Workflow: Air Particulate

The robot electricity explosion was meant to mimic the idea of electricity arcing randomly from the robot and then exploding with a shower of sparks. To create this, the effect was broken down into three different parts:

• Random electrical arcs

• Explosion of sparks

• Trails that followed the sparks to draw attention

Luckily, the engine's documentation website had multiple tutorials that covered each of these effects. [12] More information about how each effect was created can be found in Appendix D. After each effect was created, they were layered and timed to match up in Cascade.

The falling dust effect would normally be done with a particle or pyro simulation in Houdini; however, Cascade was proving to be a great effects engine for video games. A tutorial [33] that utilized Unreal Engine's starter content and stylized shaders was followed for the setup of the shader. The emitter, however, was created from scratch using GPU-based particles that were given random downward velocity and random rotation. This effect could then be used wherever dust was needed.

The charging laser blast effect was needed because, during the virtual reality experience, the viewer ends up getting into a motion simulator and encounters a wall of rocks that the mining cart must blast away. This effect required two steps:

• A charge up stage

• A laser beam that would fire into the rocks

To accomplish the charge-up stage, particles were randomly spawned around a sphere, sucked into its center, and killed. The size of each particle was scaled with its velocity to give a nice streaking effect. The laser beam that fired into the rocks required a beam emitter much like the one used in the robot electricity explosion; however, unlike that one, it needed to move more smoothly. Figuring this out was ultimately achieved by brute force: adjusting each knob in the emitter settings and seeing what changed. The two emitters were then layered on top of each other and the timing was lined up. Finally, the colors were modified so that the effect would mesh well with the world.
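Although the production emitter was built directly in Cascade, the charge-up behavior (particles pulled toward a center and sized by their speed) is engine-agnostic; a hypothetical Houdini POP-wrangle prototype of the same idea, with placeholder parameter names:

    // POP wrangle: pull each spark toward the charge point and scale its size
    // with its speed so fast particles stretch into streaks.
    vector center = chv("charge_center");
    vector to_c   = center - @P;
    v@v += normalize(to_c) * chf("pull_strength") * @TimeInc;
    f@pscale = chf("base_size") * (1.0 + chf("streak_scale") * length(@v));
    if (length(to_c) < chf("kill_radius"))
        i@dead = 1;   // kill the particle once it reaches the center

The Cascade version was tuned through the emitter's attractor and size settings rather than code, but the underlying rule is the same.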

There is a sequence in this experience where a mining cart comes to life and floats. The directors wanted it to appear that there was a lot of force coming up from under the cart and that the engines on the back had come on. To accomplish this, a mixture of GPU and CPU sprite particles was used. This is because GPU particles that became occluded would instantly die, and although the particles were meant to spread across the floor, it is hard to get GPU particles to collide properly when they die from being occluded. The GPU particles were used to give volume to the effect, while the CPU particles handled the collisions and emitted light into the scene. Each particle system was exactly alike in color, size, and speed. To get the engine particles, the lift particle system was rotated 90 degrees and emitted from a sphere, its life was shortened, and its speed was slowed down.

The stream was required because the director wanted to connect two waterfalls while matching the topology of the floor. To accomplish this, flow-map textures were used. A tutorial [24] explained how to create flow maps in Houdini and implement them in Unreal Engine. To do this you (a simplified sketch follows the list):

• Import the geometry

• Create a curve on which you want the map to flow

• Project that curve onto the geometry

• Extrude geometry from the curve in uniform divisions

• Use the geometry curvature to create the flow velocities

• Save the velocities out into a texture map
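A hedged VEX sketch of the middle steps above (deriving a flow direction on the stream mesh from the projected guide curve; the attribute names are placeholders and this is not the tutorial's exact setup):

    // (1) Point wrangle over the guide curve: store a normalized tangent per point.
    int    next = min(@ptnum + 1, @numpt - 1);
    vector tan  = point(0, "P", next) - @P;
    v@flowdir   = normalize(tan);

    // (2) Point wrangle over the stream mesh (guide curve in input 1):
    //     copy the nearest curve tangent and encode it as a flow-map color.
    int    npt = nearpoint(1, @P);
    vector dir = point(1, "flowdir", npt);
    v@Cd = set(dir.x * 0.5 + 0.5, dir.z * 0.5 + 0.5, 0.0);   // XZ packed into RG

The color written here is what gets baked out as the flow-map texture that the Unreal material later uses to scroll the water across the mesh.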

Once all this is done, the texture and geometry created in Houdini are imported into Unreal Engine. Next, a material is created and animated using the flow map. This workflow worked really well, but the imported geometry was not full quality, because at runtime the engine would displace the ground the stream was flowing over. Below you will see how this workflow broke away from the original workflow:

Figure 3.12: Breaking the JPC Workflow: Water

There is a sequence in this experience where the user needs to go around and “charge” their battery pack from crystals. The hardest part of this was getting something that read as a charging effect. What was done was to create two different beam emitters (like the one in the charge-up beam) and layer them on top of each other. The colors of the two beams were slightly different, and the speeds at which they fluctuated were also different.

Chapter 4

Implementation and Limitations

No matter how much planning is put into a project and no matter how much research goes into learning a subject, there will always be implementation issues, and certain limitations will be met. The sections below go over how the effects were implemented in each project, the issues that arose with these implementations, and the limitations encountered over the course of these projects.

4.1 Implementation

The environments for Making Friends were exported as .OBJ files and then imported into a Maya landscape file. From there the file could be referenced in the layout file, and the director could modify it as they saw fit. The lava, water, and smoke assets had animation and, in some cases, colors that needed to be preserved, so those files had to be exported as .ABC (Alembic) files. The lava, water, and smoke effects all had implementation issues. The lava color was going to be driven by the temperature attribute of the lava; however, it was nontrivial to get this data to transfer over to the Maya Arnold shaders. It was eventually done by writing the color as a colorSet that Maya could interpret. The water's implementation issue was that the Alembic cache saved the mesh as a smoothed mesh; this smoothed mesh would not have fit in with the landscape, so it was scrapped. The smoke's implementation problem was that it popped too much and did not fit in the scene, as it was not done in a low-polygonal style.
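Returning to the lava color hand-off mentioned above, a hedged sketch of the temperature-to-color bake that made the colorSet transfer possible (the attribute range and ramp are placeholders; the production values may differ):

    // Point wrangle after the lava solve, before the Alembic export: bake the
    // simulation's temperature attribute into Cd so it can be written out as a
    // color set that Maya and Arnold can read.
    float t = fit(f@temperature, chf("temp_min"), chf("temp_max"), 0.0, 1.0);
    v@Cd = chramp("lava_color", t);   // color ramp from dark crust to bright molten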

The assets for Right to Bear Arms and Hot Air were rendered straight out of Houdini with Arnold 5 shaders. The .exr files were then placed in a predetermined location in the pipeline so that the compositor could access them and composite them into the appropriate shots. All of these effects had implementation issues. The main issue was that the effects' timing was slightly off, and they did not look realistic enough to be composited into the shots seamlessly.

Most of the assets for Journey to Proxima Centauri: Terror of the Mnar were created in Unreal Engine, so implementing them in the virtual reality experience was painless: they could simply be dragged and dropped into the scene. The two assets that were created outside of Unreal Engine both required an export using the game-dev tools; they were then imported into the project directory and could be used from there. The only effect that really had implementation problems was the river that traveled between two waterfalls. It was hard to get the river to flow at a proper speed, and getting it to mesh seamlessly with the waterfalls was never fully accomplished.

One implementation to note is that even though the water in Making Friends did not make it into the final shot, the idea of how to create that asset was taken from the game-dev tools flow-map tutorial. [24] The technique of drawing a curve, projecting it, and building a mesh from that curve really helped with controlling how the water would flow.

4.2 Limitations

The biggest limitation, no matter the problem or project, is always time. It takes a lot of time to do even the simplest and smallest task well. Another major limitation of these projects was the on-the-fly learning that had to be done to create the effects in the given time frame. Finally, the last limitation for these projects was the sheer number of effects that had to be created in such a short amount of time.

More specifically, the main limitation for Making Friends was getting all of the assets to mesh properly. There is only so much that can be represented in a low-polygonal form before it becomes too abstract to the viewer. This means that a cool explosion or gas effect might not be possible given the time frame and resources. Another limitation was computational power: at a certain point, the computer will crash during a simulation if all of the RAM and swap memory is used up.

In Right to Bear Arms and Hot Air, the main limitation faced was trying to pass off computer-generated assets as realistic effects. Mimicking the real world and all of its physical properties is still not feasible in the time available; that is why people have created shortcuts and approximations such as FLIP fluids and importance-sampled path tracing. Until computing power is sufficient to mimic those forces perfectly, tricking an audience into believing these effects are real will always be a limitation. This, coupled with a lack of knowledge about how to properly create effects for live-action, made it extremely challenging to make the effects mesh well.

Virtual reality has its own set of limitations. Unlike most films made today, it needs to be rendered in real time, all on a single computer. This can be both good and bad. It was good for Journey to Proxima Centauri: Terror of the Mnar because the computer running the virtual reality system had very good hardware, which meant efficiency could sometimes take a back seat to “wow factor.” The downside is that sometimes effects quality has to suffer so that the experience can run smoothly. Good can come from this, though, because an artist is forced to find cheaper and more creative solutions to a problem. A creative solution that came about because of limitations like this was the layering of CPU and GPU particles for the mining cart. This kind of constraint is even being embraced in short films these days, as in Allumette [21], where the creators had to find a very cheap way to render volumetrics in real time, or in the Fortnite trailers, where Epic Games required the production team to use Unreal Engine. [23] On top of all this, virtual reality has its own limitations in not being able to fully immerse a user in a world, or in making the user sick due to low frame rates and lagging software. [18]

Chapter 5

Results

The results for each project varied. The best-implemented effects were those for Journey to Proxima Centauri: Terror of the Mnar; they mesh the best with the world and really help drive the narrative of the project. The worst-implemented effects were those for Right to Bear Arms and Hot Air, where it proved extremely hard to achieve photo-realistic results. Improvements could still be made to all of the effects; however, there was a strict schedule to meet. Below are before-and-after shots for each project I worked on. The before shots show what the scene looks like without effects, while the after shots show the effects integrated into the project.

5.1 Making Friends

Figure 5.1: Shot 01 without effects Figure 5.2: Shot 01 with effects

Figure 5.3: Shot 05 without effects Figure 5.4: Shot 05 with effects

5.2 Right to Bear Arms

Figure 5.5: Without Bear Fur Figure 5.6: With Bear Fur

Figure 5.7: Without apple explosion Figure 5.8: With apple explosion

5.3 Hot Air

Figure 5.9: Without Whirlwind Figure 5.10: With Whirlwind

5.4 Journey to Proxima Centauri

Figure 5.11: Without particulate Figure 5.12: With particulate

Figure 5.13: Without Electricity Explosion Figure 5.14: With Electricity Explosion

Figure 5.15: Without Dust Figure 5.16: With Dust

Figure 5.17: Without Laser Blast Figure 5.18: With Laser Blast

Figure 5.19: Without Mining Cart Particles Figure 5.20: With Mining Cart Particles

Figure 5.21: Without water stream Figure 5.22: With water stream

Figure 5.23: Without Charging Beam Figure 5.24: With Charging Beam

Chapter 6

Postmortem

Over the course of these projects, many issues arose that required a quick fix. This happens in every production, but it is always bothersome when a problem never gets properly fixed and an asset has to be scrapped. Below are some improvements I would like to make to assets created during these projects.

In Making Friends it would be nice if the procedurally generated environment were flush with the ground and if the geometry pieces did not intersect one another; as it stands, some parts of the environment had to be rotated away from the camera so that these issues would be hard to see. It would also be nice if this approach were packaged as an artist tool that anyone could import and use to make procedural low-polygonal environments easily. The main area for improvement would be making the lava, water, and smoke effects mesh with the world so that they could help drive the narrative of the project more.

In Right to Bear Arms it would be nice to get the apple explosion to look like the reference videos. The water particulate moves oddly, does not disappear fast enough, and does not look like it is suspended in the air. The apple particulate also moves awkwardly; better clumping would help sell the idea of an apple exploding and disintegrating, and more visible apple chunks should spray out initially. Although the bear fur turned out quite well, it could still be improved by refining the groom: it should be less clumped and clean and have more noise.

In Hot Air it would be nice if the whirlwind had more realistic movement. It would also be nice to add flying debris, such as twigs and leaves, to join the sand in the whirlwind.

Even though nearly all of the resources allocated to effects in Journey to Proxima Centauri: Terror of the Mnar were used up, the effects could still be improved. Most of them were created on the fly while learning Cascade from tutorials. Tutorials are a good starting place, but the effects could drive the narrative even further once more refined and efficient versions are implemented in Cascade.

Chapter 7

Conclusion

Although this paper centered on the effects for these projects, contributions were also made in the form of asset creation for departments beyond effects. In the Making Friends project, help was given with pipeline, modeling, surfacing, and lighting. In the Right to Bear Arms project, help was given with pipeline, modeling, and lighting. In the Journey to Proxima Centauri project, help was given with building the physical motion simulator. With all that said, it is time to conclude this paper with regard to effects, limitations, and applications.

7.1 Effects

Effects are still often thought of as cheap tricks or mere enhancements to entertainment media; however, this is not entirely true. Effects can certainly be that, but Moana, Mad Max: Fury Road, Blade Runner 2049, Flower, and Journey show that effects have taken on a core role in modern entertainment media. They help drive the narrative and express what the artist wants the audience to feel. It was even shown that some effects are so well blended that the audience cannot tell what is real and what is fake. This was the goal for the Right to Bear Arms and Hot Air projects, but it was not met. In the Journey to Proxima Centauri: Terror of the Mnar project, the effects added a great deal to the narrative and helped express exactly what the directors wanted. The differences between the workflows were shown; however, there is still potential to use techniques from video-game production to assist film production, for example by utilizing Unreal Engine to prototype or block out general effects in real time. The Making Friends and Journey to Proxima Centauri: Terror of the Mnar projects also involved a great deal of effects reuse. The directors were able to duplicate and easily modify the effects to suit their needs, which reduced production time without sacrificing quality. This is much like what the big studios are doing now by building caches and databases of effects, just on a much smaller scale.

7.2 Limitations

The main limitation across the board was the short amount of time available to produce all of these assets. There simply was not enough time to do everything, and unfortunately sacrifices had to be made, which resulted in a weaker end product. More specifically, the low-polygonal style in Making Friends meant that every effect had to be designed to either be low-poly itself or mesh well with the style. It seems counter-intuitive at first, but keeping things simple and low-poly is not the easiest approach when it comes to effects. In Right to Bear Arms, Hot Air, and Journey to Proxima Centauri: Terror of the Mnar, the main limitation was a lack of knowledge. Another, much broader limitation that cannot be overlooked is computational resources; there were not enough of them to accurately simulate all of the effects. Furthermore, on a much broader spectrum, each workflow suffers from its own limitations. The virtual reality workflow can complete more iterations because everything has to run in real time, whereas the animation and live-action workflows typically have to wait for, at the very minimum, a playblast to render. The live-action workflow is slowed down even more because the live-action footage has to be captured before the computer-generated assets can be implemented.

7.3 Applications

Although animation, live-action, and virtual reality experiences are vastly different in and of themselves, techniques that work for one can often be adapted to another. For instance, many virtual reality experiences reuse the same effects over and over again; this was true in Journey to Proxima Centauri: Terror of the Mnar with the electrical explosion and dust effects. If animation and live-action productions were to reuse effects, or save all previous effects in a database for later use, it would reduce the need to handcraft every single effect. It would also make it easy to add more effects to a shot when needed. From a business standpoint, this would save time, money, and storage space.

It would also be beneficial for all of these industries to start using real-time applications to assist with layout, previs, and quick prototyping. This would give a clearer direction on how to build the effects in a shot and would help avoid pitfalls. There is still a long way to go; however, the industries should collaborate with one another to create technology and new paradigms that benefit everyone.

Appendices

Appendix A Making Friends - Environment Creation

To start, proxy geometry for the mountain ranges was placed in the world and sculpted to the requested silhouette. A height field was then projected down onto the geometry. Next, the height field was blurred to smooth out any harsh areas. After that, curl noise was added that only affected the height field where geometry was present, and sparse convolution noise was layered on top to affect the entire height field. From there the height field was converted to polygons, producing a mesh with around 250k polygons. This polygon count was too high, so Houdini's Remesh SOP was first tried to bring it down to an appropriate level. The Remesh SOP gave results with non-uniform polygon sizes, concentrating more polygons around areas of high curvature, which did not fit the style. Houdini's PolyReduce node was used instead. This node allows the polygon count to be reduced by a percentage while keeping the size of each polygon close to that of every other polygon in the mesh; it also offers control over how well the flow of triangles follows the topology of the input mesh, making creases cleaner and more pronounced. Once this geometry was created, trees needed to be added. To do this efficiently and in an art-directable way, the areas of the mesh where trees were meant to grow were painted, and points were then randomly scattered and jittered along the painted parts. Tree models were imported and instanced across the mesh, one per scattered point. To break up the computer-generated feel, VEX code was added to a Copy Stamp node to randomly scale and rotate each instanced tree.
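A point wrangle that produces this kind of variation on the scattered points is sketched below. It is written against the standard pscale and orient instancing attributes rather than the Copy Stamp expressions actually used on the project, and the scale range, seed channel, and axis choice are illustrative assumptions.

```vex
// Point Wrangle over the scattered tree points, upstream of the copy/instance node.
// Gives every instance a random uniform scale and a random rotation about the
// world up axis to break up the computer-generated feel.
float seed  = chf("seed");                             // promoted parameter; any value works
float scale = fit01(rand(@ptnum + seed), 0.7, 1.3);    // random size between 70% and 130%
float yaw   = rand(@ptnum + seed + 17.0) * 2.0 * M_PI; // random heading in radians

f@pscale = scale;                                      // copy/instance nodes read pscale...
p@orient = quaternion(yaw, set(0, 1, 0));              // ...and orient automatically
```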

Appendix B Right to Bear Arms - Particle Simulation

To accomplish the particle simulation for the apple explosion, the shattered geometry that had already been simulated was imported and unpacked in a geometry node. The velocity of each piece was calculated using a Trail SOP. This information was then brought into a Particle Operator Network (POP Network), with the unpacked geometry used as the source that spawned particles every frame; the particles were set to inherit the velocities of the geometry pieces they spawned from. The initial simulation was decent; however, the movement was too boring, as the velocities were linear and the lifespans were too long and uniform. To fix this, the life of each particle was set to roughly 0.5 seconds with +/- 0.25 seconds of random variation, and some large-scale curl noise was added to the velocity of each particle.
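A minimal POP Wrangle version of those two adjustments is sketched below. The frequency, offset, and amplitude channels are placeholder parameters, and in practice the life variance could just as well be set on the POP Source node itself.

```vex
// POP Wrangle inside the POP Network, evaluated every frame.
// Newly born particles get a short, randomized lifespan; every particle is
// then pushed around by large-scale curl noise added to its velocity.
if (@age <= @TimeInc)                                    // roughly "just born this frame"
    @life = 0.5 + fit01(rand(@id), -0.25, 0.25);         // 0.5 s +/- 0.25 s

vector turb = curlnoise(@P * chf("frequency") + chf("offset"));
v@v += turb * chf("amplitude") * @TimeInc;               // scale by the timestep for stable motion
```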

Appendix C Journey to Proxima Centauri: Terror of the Mnar - Air Particulate

To create the vector field that drives the particulate, the room was imported into Houdini and a number of curves were drawn that twisted and turned throughout the room. Next, an array of planes intersecting these curves was placed, and at each intersection a vector based on the direction of the curve was calculated and added to the vector field.
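One simplified way to transfer curve direction into a vector field is sketched below as a Volume Wrangle. It samples the tangent of the nearest curve directly instead of using the plane-intersection approach described above, and it assumes the field is a vector volume named vel and that the curves carry a normalized tangentu attribute from a PolyFrame SOP; the radius and speed channels are placeholders.

```vex
// Volume Wrangle on a vector volume named "vel" (input 1 = guide curves).
// Each voxel takes the direction of the nearest curve, faded out with distance.
int    near_prim;
vector near_uvw;
float  dist = xyzdist(1, @P, near_prim, near_uvw);           // closest point on the curves

vector dir     = primuv(1, "tangentu", near_prim, near_uvw); // curve tangent at that point
float  falloff = fit(dist, 0.0, chf("radius"), 1.0, 0.0);    // influence fades to zero at "radius"

v@vel = normalize(dir) * falloff * chf("speed");
```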

Appendix D Journey to Proxima Centauri: Terror of the Mnar - Electricity Explosion

To create the electrical arc, a beam emitter in Cascade was the best option. The main setup of the system was derived from a tutorial on the Unreal Engine documentation website [12]. After the initial setup was finished, the emitter node properties were adjusted to change the noise, color over life, size, and timing of the beam. With that done, a shower of sparks was needed to explode after the arcing ended. A sprite-emitter tutorial from the documentation website was used to set up the base simulation [13], and once the main setup was complete, the color, velocities, collisions, and timing of the particles were modified for better integration with the production. Ribbon particles were then added to trail the sparks and draw attention to the explosion; this was done with a ribbon emitter in Cascade, following another tutorial on the documentation website [14]. Finally, the sparks were set to emit light to add a little more realism to the effect.

Bibliography

[1] https://www.youtube.com/watch?v=mxDiTPYe950, 2014.
[2] http://filmonic.com/leonardo-dicaprio-will-stop-at-nothing-to-get-revenge-in-the-revenant/, 2015.
[3] https://www.youtube.com/watch?v=gpoUGJaIMK4, 2016.
[4] https://www.playstation.com/en-us/games/journey-ps4/, 2018.
[5] https://www.wallpapervortex.com/wallpaper-61905_3d_3d_low_poly_ladnscape.html#.WtfdeHUvy0o, 2018.
[6] Brett Achorn, Sean Palmer, and Larry Wu. Building Moana's Kakamora barge. In ACM SIGGRAPH 2017 Talks, SIGGRAPH '17, pages 66:1–66:2, New York, NY, USA, 2017. ACM.
[7] Andy Beane. 3D production pipeline. https://i.pinimg.com/originals/ba/08/27/ba0827dd0192c5530116650b1260299c.jpg, 2012.
[8] Marc Bryant, Ian Coony, and Jonathan Garcia. Moana: Foundation of a lava monster. In ACM SIGGRAPH 2017 Talks, SIGGRAPH '17, pages 10:1–10:2, New York, NY, USA, 2017. ACM.
[9] Codex DNA. Production Pipeline. 2017.
[10] Ian Failes. A graphic tale: The visual effects of Mad Max: Fury Road. https://www.fxguide.com/featured/a-graphic-tale-the-visual-effects-of-mad-max-fury-road/, 2015.
[11] Ben Frost, Alexey Stomakhin, and Hiroaki Narita. Moana: Performing water. In ACM SIGGRAPH 2017 Talks, SIGGRAPH '17, pages 30:1–30:2, New York, NY, USA, 2017. ACM.
[12] Epic Games. Intro to Cascade: Creating a Beam Emitter — 07 — v4.2 Tutorial Series — Unreal Engine. 2014.
[13] Epic Games. Intro to Cascade: Creating a GPU Sprite Emitter — 05 — v4.2 Tutorial Series — Unreal Engine. 2014.
[14] Epic Games. Intro to Cascade: Creating a Ribbon Emitter — 08 — v4.2 Tutorial Series — Unreal Engine. 2014.
[15] Jason Guerrasio. A body double, CGI skull, and secret filming sessions all helped 'Blade Runner 2049' earn a VFX Oscar nomination. http://www.businessinsider.com/oscars-2018-how-blade-runner-2049-used-cgi-2018-1, 2018.
[16] Mike Jutan and Steve Ellis. Director-centric virtual camera production tools for Rogue One. In ACM SIGGRAPH 2017 Talks, SIGGRAPH '17, pages 45:1–45:2, New York, NY, USA, 2017. ACM.
[17] Kim Keech, Rachel Bibb, Brian Whited, and Brett Achorn. The role of hand-drawn animation in Disney's Moana. In ACM SIGGRAPH 2017 Talks, SIGGRAPH '17, pages 3:1–3:2, New York, NY, USA, 2017. ACM.
[18] Robert W. Lindeman and Steffi Beckhaus. Crafting memorable VR experiences using experiential fidelity. In Proceedings of the 16th ACM Symposium on Virtual Reality Software and Technology, VRST '09, pages 187–190, New York, NY, USA, 2009. ACM.
[19] Sean Palmer, Jonathan Garcia, Sara Drakeley, Patrick Kelly, and Ralf Habel. The ocean and water pipeline of Disney's Moana. In ACM SIGGRAPH 2017 Talks, SIGGRAPH '17, pages 29:1–29:2, New York, NY, USA, 2017. ACM.
[20] Eric Patterson. Brief history of special/visual effects in film. https://people.cs.clemson.edu/~ekp/courses/dpa8150/assets/00_History.pdf.
[21] Devon Penney. Volumetric clouds in the VR movie, Allumette. In Proceedings of the 2016 Symposium on Digital Production, DigiPro '16, pages 61–64, New York, NY, USA, 2016. ACM.
[22] Rob Pieké, Lucy Bailey, Kai Wolter, and Jo Plaete. Creating the flying armadas in Guardians of the Galaxy. In ACM SIGGRAPH 2014 Talks, SIGGRAPH '14, pages 7:1–7:1, New York, NY, USA, 2014. ACM.
[23] Brian J. Pohl, Andrew Harris, Michael Balog, Michael Clausen, Gavin Moran, and Ryan Brucks. Fortnite: Supercharging CG animation pipelines with game engine technology. In Proceedings of the ACM SIGGRAPH Digital Production Symposium, DigiPro '17, pages 7:1–7:4, New York, NY, USA, 2017. ACM.
[24] Go Procedural. Flowmaps! // Houdini for Games. 2018.
[25] Oona Salla. Mastering the elements: Basics of 2D effect animation. https://www.theseus.fi/bitstream/handle/10024/137717/Salla_Oona.pdf?sequence=1&isAllowed=y, 2017.
[26] Arvid Schneider. MtoA 504 — Hair introduction with Arnold 5. https://www.youtube.com/watch?v=N5sy4vwoM40, 2017.
[27] Arvid Schneider. MtoA 507 — XGen fur with aiStandardHair. https://www.youtube.com/watch?v=EXF_7Zgg65M, 2017.
[28] Mike Seymour. The techniques used in the Blade Runner 2049 hologram sex scene. https://www.fxguide.com/featured/the-techniques-used-in-the-blade-runner-2049-hologram-sex-scene/, 2017.
[29] Charlie Sierra. What's the difference between special effects & visual effects? http://filmescape.com/whats-the-difference-between-special-effects-visual-effects, 2018.
[30] Creative Staff. How to set up a VFX pipeline. https://www.creativebloq.com/audiovisual/how-set-vfx-pipeline-10134804, 2013.
[31] Walt Disney Animation Studios. Moana Animation. 2016.
[32] Drew Turney. Wave of animation: Disney's Moana ups the CGI ante. https://www.autodesk.com/redshift/moana-animation/, 2016.
[33] UnrealCG. Stylized Smoke/Fire Particle System Tutorial - [UE4]. 2017.
[34] Isaac Yuen. Interactive storytelling: thatgamecompany's Flower. https://ekostories.com/2012/03/30/flower-thatgamecompany-nature/, 2012.
