Expanded Virtual Puppeteering
Luiz Velho¹ and Bernard Lupiac²
¹IMPA - Instituto de Matemática Pura e Aplicada, Brazil
²PUC-Rio - Pontifícia Universidade Católica do Rio de Janeiro, Brazil
[email protected], [email protected]

Keywords: Virtual Reality, Digital Puppetry, Artistic Performance, Animation.

Abstract: This work proposes a framework for digital puppeteering performances using solely the performer's bare hands. The framework relies on Unity as a base, and hand information is captured from a Leap Motion device. The performer employs a gesture interaction system and precise hand movements to manipulate the puppet in a number of different manners. The audience can then view the puppet directly on a big screen, or through virtual or augmented reality headsets, which allows rich interactions.

1 CONTEXT AND MOTIVATION

New Media is a phenomenon where recent developments in technology are integrated with traditional types of media. It concerns a wide range of mediums, for example books, movies, museum exhibits and plays. In the past few years, new paradigms of interaction and visualization have emerged with the resurgence of virtual and augmented reality and of markerless hand and body interaction. Since their emergence is very recent, there is still much to explore about them and the possibilities they will bring.

This work proposes a framework for digital puppeteering performances where the performer uses only his or her bare hands. The program uses the Unity framework, and hand-related information is captured with a Leap Motion device. The aim of this work is to preserve the expressiveness and know-how of classic puppeteering while, at the same time, looking at it from a different perspective with the help of the aforementioned advances in technology.

The paper is structured in the following manner: Section 2 gives a brief history of the development of digital puppetry in academia and in the show and film business. Section 3 gives an overview of the different ways in which the puppet can be moved and the reasons why they will or will not be used in our work. Section 4 describes in detail all the modules implemented in this framework. Section 5 explains the process necessary to generalize the program so it can be used in many different contexts. Section 6 shows the different types of possible visualizations. Finally, Section 7 gives an example where this framework is used by artists in a small performance.

2 BACKGROUND, HISTORY AND RELATED WORKS

Digital puppetry isn't a new concept by any means. In the early sixties, Lee Harrison III used body suits with potentiometers to animate a 3D model on a CRT screen.

A few decades later, around 1988, digital puppeteering gained the attention of the general public with the help of two projects: Waldo C. Graphic and Mike Normal.

Waldo C. Graphic is a character on the Muppets show The Jim Henson Hour. Waldo is an entirely computer-generated puppet, controlled by a single puppeteer with a large contraption.

Mike Normal, created by Brad DeGraf, was unveiled at the 1988 SIGGRAPH conference and was the first live performance of a digital character. It allowed a performer to control many parameters of its face, with real-time interpolation between them.

In 1994, in Charleville-Mézières, another significant event took place: e-Motion Capture. It united researchers in the field of Computer Graphics and performers from around the world to experiment with the different technologies developed for virtual puppeteering up to that date.
A small number of other digital puppets were developed in the following years. Among them are Mat the Ghost, who appeared daily on French national television for years, and Moxy, part of a Cartoon Network show, created by Brad DeGraf.

Just as there was not much advancement from the early sixties until the late eighties, the field was not very active from the mid nineties until the mid 2010s. Its resurgence was caused by recent advances in technology that made it possible to overcome longstanding problems in the field.

There were nevertheless attempts to adapt 2000s technology to digital puppeteering. The main device used during that time was the dataglove, a very unwieldy and expensive piece of equipment. Some attempts were also made to use accelerometers, through the controllers provided with the Wii video-game console (Shiratori and Hodgins, 2008).

What made the current developments in this field possible were advancements in computer vision and the popularization of depth-sensing cameras in products like Microsoft's Kinect, the Leap Motion and, eventually, smartphones.

In Seth Hunter's and Pattie Maes' work (Hunter and Maes, 2013), a number of methods for performing digital puppeteering using green screens and computer vision are described. Lins' and Marroquin's work (Souza et al., 2015) proposes a way to control a puppet's walking motion using the Kinect and trackers at the tips of two fingers. More recently, Anderegg et al. presented a way to manipulate a puppet in an augmented reality environment using a smartphone's sensors and screen (Anderegg et al., 2018).

Around that period, some commercial products were also developed for digital puppeteering. Puppet Parade uses the Kinect to let performers manipulate multiple puppets while the audience interacts with them at the same time. Marionette Zoo (Botond Gábor et al., 2014) lets users control a set of puppets using the Leap Motion and a physics-based system with strings.

Some research involving puppet manipulation with the Leap Motion already exists. Oshita et al. (Oshita et al., 2013) propose a manipulation scheme inspired by the crossbar used to manipulate real puppets. It allows a user to manipulate the arms, legs and trunk simultaneously using both hands' positions, rotations and the stretching of the fingers. Other approaches, such as (Ciccone et al., 2017), have also used Leap devices for puppet animation.

Leite's and Orvalho's work (Leite and Orvalho, 2017) describes three separate puppet manipulation modes. The first manipulates many different facial features of the puppet's head, such as the eyes, eyebrows and mouth. The second moves the puppet with physics, where the performer's hand position and rotation correspond to the puppet's head position and rotation. The last mode is a combination of the first two: one hand controls the puppet's head position and facial features while the rest of its body is moved by the physics system; in addition, the remaining free hand is used to manipulate one of the puppet's own hands.

Our work proposes a framework that takes a new approach to puppet manipulation. Instead of having all possible movements mapped to different parts of the hand, we use a gesture-based system that allows switching between different contexts at runtime, where each context corresponds to a different type of movement. This results in more straightforward and precise controls, since the whole range of both hands' movement is available to every type of movement. Other important advantages of this system are its ability to support an extensive number of movement types simultaneously and the ease with which a new movement type can be developed. In addition, our framework addresses other challenges associated with puppeteering performances, such as giving a director a way to control different cameras, and letting the audience watch on a screen or in virtual or augmented reality.
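The paper contains no source code, but the context-switching idea can be made concrete with a minimal Unity C# sketch, shown below. All names (IMovementContext, ContextSwitcher, NextContext, OnHandUpdate) are hypothetical, and gesture recognition and Leap Motion input handling are deliberately omitted; this illustrates the general idea rather than the authors' actual implementation.

    using System.Collections.Generic;
    using UnityEngine;

    // Each "context" interprets raw hand data as one type of puppet
    // movement; a recognized switch gesture cycles between contexts
    // at runtime (hypothetical sketch, not the paper's code).
    public interface IMovementContext
    {
        string Name { get; }
        // Receives palm data each frame and drives the puppet.
        void Apply(Vector3 palmPosition, Quaternion palmRotation);
    }

    public class ContextSwitcher : MonoBehaviour
    {
        private readonly List<IMovementContext> contexts = new List<IMovementContext>();
        private int current;

        public void Register(IMovementContext context) => contexts.Add(context);

        // Called by the gesture recognizer when the switch gesture fires.
        public void NextContext()
        {
            if (contexts.Count == 0) return;
            current = (current + 1) % contexts.Count;
            Debug.Log($"Active context: {contexts[current].Name}");
        }

        // Called every frame with hand data from the tracking device.
        public void OnHandUpdate(Vector3 palmPosition, Quaternion palmRotation)
        {
            if (contexts.Count > 0)
                contexts[current].Apply(palmPosition, palmRotation);
        }
    }

Because every context receives the full hand data, each movement type has both hands' entire range of motion available to it, which is the property the approach relies on.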
3 PUPPET MOVEMENT PARADIGMS

This section presents all the forms of movement that were explored for manipulating the puppet. It then explains their advantages and disadvantages for the purpose of this work. Based on that, the overall suitability of each one is evaluated.

The fundamentals of our puppet movement control rely on three aspects: i) blended animations; ii) inverse kinematics; and iii) physically-based simulation. All three features are fully supported by modern game engines such as Unreal and Unity, the latter of which we have chosen for the implementation of our system.

3.1 Animations

Animations are pre-recorded sets of movement that are applied to a model over a finite amount of time. They can either record the position of every vertex or bone in every frame, or keep only a few sets of recorded positions (called keyframes) and interpolate between them.

Movements using animations are very well integrated in Unity because of their widespread use in video games. This also means that there are many open-license animations available for testing and development, especially for humanoid models.

With animations, it is possible to perform many complex and varied types of movement very easily. With a simple gesture, the puppet can perform a dance, make a basketball move or climb a wall.

On the other hand, because of their repetitive nature, animations can end up limiting the amount of expression a performer can infuse into a marionette. However, this problem can be avoided by using methods such as blending between two animations.

3.3 Physically based Animation

Physically based animation makes an object perform realistic movement by using physics simulation. Simulating physics with mathematical equations has been a topic of research for centuries, so most of the theory behind what we need for this work has been established for a long time. Performing these simulations in real time in a stable and aesthetically pleasing fashion, however, is a whole different matter. It is still possible with many loose simplifications, but the result can still behave unexpectedly.
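As a purely illustrative sketch of such physically based control, the snippet below shows one common way to realize it in Unity: the puppet's head is a Rigidbody pulled toward the tracked palm position by a spring-damper force, so the rest of the body, connected by joints, follows through the physics simulation. The class name, fields and constants are hypothetical, not taken from the paper.

    using UnityEngine;

    // Spring-damper follow: F = k * (target - position) - c * velocity.
    // The head Rigidbody chases the palm target; joints propagate the
    // motion to the rest of the puppet (hypothetical sketch).
    public class PhysicsFollow : MonoBehaviour
    {
        public Rigidbody head;         // rigid body for the puppet's head
        public Transform palmTarget;   // transform updated from hand tracking
        public float stiffness = 50f;  // spring constant k
        public float damping = 5f;     // damping coefficient c

        void FixedUpdate()
        {
            Vector3 toTarget = palmTarget.position - head.position;
            Vector3 force = stiffness * toTarget - damping * head.velocity;
            head.AddForce(force, ForceMode.Force);
        }
    }

Applying forces instead of setting positions directly keeps the simulation stable, at the cost of the slightly loose behavior mentioned above.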
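The animation blending mentioned in Section 3.1 can be sketched in the same spirit. The example below assumes a hypothetical Animator Controller containing a 1D blend tree driven by a float parameter named "Blend", with one clip at each end; it is an illustration of Unity's standard blending mechanism, not the authors' implementation.

    using UnityEngine;

    // Blends two animation clips via a 1D blend tree parameter
    // (hypothetical controller setup).
    public class BlendExample : MonoBehaviour
    {
        [SerializeField] private Animator animator;
        [Range(0f, 1f)] public float blend; // 0 = first clip, 1 = second clip

        void Update()
        {
            // The Animator interpolates the two clips according to this value.
            animator.SetFloat("Blend", blend);

            // A discrete transition can also be smoothed over time, e.g.:
            // animator.CrossFade("Dance", 0.25f);
        }
    }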